Feb 01 06:38:37 localhost kernel: Linux version 5.14.0-284.11.1.el9_2.x86_64 (mockbuild@x86-vm-09.build.eng.bos.redhat.com) (gcc (GCC) 11.3.1 20221121 (Red Hat 11.3.1-4), GNU ld version 2.35.2-37.el9) #1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023
Feb 01 06:38:37 localhost kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Feb 01 06:38:37 localhost kernel: Command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 01 06:38:37 localhost kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Feb 01 06:38:37 localhost kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Feb 01 06:38:37 localhost kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Feb 01 06:38:37 localhost kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Feb 01 06:38:37 localhost kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format.
Feb 01 06:38:37 localhost kernel: signal: max sigframe size: 1776
Feb 01 06:38:37 localhost kernel: BIOS-provided physical RAM map:
Feb 01 06:38:37 localhost kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Feb 01 06:38:37 localhost kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Feb 01 06:38:37 localhost kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Feb 01 06:38:37 localhost kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Feb 01 06:38:37 localhost kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Feb 01 06:38:37 localhost kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Feb 01 06:38:37 localhost kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Feb 01 06:38:37 localhost kernel: BIOS-e820: [mem 0x0000000100000000-0x000000043fffffff] usable
Feb 01 06:38:37 localhost kernel: NX (Execute Disable) protection: active
Feb 01 06:38:37 localhost kernel: SMBIOS 2.8 present.
Feb 01 06:38:37 localhost kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Feb 01 06:38:37 localhost kernel: Hypervisor detected: KVM
Feb 01 06:38:37 localhost kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Feb 01 06:38:37 localhost kernel: kvm-clock: using sched offset of 2960669896 cycles
Feb 01 06:38:37 localhost kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Feb 01 06:38:37 localhost kernel: tsc: Detected 2799.998 MHz processor
Feb 01 06:38:37 localhost kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
Feb 01 06:38:37 localhost kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
Feb 01 06:38:37 localhost kernel: last_pfn = 0x440000 max_arch_pfn = 0x400000000
Feb 01 06:38:37 localhost kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Feb 01 06:38:37 localhost kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Feb 01 06:38:37 localhost kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Feb 01 06:38:37 localhost kernel: Using GB pages for direct mapping
Feb 01 06:38:37 localhost kernel: RAMDISK: [mem 0x2eef4000-0x33771fff]
Feb 01 06:38:37 localhost kernel: ACPI: Early table checksum verification disabled
Feb 01 06:38:37 localhost kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Feb 01 06:38:37 localhost kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 01 06:38:37 localhost kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 01 06:38:37 localhost kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 01 06:38:37 localhost kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Feb 01 06:38:37 localhost kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 01 06:38:37 localhost kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Feb 01 06:38:37 localhost kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Feb 01 06:38:37 localhost kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Feb 01 06:38:37 localhost kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Feb 01 06:38:37 localhost kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Feb 01 06:38:37 localhost kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Feb 01 06:38:37 localhost kernel: No NUMA configuration found
Feb 01 06:38:37 localhost kernel: Faking a node at [mem 0x0000000000000000-0x000000043fffffff]
Feb 01 06:38:37 localhost kernel: NODE_DATA(0) allocated [mem 0x43ffd3000-0x43fffdfff]
Feb 01 06:38:37 localhost kernel: Reserving 256MB of memory at 2800MB for crashkernel (System RAM: 16383MB)
Feb 01 06:38:37 localhost kernel: Zone ranges:
Feb 01 06:38:37 localhost kernel:   DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Feb 01 06:38:37 localhost kernel:   DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Feb 01 06:38:37 localhost kernel:   Normal   [mem 0x0000000100000000-0x000000043fffffff]
Feb 01 06:38:37 localhost kernel:   Device   empty
Feb 01 06:38:37 localhost kernel: Movable zone start for each node
Feb 01 06:38:37 localhost kernel: Early memory node ranges
Feb 01 06:38:37 localhost kernel:   node   0: [mem 0x0000000000001000-0x000000000009efff]
Feb 01 06:38:37 localhost kernel:   node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Feb 01 06:38:37 localhost kernel:   node   0: [mem 0x0000000100000000-0x000000043fffffff]
Feb 01 06:38:37 localhost kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000043fffffff]
Feb 01 06:38:37 localhost kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Feb 01 06:38:37 localhost kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Feb 01 06:38:37 localhost kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Feb 01 06:38:37 localhost kernel: ACPI: PM-Timer IO Port: 0x608
Feb 01 06:38:37 localhost kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Feb 01 06:38:37 localhost kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Feb 01 06:38:37 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Feb 01 06:38:37 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Feb 01 06:38:37 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Feb 01 06:38:37 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Feb 01 06:38:37 localhost kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Feb 01 06:38:37 localhost kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Feb 01 06:38:37 localhost kernel: TSC deadline timer available
Feb 01 06:38:37 localhost kernel: smpboot: Allowing 8 CPUs, 0 hotplug CPUs
Feb 01 06:38:37 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Feb 01 06:38:37 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Feb 01 06:38:37 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Feb 01 06:38:37 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Feb 01 06:38:37 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Feb 01 06:38:37 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Feb 01 06:38:37 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Feb 01 06:38:37 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Feb 01 06:38:37 localhost kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Feb 01 06:38:37 localhost kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Feb 01 06:38:37 localhost kernel: Booting paravirtualized kernel on KVM
Feb 01 06:38:37 localhost kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Feb 01 06:38:37 localhost kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Feb 01 06:38:37 localhost kernel: percpu: Embedded 55 pages/cpu s188416 r8192 d28672 u262144
Feb 01 06:38:37 localhost kernel: pcpu-alloc: s188416 r8192 d28672 u262144 alloc=1*2097152
Feb 01 06:38:37 localhost kernel: pcpu-alloc: [0] 0 1 2 3 4 5 6 7 
Feb 01 06:38:37 localhost kernel: kvm-guest: PV spinlocks disabled, no host support
Feb 01 06:38:37 localhost kernel: Fallback order for Node 0: 0 
Feb 01 06:38:37 localhost kernel: Built 1 zonelists, mobility grouping on.  Total pages: 4128475
Feb 01 06:38:37 localhost kernel: Policy zone: Normal
Feb 01 06:38:37 localhost kernel: Kernel command line: BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 01 06:38:37 localhost kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64", will be passed to user space.
Feb 01 06:38:37 localhost kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Feb 01 06:38:37 localhost kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Feb 01 06:38:37 localhost kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Feb 01 06:38:37 localhost kernel: software IO TLB: area num 8.
Feb 01 06:38:37 localhost kernel: Memory: 2826284K/16776676K available (14342K kernel code, 5536K rwdata, 10180K rodata, 2792K init, 7524K bss, 741268K reserved, 0K cma-reserved)
Feb 01 06:38:37 localhost kernel: random: get_random_u64 called from kmem_cache_open+0x1e/0x210 with crng_init=0
Feb 01 06:38:37 localhost kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Feb 01 06:38:37 localhost kernel: ftrace: allocating 44803 entries in 176 pages
Feb 01 06:38:37 localhost kernel: ftrace: allocated 176 pages with 3 groups
Feb 01 06:38:37 localhost kernel: Dynamic Preempt: voluntary
Feb 01 06:38:37 localhost kernel: rcu: Preemptible hierarchical RCU implementation.
Feb 01 06:38:37 localhost kernel: rcu:         RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Feb 01 06:38:37 localhost kernel:         Trampoline variant of Tasks RCU enabled.
Feb 01 06:38:37 localhost kernel:         Rude variant of Tasks RCU enabled.
Feb 01 06:38:37 localhost kernel:         Tracing variant of Tasks RCU enabled.
Feb 01 06:38:37 localhost kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Feb 01 06:38:37 localhost kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Feb 01 06:38:37 localhost kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Feb 01 06:38:37 localhost kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Feb 01 06:38:37 localhost kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Feb 01 06:38:37 localhost kernel: random: crng init done (trusting CPU's manufacturer)
Feb 01 06:38:37 localhost kernel: Console: colour VGA+ 80x25
Feb 01 06:38:37 localhost kernel: printk: console [tty0] enabled
Feb 01 06:38:37 localhost kernel: printk: console [ttyS0] enabled
Feb 01 06:38:37 localhost kernel: ACPI: Core revision 20211217
Feb 01 06:38:37 localhost kernel: APIC: Switch to symmetric I/O mode setup
Feb 01 06:38:37 localhost kernel: x2apic enabled
Feb 01 06:38:37 localhost kernel: Switched APIC routing to physical x2apic.
Feb 01 06:38:37 localhost kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Feb 01 06:38:37 localhost kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Feb 01 06:38:37 localhost kernel: pid_max: default: 32768 minimum: 301
Feb 01 06:38:37 localhost kernel: LSM: Security Framework initializing
Feb 01 06:38:37 localhost kernel: Yama: becoming mindful.
Feb 01 06:38:37 localhost kernel: SELinux:  Initializing.
Feb 01 06:38:37 localhost kernel: LSM support for eBPF active
Feb 01 06:38:37 localhost kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 01 06:38:37 localhost kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Feb 01 06:38:37 localhost kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Feb 01 06:38:37 localhost kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Feb 01 06:38:37 localhost kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Feb 01 06:38:37 localhost kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Feb 01 06:38:37 localhost kernel: Spectre V2 : Mitigation: Retpolines
Feb 01 06:38:37 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch
Feb 01 06:38:37 localhost kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT
Feb 01 06:38:37 localhost kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Feb 01 06:38:37 localhost kernel: RETBleed: Mitigation: untrained return thunk
Feb 01 06:38:37 localhost kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Feb 01 06:38:37 localhost kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Feb 01 06:38:37 localhost kernel: Freeing SMP alternatives memory: 36K
Feb 01 06:38:37 localhost kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Feb 01 06:38:37 localhost kernel: cblist_init_generic: Setting adjustable number of callback queues.
Feb 01 06:38:37 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 01 06:38:37 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 01 06:38:37 localhost kernel: cblist_init_generic: Setting shift to 3 and lim to 1.
Feb 01 06:38:37 localhost kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Feb 01 06:38:37 localhost kernel: ... version:                0
Feb 01 06:38:37 localhost kernel: ... bit width:              48
Feb 01 06:38:37 localhost kernel: ... generic registers:      6
Feb 01 06:38:37 localhost kernel: ... value mask:             0000ffffffffffff
Feb 01 06:38:37 localhost kernel: ... max period:             00007fffffffffff
Feb 01 06:38:37 localhost kernel: ... fixed-purpose events:   0
Feb 01 06:38:37 localhost kernel: ... event mask:             000000000000003f
Feb 01 06:38:37 localhost kernel: rcu: Hierarchical SRCU implementation.
Feb 01 06:38:37 localhost kernel: rcu:         Max phase no-delay instances is 400.
Feb 01 06:38:37 localhost kernel: smp: Bringing up secondary CPUs ...
Feb 01 06:38:37 localhost kernel: x86: Booting SMP configuration:
Feb 01 06:38:37 localhost kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Feb 01 06:38:37 localhost kernel: smp: Brought up 1 node, 8 CPUs
Feb 01 06:38:37 localhost kernel: smpboot: Max logical packages: 8
Feb 01 06:38:37 localhost kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Feb 01 06:38:37 localhost kernel: node 0 deferred pages initialised in 20ms
Feb 01 06:38:37 localhost kernel: devtmpfs: initialized
Feb 01 06:38:37 localhost kernel: x86/mm: Memory block size: 128MB
Feb 01 06:38:37 localhost kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Feb 01 06:38:37 localhost kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Feb 01 06:38:37 localhost kernel: pinctrl core: initialized pinctrl subsystem
Feb 01 06:38:37 localhost kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Feb 01 06:38:37 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Feb 01 06:38:37 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Feb 01 06:38:37 localhost kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Feb 01 06:38:37 localhost kernel: audit: initializing netlink subsys (disabled)
Feb 01 06:38:37 localhost kernel: audit: type=2000 audit(1769927916.087:1): state=initialized audit_enabled=0 res=1
Feb 01 06:38:37 localhost kernel: thermal_sys: Registered thermal governor 'fair_share'
Feb 01 06:38:37 localhost kernel: thermal_sys: Registered thermal governor 'step_wise'
Feb 01 06:38:37 localhost kernel: thermal_sys: Registered thermal governor 'user_space'
Feb 01 06:38:37 localhost kernel: cpuidle: using governor menu
Feb 01 06:38:37 localhost kernel: HugeTLB: can optimize 4095 vmemmap pages for hugepages-1048576kB
Feb 01 06:38:37 localhost kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Feb 01 06:38:37 localhost kernel: PCI: Using configuration type 1 for base access
Feb 01 06:38:37 localhost kernel: PCI: Using configuration type 1 for extended access
Feb 01 06:38:37 localhost kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Feb 01 06:38:37 localhost kernel: HugeTLB: can optimize 7 vmemmap pages for hugepages-2048kB
Feb 01 06:38:37 localhost kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages
Feb 01 06:38:37 localhost kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages
Feb 01 06:38:37 localhost kernel: cryptd: max_cpu_qlen set to 1000
Feb 01 06:38:37 localhost kernel: ACPI: Added _OSI(Module Device)
Feb 01 06:38:37 localhost kernel: ACPI: Added _OSI(Processor Device)
Feb 01 06:38:37 localhost kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Feb 01 06:38:37 localhost kernel: ACPI: Added _OSI(Processor Aggregator Device)
Feb 01 06:38:37 localhost kernel: ACPI: Added _OSI(Linux-Dell-Video)
Feb 01 06:38:37 localhost kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio)
Feb 01 06:38:37 localhost kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics)
Feb 01 06:38:37 localhost kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Feb 01 06:38:37 localhost kernel: ACPI: Interpreter enabled
Feb 01 06:38:37 localhost kernel: ACPI: PM: (supports S0 S3 S4 S5)
Feb 01 06:38:37 localhost kernel: ACPI: Using IOAPIC for interrupt routing
Feb 01 06:38:37 localhost kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Feb 01 06:38:37 localhost kernel: PCI: Using E820 reservations for host bridge windows
Feb 01 06:38:37 localhost kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Feb 01 06:38:37 localhost kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Feb 01 06:38:37 localhost kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [3] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [4] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [5] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [6] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [7] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [8] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [9] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [10] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [11] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [12] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [13] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [14] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [15] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [16] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [17] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [18] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [19] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [20] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [21] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [22] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [23] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [24] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [25] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [26] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [27] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [28] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [29] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [30] registered
Feb 01 06:38:37 localhost kernel: acpiphp: Slot [31] registered
Feb 01 06:38:37 localhost kernel: PCI host bridge to bus 0000:00
Feb 01 06:38:37 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Feb 01 06:38:37 localhost kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Feb 01 06:38:37 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Feb 01 06:38:37 localhost kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Feb 01 06:38:37 localhost kernel: pci_bus 0000:00: root bus resource [mem 0x440000000-0x4bfffffff window]
Feb 01 06:38:37 localhost kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Feb 01 06:38:37 localhost kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.1: reg 0x20: [io  0xc140-0xc14f]
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io  0x01f0-0x01f7]
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io  0x03f6]
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io  0x0170-0x0177]
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io  0x0376]
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.2: reg 0x20: [io  0xc100-0xc11f]
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Feb 01 06:38:37 localhost kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000
Feb 01 06:38:37 localhost kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref]
Feb 01 06:38:37 localhost kernel: pci 0000:00:02.0: reg 0x18: [mem 0xfe800000-0xfe803fff 64bit pref]
Feb 01 06:38:37 localhost kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff]
Feb 01 06:38:37 localhost kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref]
Feb 01 06:38:37 localhost kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Feb 01 06:38:37 localhost kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000
Feb 01 06:38:37 localhost kernel: pci 0000:00:03.0: reg 0x10: [io  0xc080-0xc0bf]
Feb 01 06:38:37 localhost kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff]
Feb 01 06:38:37 localhost kernel: pci 0000:00:03.0: reg 0x20: [mem 0xfe804000-0xfe807fff 64bit pref]
Feb 01 06:38:37 localhost kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref]
Feb 01 06:38:37 localhost kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000
Feb 01 06:38:37 localhost kernel: pci 0000:00:04.0: reg 0x10: [io  0xc000-0xc07f]
Feb 01 06:38:37 localhost kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff]
Feb 01 06:38:37 localhost kernel: pci 0000:00:04.0: reg 0x20: [mem 0xfe808000-0xfe80bfff 64bit pref]
Feb 01 06:38:37 localhost kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00
Feb 01 06:38:37 localhost kernel: pci 0000:00:05.0: reg 0x10: [io  0xc0c0-0xc0ff]
Feb 01 06:38:37 localhost kernel: pci 0000:00:05.0: reg 0x20: [mem 0xfe80c000-0xfe80ffff 64bit pref]
Feb 01 06:38:37 localhost kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00
Feb 01 06:38:37 localhost kernel: pci 0000:00:06.0: reg 0x10: [io  0xc120-0xc13f]
Feb 01 06:38:37 localhost kernel: pci 0000:00:06.0: reg 0x20: [mem 0xfe810000-0xfe813fff 64bit pref]
Feb 01 06:38:37 localhost kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Feb 01 06:38:37 localhost kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Feb 01 06:38:37 localhost kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Feb 01 06:38:37 localhost kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Feb 01 06:38:37 localhost kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Feb 01 06:38:37 localhost kernel: iommu: Default domain type: Translated 
Feb 01 06:38:37 localhost kernel: iommu: DMA domain TLB invalidation policy: lazy mode 
Feb 01 06:38:37 localhost kernel: SCSI subsystem initialized
Feb 01 06:38:37 localhost kernel: ACPI: bus type USB registered
Feb 01 06:38:37 localhost kernel: usbcore: registered new interface driver usbfs
Feb 01 06:38:37 localhost kernel: usbcore: registered new interface driver hub
Feb 01 06:38:37 localhost kernel: usbcore: registered new device driver usb
Feb 01 06:38:37 localhost kernel: pps_core: LinuxPPS API ver. 1 registered
Feb 01 06:38:37 localhost kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Feb 01 06:38:37 localhost kernel: PTP clock support registered
Feb 01 06:38:37 localhost kernel: EDAC MC: Ver: 3.0.0
Feb 01 06:38:37 localhost kernel: NetLabel: Initializing
Feb 01 06:38:37 localhost kernel: NetLabel:  domain hash size = 128
Feb 01 06:38:37 localhost kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Feb 01 06:38:37 localhost kernel: NetLabel:  unlabeled traffic allowed by default
Feb 01 06:38:37 localhost kernel: PCI: Using ACPI for IRQ routing
Feb 01 06:38:37 localhost kernel: PCI: pci_cache_line_size set to 64 bytes
Feb 01 06:38:37 localhost kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff]
Feb 01 06:38:37 localhost kernel: e820: reserve RAM buffer [mem 0xbffdb000-0xbfffffff]
Feb 01 06:38:37 localhost kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Feb 01 06:38:37 localhost kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Feb 01 06:38:37 localhost kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Feb 01 06:38:37 localhost kernel: vgaarb: loaded
Feb 01 06:38:37 localhost kernel: clocksource: Switched to clocksource kvm-clock
Feb 01 06:38:37 localhost kernel: VFS: Disk quotas dquot_6.6.0
Feb 01 06:38:37 localhost kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Feb 01 06:38:37 localhost kernel: pnp: PnP ACPI init
Feb 01 06:38:37 localhost kernel: pnp 00:03: [dma 2]
Feb 01 06:38:37 localhost kernel: pnp: PnP ACPI: found 5 devices
Feb 01 06:38:37 localhost kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Feb 01 06:38:37 localhost kernel: NET: Registered PF_INET protocol family
Feb 01 06:38:37 localhost kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Feb 01 06:38:37 localhost kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear)
Feb 01 06:38:37 localhost kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Feb 01 06:38:37 localhost kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Feb 01 06:38:37 localhost kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Feb 01 06:38:37 localhost kernel: TCP: Hash tables configured (established 131072 bind 65536)
Feb 01 06:38:37 localhost kernel: MPTCP token hash table entries: 16384 (order: 6, 393216 bytes, linear)
Feb 01 06:38:37 localhost kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 01 06:38:37 localhost kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear)
Feb 01 06:38:37 localhost kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Feb 01 06:38:37 localhost kernel: NET: Registered PF_XDP protocol family
Feb 01 06:38:37 localhost kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Feb 01 06:38:37 localhost kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Feb 01 06:38:37 localhost kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Feb 01 06:38:37 localhost kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Feb 01 06:38:37 localhost kernel: pci_bus 0000:00: resource 8 [mem 0x440000000-0x4bfffffff window]
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Feb 01 06:38:37 localhost kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Feb 01 06:38:37 localhost kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Feb 01 06:38:37 localhost kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 27927 usecs
Feb 01 06:38:37 localhost kernel: PCI: CLS 0 bytes, default 64
Feb 01 06:38:37 localhost kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Feb 01 06:38:37 localhost kernel: Trying to unpack rootfs image as initramfs...
Feb 01 06:38:37 localhost kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Feb 01 06:38:37 localhost kernel: ACPI: bus type thunderbolt registered
Feb 01 06:38:37 localhost kernel: Initialise system trusted keyrings
Feb 01 06:38:37 localhost kernel: Key type blacklist registered
Feb 01 06:38:37 localhost kernel: workingset: timestamp_bits=36 max_order=22 bucket_order=0
Feb 01 06:38:37 localhost kernel: zbud: loaded
Feb 01 06:38:37 localhost kernel: integrity: Platform Keyring initialized
Feb 01 06:38:37 localhost kernel: NET: Registered PF_ALG protocol family
Feb 01 06:38:37 localhost kernel: xor: automatically using best checksumming function   avx       
Feb 01 06:38:37 localhost kernel: Key type asymmetric registered
Feb 01 06:38:37 localhost kernel: Asymmetric key parser 'x509' registered
Feb 01 06:38:37 localhost kernel: Running certificate verification selftests
Feb 01 06:38:37 localhost kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Feb 01 06:38:37 localhost kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Feb 01 06:38:37 localhost kernel: io scheduler mq-deadline registered
Feb 01 06:38:37 localhost kernel: io scheduler kyber registered
Feb 01 06:38:37 localhost kernel: io scheduler bfq registered
Feb 01 06:38:37 localhost kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Feb 01 06:38:37 localhost kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Feb 01 06:38:37 localhost kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Feb 01 06:38:37 localhost kernel: ACPI: button: Power Button [PWRF]
Feb 01 06:38:37 localhost kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Feb 01 06:38:37 localhost kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Feb 01 06:38:37 localhost kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Feb 01 06:38:37 localhost kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Feb 01 06:38:37 localhost kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Feb 01 06:38:37 localhost kernel: Non-volatile memory driver v1.3
Feb 01 06:38:37 localhost kernel: rdac: device handler registered
Feb 01 06:38:37 localhost kernel: hp_sw: device handler registered
Feb 01 06:38:37 localhost kernel: emc: device handler registered
Feb 01 06:38:37 localhost kernel: alua: device handler registered
Feb 01 06:38:37 localhost kernel: libphy: Fixed MDIO Bus: probed
Feb 01 06:38:37 localhost kernel: ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver
Feb 01 06:38:37 localhost kernel: ehci-pci: EHCI PCI platform driver
Feb 01 06:38:37 localhost kernel: ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver
Feb 01 06:38:37 localhost kernel: ohci-pci: OHCI PCI platform driver
Feb 01 06:38:37 localhost kernel: uhci_hcd: USB Universal Host Controller Interface driver
Feb 01 06:38:37 localhost kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Feb 01 06:38:37 localhost kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Feb 01 06:38:37 localhost kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Feb 01 06:38:37 localhost kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Feb 01 06:38:37 localhost kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Feb 01 06:38:37 localhost kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Feb 01 06:38:37 localhost kernel: usb usb1: Product: UHCI Host Controller
Feb 01 06:38:37 localhost kernel: usb usb1: Manufacturer: Linux 5.14.0-284.11.1.el9_2.x86_64 uhci_hcd
Feb 01 06:38:37 localhost kernel: usb usb1: SerialNumber: 0000:00:01.2
Feb 01 06:38:37 localhost kernel: hub 1-0:1.0: USB hub found
Feb 01 06:38:37 localhost kernel: hub 1-0:1.0: 2 ports detected
Feb 01 06:38:37 localhost kernel: usbcore: registered new interface driver usbserial_generic
Feb 01 06:38:37 localhost kernel: usbserial: USB Serial support registered for generic
Feb 01 06:38:37 localhost kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Feb 01 06:38:37 localhost kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Feb 01 06:38:37 localhost kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Feb 01 06:38:37 localhost kernel: mousedev: PS/2 mouse device common for all mice
Feb 01 06:38:37 localhost kernel: rtc_cmos 00:04: RTC can wake from S4
Feb 01 06:38:37 localhost kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Feb 01 06:38:37 localhost kernel: rtc_cmos 00:04: registered as rtc0
Feb 01 06:38:37 localhost kernel: rtc_cmos 00:04: setting system clock to 2026-02-01T06:38:36 UTC (1769927916)
Feb 01 06:38:37 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Feb 01 06:38:37 localhost kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Feb 01 06:38:37 localhost kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Feb 01 06:38:37 localhost kernel: hid: raw HID events driver (C) Jiri Kosina
Feb 01 06:38:37 localhost kernel: usbcore: registered new interface driver usbhid
Feb 01 06:38:37 localhost kernel: usbhid: USB HID core driver
Feb 01 06:38:37 localhost kernel: drop_monitor: Initializing network drop monitor service
Feb 01 06:38:37 localhost kernel: Initializing XFRM netlink socket
Feb 01 06:38:37 localhost kernel: NET: Registered PF_INET6 protocol family
Feb 01 06:38:37 localhost kernel: Segment Routing with IPv6
Feb 01 06:38:37 localhost kernel: NET: Registered PF_PACKET protocol family
Feb 01 06:38:37 localhost kernel: mpls_gso: MPLS GSO support
Feb 01 06:38:37 localhost kernel: IPI shorthand broadcast: enabled
Feb 01 06:38:37 localhost kernel: AVX2 version of gcm_enc/dec engaged.
Feb 01 06:38:37 localhost kernel: AES CTR mode by8 optimization enabled
Feb 01 06:38:37 localhost kernel: sched_clock: Marking stable (764870699, 178189124)->(1070359052, -127299229)
Feb 01 06:38:37 localhost kernel: registered taskstats version 1
Feb 01 06:38:37 localhost kernel: Loading compiled-in X.509 certificates
Feb 01 06:38:37 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 01 06:38:37 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Feb 01 06:38:37 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Feb 01 06:38:37 localhost kernel: zswap: loaded using pool lzo/zbud
Feb 01 06:38:37 localhost kernel: page_owner is disabled
Feb 01 06:38:37 localhost kernel: Key type big_key registered
Feb 01 06:38:37 localhost kernel: Freeing initrd memory: 74232K
Feb 01 06:38:37 localhost kernel: Key type encrypted registered
Feb 01 06:38:37 localhost kernel: ima: No TPM chip found, activating TPM-bypass!
Feb 01 06:38:37 localhost kernel: Loading compiled-in module X.509 certificates
Feb 01 06:38:37 localhost kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kernel signing key: aaec4b640ef162b54684864066c7d4ffd428cd72'
Feb 01 06:38:37 localhost kernel: ima: Allocated hash algorithm: sha256
Feb 01 06:38:37 localhost kernel: ima: No architecture policies found
Feb 01 06:38:37 localhost kernel: evm: Initialising EVM extended attributes:
Feb 01 06:38:37 localhost kernel: evm: security.selinux
Feb 01 06:38:37 localhost kernel: evm: security.SMACK64 (disabled)
Feb 01 06:38:37 localhost kernel: evm: security.SMACK64EXEC (disabled)
Feb 01 06:38:37 localhost kernel: evm: security.SMACK64TRANSMUTE (disabled)
Feb 01 06:38:37 localhost kernel: evm: security.SMACK64MMAP (disabled)
Feb 01 06:38:37 localhost kernel: evm: security.apparmor (disabled)
Feb 01 06:38:37 localhost kernel: evm: security.ima
Feb 01 06:38:37 localhost kernel: evm: security.capability
Feb 01 06:38:37 localhost kernel: evm: HMAC attrs: 0x1
Feb 01 06:38:37 localhost kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Feb 01 06:38:37 localhost kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Feb 01 06:38:37 localhost kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Feb 01 06:38:37 localhost kernel: usb 1-1: Product: QEMU USB Tablet
Feb 01 06:38:37 localhost kernel: usb 1-1: Manufacturer: QEMU
Feb 01 06:38:37 localhost kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Feb 01 06:38:37 localhost kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Feb 01 06:38:37 localhost kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Feb 01 06:38:37 localhost kernel: Freeing unused decrypted memory: 2036K
Feb 01 06:38:37 localhost kernel: Freeing unused kernel image (initmem) memory: 2792K
Feb 01 06:38:37 localhost kernel: Write protecting the kernel read-only data: 26624k
Feb 01 06:38:37 localhost kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K
Feb 01 06:38:37 localhost kernel: Freeing unused kernel image (rodata/data gap) memory: 60K
Feb 01 06:38:37 localhost kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Feb 01 06:38:37 localhost kernel: Run /init as init process
Feb 01 06:38:37 localhost kernel:   with arguments:
Feb 01 06:38:37 localhost kernel:     /init
Feb 01 06:38:37 localhost kernel:   with environment:
Feb 01 06:38:37 localhost kernel:     HOME=/
Feb 01 06:38:37 localhost kernel:     TERM=linux
Feb 01 06:38:37 localhost kernel:     BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64
Feb 01 06:38:37 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 01 06:38:37 localhost systemd[1]: Detected virtualization kvm.
Feb 01 06:38:37 localhost systemd[1]: Detected architecture x86-64.
Feb 01 06:38:37 localhost systemd[1]: Running in initrd.
Feb 01 06:38:37 localhost systemd[1]: No hostname configured, using default hostname.
Feb 01 06:38:37 localhost systemd[1]: Hostname set to <localhost>.
Feb 01 06:38:37 localhost systemd[1]: Initializing machine ID from VM UUID.
Feb 01 06:38:37 localhost systemd[1]: Queued start job for default target Initrd Default Target.
Feb 01 06:38:37 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 01 06:38:37 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 01 06:38:37 localhost systemd[1]: Reached target Initrd /usr File System.
Feb 01 06:38:37 localhost systemd[1]: Reached target Local File Systems.
Feb 01 06:38:37 localhost systemd[1]: Reached target Path Units.
Feb 01 06:38:37 localhost systemd[1]: Reached target Slice Units.
Feb 01 06:38:37 localhost systemd[1]: Reached target Swaps.
Feb 01 06:38:37 localhost systemd[1]: Reached target Timer Units.
Feb 01 06:38:37 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 01 06:38:37 localhost systemd[1]: Listening on Journal Socket (/dev/log).
Feb 01 06:38:37 localhost systemd[1]: Listening on Journal Socket.
Feb 01 06:38:37 localhost systemd[1]: Listening on udev Control Socket.
Feb 01 06:38:37 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 01 06:38:37 localhost systemd[1]: Reached target Socket Units.
Feb 01 06:38:37 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 01 06:38:37 localhost systemd[1]: Starting Journal Service...
Feb 01 06:38:37 localhost systemd[1]: Starting Load Kernel Modules...
Feb 01 06:38:37 localhost systemd[1]: Starting Create System Users...
Feb 01 06:38:37 localhost systemd[1]: Starting Setup Virtual Console...
Feb 01 06:38:37 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 01 06:38:37 localhost systemd[1]: Finished Load Kernel Modules.
Feb 01 06:38:37 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 01 06:38:37 localhost systemd-journald[284]: Journal started
Feb 01 06:38:37 localhost systemd-journald[284]: Runtime Journal (/run/log/journal/b72fb79934724728b6e2ec98d2bbb61b) is 8.0M, max 314.7M, 306.7M free.
Feb 01 06:38:37 localhost systemd-modules-load[285]: Module 'msr' is built in
Feb 01 06:38:37 localhost systemd[1]: Started Journal Service.
Feb 01 06:38:37 localhost systemd[1]: Finished Setup Virtual Console.
Feb 01 06:38:37 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 01 06:38:37 localhost systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Feb 01 06:38:37 localhost systemd[1]: Starting dracut cmdline hook...
Feb 01 06:38:37 localhost systemd-sysusers[286]: Creating group 'sgx' with GID 997.
Feb 01 06:38:37 localhost systemd-sysusers[286]: Creating group 'users' with GID 100.
Feb 01 06:38:37 localhost systemd-sysusers[286]: Creating group 'dbus' with GID 81.
Feb 01 06:38:37 localhost systemd-sysusers[286]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Feb 01 06:38:37 localhost systemd[1]: Finished Create System Users.
Feb 01 06:38:37 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 01 06:38:37 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 01 06:38:37 localhost dracut-cmdline[291]: dracut-9.2 (Plow) dracut-057-21.git20230214.el9
Feb 01 06:38:37 localhost dracut-cmdline[291]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,gpt3)/vmlinuz-5.14.0-284.11.1.el9_2.x86_64 root=UUID=a3dd82de-ffc6-4652-88b9-80e003b8f20a console=tty0 console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M
Feb 01 06:38:37 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 01 06:38:37 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 01 06:38:37 localhost systemd[1]: Finished dracut cmdline hook.
Feb 01 06:38:37 localhost systemd[1]: Starting dracut pre-udev hook...
Feb 01 06:38:37 localhost kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Feb 01 06:38:37 localhost kernel: device-mapper: uevent: version 1.0.3
Feb 01 06:38:37 localhost kernel: device-mapper: ioctl: 4.47.0-ioctl (2022-07-28) initialised: dm-devel@redhat.com
Feb 01 06:38:37 localhost kernel: RPC: Registered named UNIX socket transport module.
Feb 01 06:38:37 localhost kernel: RPC: Registered udp transport module.
Feb 01 06:38:37 localhost kernel: RPC: Registered tcp transport module.
Feb 01 06:38:37 localhost kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Feb 01 06:38:37 localhost rpc.statd[408]: Version 2.5.4 starting
Feb 01 06:38:37 localhost rpc.statd[408]: Initializing NSM state
Feb 01 06:38:37 localhost rpc.idmapd[413]: Setting log level to 0
Feb 01 06:38:37 localhost systemd[1]: Finished dracut pre-udev hook.
Feb 01 06:38:37 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 01 06:38:37 localhost systemd-udevd[426]: Using default interface naming scheme 'rhel-9.0'.
Feb 01 06:38:37 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 01 06:38:37 localhost systemd[1]: Starting dracut pre-trigger hook...
Feb 01 06:38:37 localhost systemd[1]: Finished dracut pre-trigger hook.
Feb 01 06:38:37 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 01 06:38:37 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 01 06:38:37 localhost systemd[1]: Reached target System Initialization.
Feb 01 06:38:37 localhost systemd[1]: Reached target Basic System.
Feb 01 06:38:37 localhost systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 01 06:38:37 localhost systemd[1]: Reached target Network.
Feb 01 06:38:37 localhost systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Feb 01 06:38:37 localhost systemd[1]: Starting dracut initqueue hook...
Feb 01 06:38:37 localhost kernel: virtio_blk virtio2: [vda] 838860800 512-byte logical blocks (429 GB/400 GiB)
Feb 01 06:38:37 localhost kernel: libata version 3.00 loaded.
Feb 01 06:38:37 localhost kernel: ata_piix 0000:00:01.1: version 2.13
Feb 01 06:38:37 localhost kernel: scsi host0: ata_piix
Feb 01 06:38:37 localhost kernel: scsi host1: ata_piix
Feb 01 06:38:37 localhost kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Feb 01 06:38:37 localhost kernel: GPT:20971519 != 838860799
Feb 01 06:38:37 localhost kernel: GPT:Alternate GPT header not at the end of the disk.
Feb 01 06:38:37 localhost kernel: GPT:20971519 != 838860799
Feb 01 06:38:37 localhost kernel: GPT: Use GNU Parted to correct GPT errors.
Feb 01 06:38:37 localhost kernel:  vda: vda1 vda2 vda3 vda4
Feb 01 06:38:37 localhost kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14
Feb 01 06:38:37 localhost kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15
Feb 01 06:38:37 localhost systemd-udevd[443]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 06:38:37 localhost systemd[1]: Found device /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 01 06:38:37 localhost systemd[1]: Reached target Initrd Root Device.
Feb 01 06:38:37 localhost kernel: ata1: found unknown device (class 0)
Feb 01 06:38:37 localhost kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Feb 01 06:38:37 localhost kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Feb 01 06:38:37 localhost kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Feb 01 06:38:37 localhost kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Feb 01 06:38:37 localhost kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Feb 01 06:38:37 localhost kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0
Feb 01 06:38:38 localhost systemd[1]: Finished dracut initqueue hook.
Feb 01 06:38:38 localhost systemd[1]: Reached target Preparation for Remote File Systems.
Feb 01 06:38:38 localhost systemd[1]: Reached target Remote Encrypted Volumes.
Feb 01 06:38:38 localhost systemd[1]: Reached target Remote File Systems.
Feb 01 06:38:38 localhost systemd[1]: Starting dracut pre-mount hook...
Feb 01 06:38:38 localhost systemd[1]: Finished dracut pre-mount hook.
Feb 01 06:38:38 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a...
Feb 01 06:38:38 localhost systemd-fsck[513]: /usr/sbin/fsck.xfs: XFS file system.
Feb 01 06:38:38 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a.
Feb 01 06:38:38 localhost systemd[1]: Mounting /sysroot...
Feb 01 06:38:38 localhost kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Feb 01 06:38:38 localhost kernel: XFS (vda4): Mounting V5 Filesystem
Feb 01 06:38:38 localhost kernel: XFS (vda4): Ending clean mount
Feb 01 06:38:38 localhost systemd[1]: Mounted /sysroot.
Feb 01 06:38:38 localhost systemd[1]: Reached target Initrd Root File System.
Feb 01 06:38:38 localhost systemd[1]: Starting Mountpoints Configured in the Real Root...
Feb 01 06:38:38 localhost systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Finished Mountpoints Configured in the Real Root.
Feb 01 06:38:38 localhost systemd[1]: Reached target Initrd File Systems.
Feb 01 06:38:38 localhost systemd[1]: Reached target Initrd Default Target.
Feb 01 06:38:38 localhost systemd[1]: Starting dracut mount hook...
Feb 01 06:38:38 localhost systemd[1]: Finished dracut mount hook.
Feb 01 06:38:38 localhost systemd[1]: Starting dracut pre-pivot and cleanup hook...
Feb 01 06:38:38 localhost rpc.idmapd[413]: exiting on signal 15
Feb 01 06:38:38 localhost systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Finished dracut pre-pivot and cleanup hook.
Feb 01 06:38:38 localhost systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Feb 01 06:38:38 localhost systemd[1]: Stopped target Network.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Remote Encrypted Volumes.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Timer Units.
Feb 01 06:38:38 localhost systemd[1]: dbus.socket: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Closed D-Bus System Message Bus Socket.
Feb 01 06:38:38 localhost systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Initrd Default Target.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Basic System.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Initrd Root Device.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Initrd /usr File System.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Path Units.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Remote File Systems.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Preparation for Remote File Systems.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Slice Units.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Socket Units.
Feb 01 06:38:38 localhost systemd[1]: Stopped target System Initialization.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Local File Systems.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Swaps.
Feb 01 06:38:38 localhost systemd[1]: dracut-mount.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped dracut mount hook.
Feb 01 06:38:38 localhost systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped dracut pre-mount hook.
Feb 01 06:38:38 localhost systemd[1]: Stopped target Local Encrypted Volumes.
Feb 01 06:38:38 localhost systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Feb 01 06:38:38 localhost systemd[1]: dracut-initqueue.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped dracut initqueue hook.
Feb 01 06:38:38 localhost systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped Apply Kernel Variables.
Feb 01 06:38:38 localhost systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped Load Kernel Modules.
Feb 01 06:38:38 localhost systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped Create Volatile Files and Directories.
Feb 01 06:38:38 localhost systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped Coldplug All udev Devices.
Feb 01 06:38:38 localhost systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped dracut pre-trigger hook.
Feb 01 06:38:38 localhost systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 01 06:38:38 localhost systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped Setup Virtual Console.
Feb 01 06:38:38 localhost systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 01 06:38:38 localhost systemd[1]: initrd-cleanup.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Feb 01 06:38:38 localhost systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Closed udev Control Socket.
Feb 01 06:38:38 localhost systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Closed udev Kernel Socket.
Feb 01 06:38:38 localhost systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped dracut pre-udev hook.
Feb 01 06:38:38 localhost systemd[1]: dracut-cmdline.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped dracut cmdline hook.
Feb 01 06:38:38 localhost systemd[1]: Starting Cleanup udev Database...
Feb 01 06:38:38 localhost systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped Create Static Device Nodes in /dev.
Feb 01 06:38:38 localhost systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped Create List of Static Device Nodes.
Feb 01 06:38:38 localhost systemd[1]: systemd-sysusers.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Stopped Create System Users.
Feb 01 06:38:38 localhost systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Feb 01 06:38:38 localhost systemd[1]: Finished Cleanup udev Database.
Feb 01 06:38:38 localhost systemd[1]: Reached target Switch Root.
Feb 01 06:38:38 localhost systemd[1]: Starting Switch Root...
Feb 01 06:38:38 localhost systemd[1]: Switching root.
Feb 01 06:38:38 localhost systemd-journald[284]: Journal stopped
Feb 01 06:38:39 localhost systemd-journald[284]: Received SIGTERM from PID 1 (systemd).
Feb 01 06:38:39 localhost kernel: audit: type=1404 audit(1769927918.797:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Feb 01 06:38:39 localhost kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 06:38:39 localhost kernel: SELinux:  policy capability open_perms=1
Feb 01 06:38:39 localhost kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 06:38:39 localhost kernel: SELinux:  policy capability always_check_network=0
Feb 01 06:38:39 localhost kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 06:38:39 localhost kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 06:38:39 localhost kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 06:38:39 localhost kernel: audit: type=1403 audit(1769927918.957:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Feb 01 06:38:39 localhost systemd[1]: Successfully loaded SELinux policy in 163.182ms.
Feb 01 06:38:39 localhost systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 35.512ms.
Feb 01 06:38:39 localhost systemd[1]: systemd 252-13.el9_2 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 01 06:38:39 localhost systemd[1]: Detected virtualization kvm.
Feb 01 06:38:39 localhost systemd[1]: Detected architecture x86-64.
Feb 01 06:38:39 localhost systemd-rc-local-generator[583]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 06:38:39 localhost systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 06:38:39 localhost systemd[1]: initrd-switch-root.service: Deactivated successfully.
Feb 01 06:38:39 localhost systemd[1]: Stopped Switch Root.
Feb 01 06:38:39 localhost systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Feb 01 06:38:39 localhost systemd[1]: Created slice Slice /system/getty.
Feb 01 06:38:39 localhost systemd[1]: Created slice Slice /system/modprobe.
Feb 01 06:38:39 localhost systemd[1]: Created slice Slice /system/serial-getty.
Feb 01 06:38:39 localhost systemd[1]: Created slice Slice /system/sshd-keygen.
Feb 01 06:38:39 localhost systemd[1]: Created slice Slice /system/systemd-fsck.
Feb 01 06:38:39 localhost systemd[1]: Created slice User and Session Slice.
Feb 01 06:38:39 localhost systemd[1]: Started Dispatch Password Requests to Console Directory Watch.
Feb 01 06:38:39 localhost systemd[1]: Started Forward Password Requests to Wall Directory Watch.
Feb 01 06:38:39 localhost systemd[1]: Set up automount Arbitrary Executable File Formats File System Automount Point.
Feb 01 06:38:39 localhost systemd[1]: Reached target Local Encrypted Volumes.
Feb 01 06:38:39 localhost systemd[1]: Stopped target Switch Root.
Feb 01 06:38:39 localhost systemd[1]: Stopped target Initrd File Systems.
Feb 01 06:38:39 localhost systemd[1]: Stopped target Initrd Root File System.
Feb 01 06:38:39 localhost systemd[1]: Reached target Local Integrity Protected Volumes.
Feb 01 06:38:39 localhost systemd[1]: Reached target Path Units.
Feb 01 06:38:39 localhost systemd[1]: Reached target rpc_pipefs.target.
Feb 01 06:38:39 localhost systemd[1]: Reached target Slice Units.
Feb 01 06:38:39 localhost systemd[1]: Reached target Swaps.
Feb 01 06:38:39 localhost systemd[1]: Reached target Local Verity Protected Volumes.
Feb 01 06:38:39 localhost systemd[1]: Listening on RPCbind Server Activation Socket.
Feb 01 06:38:39 localhost systemd[1]: Reached target RPC Port Mapper.
Feb 01 06:38:39 localhost systemd[1]: Listening on Process Core Dump Socket.
Feb 01 06:38:39 localhost systemd[1]: Listening on initctl Compatibility Named Pipe.
Feb 01 06:38:39 localhost systemd[1]: Listening on udev Control Socket.
Feb 01 06:38:39 localhost systemd[1]: Listening on udev Kernel Socket.
Feb 01 06:38:39 localhost systemd[1]: Mounting Huge Pages File System...
Feb 01 06:38:39 localhost systemd[1]: Mounting POSIX Message Queue File System...
Feb 01 06:38:39 localhost systemd[1]: Mounting Kernel Debug File System...
Feb 01 06:38:39 localhost systemd[1]: Mounting Kernel Trace File System...
Feb 01 06:38:39 localhost systemd[1]: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 01 06:38:39 localhost systemd[1]: Starting Create List of Static Device Nodes...
Feb 01 06:38:39 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 01 06:38:39 localhost systemd[1]: Starting Load Kernel Module drm...
Feb 01 06:38:39 localhost systemd[1]: Starting Load Kernel Module fuse...
Feb 01 06:38:39 localhost systemd[1]: Starting Read and set NIS domainname from /etc/sysconfig/network...
Feb 01 06:38:39 localhost systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Feb 01 06:38:39 localhost systemd[1]: Stopped File System Check on Root Device.
Feb 01 06:38:39 localhost systemd[1]: Stopped Journal Service.
Feb 01 06:38:39 localhost systemd[1]: Starting Journal Service...
Feb 01 06:38:39 localhost systemd[1]: Starting Load Kernel Modules...
Feb 01 06:38:39 localhost kernel: fuse: init (API version 7.36)
Feb 01 06:38:39 localhost systemd[1]: Starting Generate network units from Kernel command line...
Feb 01 06:38:39 localhost systemd[1]: Starting Remount Root and Kernel File Systems...
Feb 01 06:38:39 localhost systemd[1]: Repartition Root Disk was skipped because no trigger condition checks were met.
Feb 01 06:38:39 localhost systemd[1]: Starting Coldplug All udev Devices...
Feb 01 06:38:39 localhost systemd[1]: Mounted Huge Pages File System.
Feb 01 06:38:39 localhost kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Feb 01 06:38:39 localhost systemd[1]: Mounted POSIX Message Queue File System.
Feb 01 06:38:39 localhost systemd[1]: Mounted Kernel Debug File System.
Feb 01 06:38:39 localhost systemd[1]: Mounted Kernel Trace File System.
Feb 01 06:38:39 localhost systemd-journald[619]: Journal started
Feb 01 06:38:39 localhost systemd-journald[619]: Runtime Journal (/run/log/journal/00836dadc27b01f9fb0a211cca69e688) is 8.0M, max 314.7M, 306.7M free.
Feb 01 06:38:39 localhost systemd[1]: Queued start job for default target Multi-User System.
Feb 01 06:38:39 localhost systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 01 06:38:39 localhost systemd-modules-load[620]: Module 'msr' is built in
Feb 01 06:38:39 localhost systemd[1]: Started Journal Service.
Feb 01 06:38:39 localhost systemd[1]: Finished Create List of Static Device Nodes.
Feb 01 06:38:39 localhost kernel: ACPI: bus type drm_connector registered
Feb 01 06:38:39 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 01 06:38:39 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 01 06:38:39 localhost systemd[1]: modprobe@drm.service: Deactivated successfully.
Feb 01 06:38:39 localhost systemd[1]: Finished Load Kernel Module drm.
Feb 01 06:38:39 localhost systemd[1]: modprobe@fuse.service: Deactivated successfully.
Feb 01 06:38:39 localhost systemd[1]: Finished Load Kernel Module fuse.
Feb 01 06:38:39 localhost systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Feb 01 06:38:39 localhost systemd[1]: Finished Load Kernel Modules.
Feb 01 06:38:39 localhost systemd[1]: Finished Generate network units from Kernel command line.
Feb 01 06:38:39 localhost systemd[1]: Finished Remount Root and Kernel File Systems.
Feb 01 06:38:39 localhost systemd[1]: Mounting FUSE Control File System...
Feb 01 06:38:39 localhost systemd[1]: Mounting Kernel Configuration File System...
Feb 01 06:38:39 localhost systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 01 06:38:39 localhost systemd[1]: Starting Rebuild Hardware Database...
Feb 01 06:38:39 localhost systemd[1]: Starting Flush Journal to Persistent Storage...
Feb 01 06:38:39 localhost systemd[1]: Starting Load/Save Random Seed...
Feb 01 06:38:39 localhost systemd[1]: Starting Apply Kernel Variables...
Feb 01 06:38:39 localhost systemd[1]: Starting Create System Users...
Feb 01 06:38:39 localhost systemd-journald[619]: Runtime Journal (/run/log/journal/00836dadc27b01f9fb0a211cca69e688) is 8.0M, max 314.7M, 306.7M free.
Feb 01 06:38:39 localhost systemd-journald[619]: Received client request to flush runtime journal.
Feb 01 06:38:39 localhost systemd[1]: Mounted FUSE Control File System.
Feb 01 06:38:39 localhost systemd[1]: Mounted Kernel Configuration File System.
Feb 01 06:38:39 localhost systemd[1]: Finished Flush Journal to Persistent Storage.
Feb 01 06:38:39 localhost systemd[1]: Finished Load/Save Random Seed.
Feb 01 06:38:39 localhost systemd[1]: Finished Apply Kernel Variables.
Feb 01 06:38:39 localhost systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Feb 01 06:38:39 localhost systemd-sysusers[632]: Creating group 'sgx' with GID 989.
Feb 01 06:38:39 localhost systemd-sysusers[632]: Creating group 'systemd-oom' with GID 988.
Feb 01 06:38:39 localhost systemd-sysusers[632]: Creating user 'systemd-oom' (systemd Userspace OOM Killer) with UID 988 and GID 988.
Feb 01 06:38:39 localhost systemd[1]: Finished Create System Users.
Feb 01 06:38:39 localhost systemd[1]: Starting Create Static Device Nodes in /dev...
Feb 01 06:38:39 localhost systemd[1]: Finished Coldplug All udev Devices.
Feb 01 06:38:39 localhost systemd[1]: Finished Create Static Device Nodes in /dev.
Feb 01 06:38:39 localhost systemd[1]: Reached target Preparation for Local File Systems.
Feb 01 06:38:39 localhost systemd[1]: Set up automount EFI System Partition Automount.
Feb 01 06:38:40 localhost systemd[1]: Finished Rebuild Hardware Database.
Feb 01 06:38:40 localhost systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 01 06:38:40 localhost systemd-udevd[636]: Using default interface naming scheme 'rhel-9.0'.
Feb 01 06:38:40 localhost systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 01 06:38:40 localhost systemd[1]: Starting Load Kernel Module configfs...
Feb 01 06:38:40 localhost systemd[1]: modprobe@configfs.service: Deactivated successfully.
Feb 01 06:38:40 localhost systemd[1]: Finished Load Kernel Module configfs.
Feb 01 06:38:40 localhost systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Feb 01 06:38:40 localhost systemd-udevd[641]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 06:38:40 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/7B77-95E7 being skipped.
Feb 01 06:38:40 localhost systemd[1]: Starting File System Check on /dev/disk/by-uuid/7B77-95E7...
Feb 01 06:38:40 localhost kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Feb 01 06:38:40 localhost kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Feb 01 06:38:40 localhost systemd[1]: Condition check resulted in /dev/disk/by-uuid/b141154b-6a70-437a-a97f-d160c9ba37eb being skipped.
Feb 01 06:38:40 localhost systemd-fsck[681]: fsck.fat 4.2 (2021-01-31)
Feb 01 06:38:40 localhost systemd-fsck[681]: /dev/vda2: 12 files, 1782/51145 clusters
Feb 01 06:38:40 localhost systemd[1]: Finished File System Check on /dev/disk/by-uuid/7B77-95E7.
Feb 01 06:38:40 localhost kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Feb 01 06:38:40 localhost kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Feb 01 06:38:40 localhost kernel: Console: switching to colour dummy device 80x25
Feb 01 06:38:40 localhost kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Feb 01 06:38:40 localhost kernel: [drm] features: -context_init
Feb 01 06:38:40 localhost kernel: [drm] number of scanouts: 1
Feb 01 06:38:40 localhost kernel: [drm] number of cap sets: 0
Feb 01 06:38:40 localhost kernel: [drm] Initialized virtio_gpu 0.1.0 0 for virtio0 on minor 0
Feb 01 06:38:40 localhost kernel: virtio_gpu virtio0: [drm] drm_plane_enable_fb_damage_clips() not called
Feb 01 06:38:40 localhost kernel: Console: switching to colour frame buffer device 128x48
Feb 01 06:38:40 localhost kernel: SVM: TSC scaling supported
Feb 01 06:38:40 localhost kernel: kvm: Nested Virtualization enabled
Feb 01 06:38:40 localhost kernel: SVM: kvm: Nested Paging enabled
Feb 01 06:38:40 localhost kernel: virtio_gpu virtio0: [drm] fb0: virtio_gpudrmfb frame buffer device
Feb 01 06:38:40 localhost kernel: SVM: LBR virtualization supported
Feb 01 06:38:40 localhost systemd[1]: Mounting /boot...
Feb 01 06:38:40 localhost kernel: XFS (vda3): Mounting V5 Filesystem
Feb 01 06:38:40 localhost kernel: XFS (vda3): Ending clean mount
Feb 01 06:38:40 localhost kernel: xfs filesystem being mounted at /boot supports timestamps until 2038 (0x7fffffff)
Feb 01 06:38:40 localhost systemd[1]: Mounted /boot.
Feb 01 06:38:40 localhost systemd[1]: Mounting /boot/efi...
Feb 01 06:38:40 localhost systemd[1]: Mounted /boot/efi.
Feb 01 06:38:40 localhost systemd[1]: Reached target Local File Systems.
Feb 01 06:38:40 localhost systemd[1]: Starting Rebuild Dynamic Linker Cache...
Feb 01 06:38:40 localhost systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Feb 01 06:38:40 localhost systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Feb 01 06:38:40 localhost systemd[1]: Store a System Token in an EFI Variable was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 01 06:38:40 localhost systemd[1]: Starting Automatic Boot Loader Update...
Feb 01 06:38:40 localhost systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Feb 01 06:38:40 localhost systemd[1]: Starting Create Volatile Files and Directories...
Feb 01 06:38:40 localhost systemd[1]: efi.automount: Got automount request for /efi, triggered by 718 (bootctl)
Feb 01 06:38:40 localhost systemd[1]: Starting File System Check on /dev/vda2...
Feb 01 06:38:40 localhost systemd[1]: Finished File System Check on /dev/vda2.
Feb 01 06:38:40 localhost systemd[1]: Mounting EFI System Partition Automount...
Feb 01 06:38:40 localhost systemd[1]: Mounted EFI System Partition Automount.
Feb 01 06:38:40 localhost systemd[1]: Finished Automatic Boot Loader Update.
Feb 01 06:38:40 localhost systemd[1]: Finished Create Volatile Files and Directories.
Feb 01 06:38:40 localhost systemd[1]: Starting Security Auditing Service...
Feb 01 06:38:40 localhost systemd[1]: Starting RPC Bind...
Feb 01 06:38:40 localhost systemd[1]: Starting Rebuild Journal Catalog...
Feb 01 06:38:40 localhost auditd[727]: audit dispatcher initialized with q_depth=1200 and 1 active plugins
Feb 01 06:38:40 localhost auditd[727]: Init complete, auditd 3.0.7 listening for events (startup state enable)
Feb 01 06:38:40 localhost systemd[1]: Finished Rebuild Journal Catalog.
Feb 01 06:38:40 localhost systemd[1]: Started RPC Bind.
Feb 01 06:38:40 localhost augenrules[732]: /sbin/augenrules: No change
Feb 01 06:38:40 localhost augenrules[742]: No rules
Feb 01 06:38:40 localhost augenrules[742]: enabled 1
Feb 01 06:38:40 localhost augenrules[742]: failure 1
Feb 01 06:38:40 localhost augenrules[742]: pid 727
Feb 01 06:38:40 localhost augenrules[742]: rate_limit 0
Feb 01 06:38:40 localhost augenrules[742]: backlog_limit 8192
Feb 01 06:38:40 localhost augenrules[742]: lost 0
Feb 01 06:38:40 localhost augenrules[742]: backlog 4
Feb 01 06:38:40 localhost augenrules[742]: backlog_wait_time 60000
Feb 01 06:38:40 localhost augenrules[742]: backlog_wait_time_actual 0
Feb 01 06:38:40 localhost systemd[1]: Started Security Auditing Service.
Feb 01 06:38:40 localhost systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Feb 01 06:38:40 localhost systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Feb 01 06:38:41 localhost systemd[1]: Finished Rebuild Dynamic Linker Cache.
Feb 01 06:38:41 localhost systemd[1]: Starting Update is Completed...
Feb 01 06:38:41 localhost systemd[1]: Finished Update is Completed.
Feb 01 06:38:41 localhost systemd[1]: Reached target System Initialization.
Feb 01 06:38:41 localhost systemd[1]: Started dnf makecache --timer.
Feb 01 06:38:41 localhost systemd[1]: Started Daily rotation of log files.
Feb 01 06:38:41 localhost systemd[1]: Started Daily Cleanup of Temporary Directories.
Feb 01 06:38:41 localhost systemd[1]: Reached target Timer Units.
Feb 01 06:38:41 localhost systemd[1]: Listening on D-Bus System Message Bus Socket.
Feb 01 06:38:41 localhost systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Feb 01 06:38:41 localhost systemd[1]: Reached target Socket Units.
Feb 01 06:38:41 localhost systemd[1]: Starting Initial cloud-init job (pre-networking)...
Feb 01 06:38:41 localhost systemd[1]: Starting D-Bus System Message Bus...
Feb 01 06:38:41 localhost systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 01 06:38:41 localhost systemd[1]: Started D-Bus System Message Bus.
Feb 01 06:38:41 localhost systemd[1]: Reached target Basic System.
Feb 01 06:38:41 localhost dbus-broker-lau[752]: Ready
Feb 01 06:38:41 localhost systemd[1]: Starting NTP client/server...
Feb 01 06:38:41 localhost systemd[1]: Starting Restore /run/initramfs on shutdown...
Feb 01 06:38:41 localhost systemd[1]: Started irqbalance daemon.
Feb 01 06:38:41 localhost systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Feb 01 06:38:41 localhost systemd[1]: Starting System Logging Service...
Feb 01 06:38:41 localhost systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 06:38:41 localhost systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 06:38:41 localhost systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 06:38:41 localhost systemd[1]: Reached target sshd-keygen.target.
Feb 01 06:38:41 localhost systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Feb 01 06:38:41 localhost systemd[1]: Reached target User and Group Name Lookups.
Feb 01 06:38:41 localhost systemd[1]: Starting User Login Management...
Feb 01 06:38:41 localhost systemd[1]: Finished Restore /run/initramfs on shutdown.
Feb 01 06:38:41 localhost rsyslogd[760]: [origin software="rsyslogd" swVersion="8.2102.0-111.el9" x-pid="760" x-info="https://www.rsyslog.com"] start
Feb 01 06:38:41 localhost rsyslogd[760]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2040 ]
Feb 01 06:38:41 localhost systemd[1]: Started System Logging Service.
Feb 01 06:38:41 localhost chronyd[767]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 01 06:38:41 localhost chronyd[767]: Using right/UTC timezone to obtain leap second data
Feb 01 06:38:41 localhost chronyd[767]: Loaded seccomp filter (level 2)
Feb 01 06:38:41 localhost systemd[1]: Started NTP client/server.
Feb 01 06:38:41 localhost systemd-logind[761]: New seat seat0.
Feb 01 06:38:41 localhost systemd-logind[761]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 01 06:38:41 localhost systemd-logind[761]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 01 06:38:41 localhost systemd[1]: Started User Login Management.
Feb 01 06:38:41 localhost rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 06:38:41 localhost cloud-init[771]: Cloud-init v. 22.1-9.el9 running 'init-local' at Sun, 01 Feb 2026 06:38:41 +0000. Up 5.89 seconds.
Feb 01 06:38:41 localhost kernel: ISO 9660 Extensions: Microsoft Joliet Level 3
Feb 01 06:38:41 localhost kernel: ISO 9660 Extensions: RRIP_1991A
Feb 01 06:38:41 localhost systemd[1]: run-cloud\x2dinit-tmp-tmpyreazcb8.mount: Deactivated successfully.
Feb 01 06:38:42 localhost systemd[1]: Starting Hostname Service...
Feb 01 06:38:42 localhost systemd[1]: Started Hostname Service.
Feb 01 06:38:42 np0005604215.novalocal systemd-hostnamed[785]: Hostname set to <np0005604215.novalocal> (static)
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Finished Initial cloud-init job (pre-networking).
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Reached target Preparation for Network.
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Starting Network Manager...
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.2537] NetworkManager (version 1.42.2-1.el9) is starting... (boot:f77db588-715c-4e22-a8c7-41daa1528c92)
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.2545] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Started Network Manager.
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.2587] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Reached target Network.
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Starting GSSAPI Proxy Daemon...
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Starting Enable periodic update of entitlement certificates....
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.2693] manager[0x55bdea407020]: monitoring kernel firmware directory '/lib/firmware'.
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.2729] hostname: hostname: using hostnamed
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.2729] hostname: static hostname changed from (none) to "np0005604215.novalocal"
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.2740] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Started Enable periodic update of entitlement certificates..
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Started GSSAPI Proxy Daemon.
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Reached target NFS client services.
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Reached target Preparation for Remote File Systems.
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Reached target Remote File Systems.
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.2920] manager[0x55bdea407020]: rfkill: Wi-Fi hardware radio set enabled
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.2920] manager[0x55bdea407020]: rfkill: WWAN hardware radio set enabled
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3006] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3007] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3021] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3021] manager: Networking is enabled by state file
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3060] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3060] settings: Loaded settings plugin: keyfile (internal)
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3092] dhcp: init: Using DHCP client 'internal'
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3098] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3112] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3119] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3128] device (lo): Activation: starting connection 'lo' (d27cc6ff-3b23-4411-8524-3e0f36165c06)
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3139] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3145] device (eth0): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3182] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3185] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3187] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3189] device (eth0): carrier: link connected
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3194] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3201] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3209] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3213] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3214] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3215] manager: NetworkManager state is now CONNECTING
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3216] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3223] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3226] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3261] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3263] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.3268] device (lo): Activation: successful, device activated.
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.4366] dhcp4 (eth0): state changed new lease, address=38.102.83.164
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.4376] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.4427] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'managed')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.4449] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'managed')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.4453] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'managed')
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.4460] manager: NetworkManager state is now CONNECTED_SITE
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.4467] device (eth0): Activation: successful, device activated.
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.4476] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 01 06:38:42 np0005604215.novalocal NetworkManager[790]: <info>  [1769927922.4482] manager: startup complete
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Starting Initial cloud-init job (metadata service crawler)...
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Starting Authorization Manager...
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Started Dynamic System Tuning Daemon.
Feb 01 06:38:42 np0005604215.novalocal polkitd[1029]: Started polkitd version 0.117
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: Cloud-init v. 22.1-9.el9 running 'init' at Sun, 01 Feb 2026 06:38:42 +0000. Up 6.88 seconds.
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: |  eth0  | True |        38.102.83.164         | 255.255.255.0 | global | fa:16:3e:d0:c8:c4 |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: |  eth0  | True | fe80::f816:3eff:fed0:c8c4/64 |       .       |  link  | fa:16:3e:d0:c8:c4 |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Feb 01 06:38:42 np0005604215.novalocal cloud-init[1036]: ci-info: +-------+-------------+---------+-----------+-------+
Feb 01 06:38:42 np0005604215.novalocal polkitd[1029]: Loading rules from directory /etc/polkit-1/rules.d
Feb 01 06:38:42 np0005604215.novalocal polkitd[1029]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 01 06:38:42 np0005604215.novalocal polkitd[1029]: Finished loading, compiling and executing 4 rules
Feb 01 06:38:42 np0005604215.novalocal systemd[1]: Started Authorization Manager.
Feb 01 06:38:42 np0005604215.novalocal polkitd[1029]: Acquired the name org.freedesktop.PolicyKit1 on the system bus
Feb 01 06:38:45 np0005604215.novalocal useradd[1120]: new group: name=cloud-user, GID=1001
Feb 01 06:38:45 np0005604215.novalocal useradd[1120]: new user: name=cloud-user, UID=1001, GID=1001, home=/home/cloud-user, shell=/bin/bash, from=none
Feb 01 06:38:45 np0005604215.novalocal useradd[1120]: add 'cloud-user' to group 'adm'
Feb 01 06:38:45 np0005604215.novalocal useradd[1120]: add 'cloud-user' to group 'systemd-journal'
Feb 01 06:38:45 np0005604215.novalocal useradd[1120]: add 'cloud-user' to shadow group 'adm'
Feb 01 06:38:45 np0005604215.novalocal useradd[1120]: add 'cloud-user' to shadow group 'systemd-journal'
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: Generating public/private rsa key pair.
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: The key fingerprint is:
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: SHA256:MAOwyvUX8xeemfJ1b1V1PqF7eCjCaNGXGuB7Hgwq9v4 root@np0005604215.novalocal
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: The key's randomart image is:
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: +---[RSA 3072]----+
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |  ...  .       .o|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |   . .. o   . ..+|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |  ..  +* o + . .o|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |... . .+@ = = + o|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |.. o o =SX B = +.|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |  . o o o * o + o|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |     .   . .    o|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |    .          . |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |     ..E         |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: +----[SHA256]-----+
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: Generating public/private ecdsa key pair.
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: The key fingerprint is:
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: SHA256:4cdo6wbuJ+6z8De1ig5y8uOBmXOaH2o/A5y25AffQFA root@np0005604215.novalocal
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: The key's randomart image is:
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: +---[ECDSA 256]---+
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |   .E            |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |  .              |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |   .    .        |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |    .  . +       |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: | . o    S o      |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |  B+. .. o.      |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: | +B=B+ ... .     |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |  +@**=o= .      |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: | .+==OOBoo       |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: +----[SHA256]-----+
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: Generating public/private ed25519 key pair.
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: The key fingerprint is:
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: SHA256:7V//jOvOIaArPbv+zf3jwaAVfF9gL7Joj+dLzfYzdak root@np0005604215.novalocal
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: The key's randomart image is:
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: +--[ED25519 256]--+
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |              o  |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |            .. o |
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |            .o..o|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |         . . oo.o|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |        S = .o  o|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |         + +ooo.o|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |       .. o.= Boo|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |      . o. B E X.|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: |       o==. *+O+@|
Feb 01 06:38:45 np0005604215.novalocal cloud-init[1036]: +----[SHA256]-----+
Feb 01 06:38:46 np0005604215.novalocal sm-notify[1133]: Version 2.5.4 starting
Feb 01 06:38:45 np0005604215.novalocal systemd[1]: Finished Initial cloud-init job (metadata service crawler).
Feb 01 06:38:46 np0005604215.novalocal crond[1139]: (CRON) STARTUP (1.5.7)
Feb 01 06:38:45 np0005604215.novalocal systemd[1]: Reached target Cloud-config availability.
Feb 01 06:38:46 np0005604215.novalocal crond[1139]: (CRON) INFO (Syslog will be used instead of sendmail.)
Feb 01 06:38:45 np0005604215.novalocal systemd[1]: Reached target Network is Online.
Feb 01 06:38:46 np0005604215.novalocal crond[1139]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 87% if used.)
Feb 01 06:38:45 np0005604215.novalocal systemd[1]: Starting Apply the settings specified in cloud-config...
Feb 01 06:38:46 np0005604215.novalocal crond[1139]: (CRON) INFO (running with inotify support)
Feb 01 06:38:46 np0005604215.novalocal sshd[1134]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:38:45 np0005604215.novalocal systemd[1]: Run Insights Client at boot was skipped because of an unmet condition check (ConditionPathExists=/etc/insights-client/.run_insights_client_next_boot).
Feb 01 06:38:46 np0005604215.novalocal sshd[1134]: Server listening on 0.0.0.0 port 22.
Feb 01 06:38:45 np0005604215.novalocal systemd[1]: Starting Crash recovery kernel arming...
Feb 01 06:38:46 np0005604215.novalocal sshd[1134]: Server listening on :: port 22.
Feb 01 06:38:45 np0005604215.novalocal systemd[1]: Starting Notify NFS peers of a restart...
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Starting OpenSSH server daemon...
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Starting Permit User Sessions...
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Started Notify NFS peers of a restart.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Finished Permit User Sessions.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Started Command Scheduler.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Started Getty on tty1.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Started Serial Getty on ttyS0.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Reached target Login Prompts.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Started OpenSSH server daemon.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Reached target Multi-User System.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Starting Record Runlevel Change in UTMP...
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Finished Record Runlevel Change in UTMP.
Feb 01 06:38:46 np0005604215.novalocal kdumpctl[1137]: kdump: No kdump initial ramdisk found.
Feb 01 06:38:46 np0005604215.novalocal kdumpctl[1137]: kdump: Rebuilding /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img
Feb 01 06:38:46 np0005604215.novalocal cloud-init[1282]: Cloud-init v. 22.1-9.el9 running 'modules:config' at Sun, 01 Feb 2026 06:38:46 +0000. Up 10.35 seconds.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Finished Apply the settings specified in cloud-config.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Starting Execute cloud user/final scripts...
Feb 01 06:38:46 np0005604215.novalocal dracut[1419]: dracut-057-21.git20230214.el9
Feb 01 06:38:46 np0005604215.novalocal cloud-init[1437]: Cloud-init v. 22.1-9.el9 running 'modules:final' at Sun, 01 Feb 2026 06:38:46 +0000. Up 10.71 seconds.
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: Executing: /usr/bin/dracut --add kdumpbase --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  -o "plymouth resume ifcfg earlykdump" --mount "/dev/disk/by-uuid/a3dd82de-ffc6-4652-88b9-80e003b8f20a /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device -f /boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img 5.14.0-284.11.1.el9_2.x86_64
Feb 01 06:38:46 np0005604215.novalocal cloud-init[1457]: #############################################################
Feb 01 06:38:46 np0005604215.novalocal sshd[1453]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:38:46 np0005604215.novalocal cloud-init[1461]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Feb 01 06:38:46 np0005604215.novalocal cloud-init[1469]: 256 SHA256:4cdo6wbuJ+6z8De1ig5y8uOBmXOaH2o/A5y25AffQFA root@np0005604215.novalocal (ECDSA)
Feb 01 06:38:46 np0005604215.novalocal sshd[1453]: Connection closed by 38.102.83.114 port 57266 [preauth]
Feb 01 06:38:46 np0005604215.novalocal cloud-init[1475]: 256 SHA256:7V//jOvOIaArPbv+zf3jwaAVfF9gL7Joj+dLzfYzdak root@np0005604215.novalocal (ED25519)
Feb 01 06:38:46 np0005604215.novalocal cloud-init[1486]: 3072 SHA256:MAOwyvUX8xeemfJ1b1V1PqF7eCjCaNGXGuB7Hgwq9v4 root@np0005604215.novalocal (RSA)
Feb 01 06:38:46 np0005604215.novalocal cloud-init[1488]: -----END SSH HOST KEY FINGERPRINTS-----
Feb 01 06:38:46 np0005604215.novalocal sshd[1477]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:38:46 np0005604215.novalocal cloud-init[1494]: #############################################################
Feb 01 06:38:46 np0005604215.novalocal sshd[1504]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:38:46 np0005604215.novalocal sshd[1529]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:38:46 np0005604215.novalocal sshd[1477]: Unable to negotiate with 38.102.83.114 port 57282: no matching host key type found. Their offer: ssh-ed25519,ssh-ed25519-cert-v01@openssh.com [preauth]
Feb 01 06:38:46 np0005604215.novalocal sshd[1529]: Unable to negotiate with 38.102.83.114 port 57302: no matching host key type found. Their offer: ecdsa-sha2-nistp384,ecdsa-sha2-nistp384-cert-v01@openssh.com [preauth]
Feb 01 06:38:46 np0005604215.novalocal cloud-init[1437]: Cloud-init v. 22.1-9.el9 finished at Sun, 01 Feb 2026 06:38:46 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.96 seconds
Feb 01 06:38:46 np0005604215.novalocal sshd[1549]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:38:46 np0005604215.novalocal sshd[1549]: Unable to negotiate with 38.102.83.114 port 57318: no matching host key type found. Their offer: ecdsa-sha2-nistp521,ecdsa-sha2-nistp521-cert-v01@openssh.com [preauth]
Feb 01 06:38:46 np0005604215.novalocal sshd[1561]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 01 06:38:46 np0005604215.novalocal sshd[1579]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:38:46 np0005604215.novalocal sshd[1504]: Connection closed by 38.102.83.114 port 57290 [preauth]
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Reloading Network Manager...
Feb 01 06:38:46 np0005604215.novalocal sshd[1601]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 01 06:38:46 np0005604215.novalocal sshd[1601]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 06:38:46 np0005604215.novalocal NetworkManager[790]: <info>  [1769927926.9125] audit: op="reload" arg="0" pid=1598 uid=0 result="success"
Feb 01 06:38:46 np0005604215.novalocal NetworkManager[790]: <info>  [1769927926.9131] config: signal: SIGHUP (no changes from disk)
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Reloaded Network Manager.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Finished Execute cloud user/final scripts.
Feb 01 06:38:46 np0005604215.novalocal systemd[1]: Reached target Cloud-init target.
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 01 06:38:46 np0005604215.novalocal sshd[1620]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:38:46 np0005604215.novalocal sshd[1620]: Unable to negotiate with 38.102.83.114 port 57364: no matching host key type found. Their offer: ssh-dss,ssh-dss-cert-v01@openssh.com [preauth]
Feb 01 06:38:46 np0005604215.novalocal sshd[1561]: Connection closed by 38.102.83.114 port 57328 [preauth]
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'ifcfg' will not be installed, because it's in the list to be omitted!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'plymouth' will not be installed, because it's in the list to be omitted!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 01 06:38:46 np0005604215.novalocal sshd[1579]: Connection closed by 38.102.83.114 port 57344 [preauth]
Feb 01 06:38:46 np0005604215.novalocal dracut[1421]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'resume' will not be installed, because it's in the list to be omitted!
Feb 01 06:38:47 np0005604215.novalocal chronyd[767]: Selected source 209.227.173.244 (2.rhel.pool.ntp.org)
Feb 01 06:38:47 np0005604215.novalocal chronyd[767]: System clock TAI offset set to 37 seconds
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'earlykdump' will not be installed, because it's in the list to be omitted!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: memstrack is not available
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: memstrack is not available
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Feb 01 06:38:47 np0005604215.novalocal dracut[1421]: *** Including module: systemd ***
Feb 01 06:38:48 np0005604215.novalocal dracut[1421]: *** Including module: systemd-initrd ***
Feb 01 06:38:48 np0005604215.novalocal dracut[1421]: *** Including module: i18n ***
Feb 01 06:38:48 np0005604215.novalocal dracut[1421]: No KEYMAP configured.
Feb 01 06:38:48 np0005604215.novalocal dracut[1421]: *** Including module: drm ***
Feb 01 06:38:48 np0005604215.novalocal dracut[1421]: *** Including module: prefixdevname ***
Feb 01 06:38:48 np0005604215.novalocal dracut[1421]: *** Including module: kernel-modules ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: kernel-modules-extra ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]:   kernel-modules-extra: configuration source "/run/depmod.d" does not exist
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]:   kernel-modules-extra: configuration source "/lib/depmod.d" does not exist
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]:   kernel-modules-extra: parsing configuration file "/etc/depmod.d/dist.conf"
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]:   kernel-modules-extra: /etc/depmod.d/dist.conf: added "updates extra built-in weak-updates" to the list of search directories
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: qemu ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: fstab-sys ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: rootfs-block ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: terminfo ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: udev-rules ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: Skipping udev rule: 91-permissions.rules
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: Skipping udev rule: 80-drivers-modprobe.rules
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: virtiofs ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: dracut-systemd ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: usrmount ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: base ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: fs-lib ***
Feb 01 06:38:49 np0005604215.novalocal dracut[1421]: *** Including module: kdumpbase ***
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]: *** Including module: microcode_ctl-fw_dir_override ***
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:   microcode_ctl module: mangling fw_dir
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: configuration "intel" is ignored
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: configuration "intel-06-2d-07" is ignored
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: configuration "intel-06-4e-03" is ignored
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: configuration "intel-06-4f-01" is ignored
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: configuration "intel-06-55-04" is ignored
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: configuration "intel-06-5e-03" is ignored
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: configuration "intel-06-8c-01" is ignored
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]:     microcode_ctl: final fw_dir: "/lib/firmware/updates/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware/updates /lib/firmware/5.14.0-284.11.1.el9_2.x86_64 /lib/firmware"
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]: *** Including module: shutdown ***
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]: *** Including module: squash ***
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]: *** Including modules done ***
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]: *** Installing kernel module dependencies ***
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]: *** Installing kernel module dependencies done ***
Feb 01 06:38:50 np0005604215.novalocal dracut[1421]: *** Resolving executable dependencies ***
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: *** Resolving executable dependencies done ***
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: *** Hardlinking files ***
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: Mode:           real
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: Files:          1099
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: Linked:         3 files
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: Compared:       0 xattrs
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: Compared:       373 files
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: Saved:          61.04 KiB
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: Duration:       0.047188 seconds
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: *** Hardlinking files done ***
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: Could not find 'strip'. Not stripping the initramfs.
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: *** Generating early-microcode cpio image ***
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: *** Constructing AuthenticAMD.bin ***
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: *** Store current command line parameters ***
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: Stored kernel commandline:
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: No dracut internal kernel commandline stored in the initramfs
Feb 01 06:38:52 np0005604215.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 01 06:38:52 np0005604215.novalocal dracut[1421]: *** Install squash loader ***
Feb 01 06:38:53 np0005604215.novalocal dracut[1421]: *** Squashing the files inside the initramfs ***
Feb 01 06:38:54 np0005604215.novalocal dracut[1421]: *** Squashing the files inside the initramfs done ***
Feb 01 06:38:54 np0005604215.novalocal dracut[1421]: *** Creating image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' ***
Feb 01 06:38:54 np0005604215.novalocal dracut[1421]: *** Creating initramfs image file '/boot/initramfs-5.14.0-284.11.1.el9_2.x86_64kdump.img' done ***
Feb 01 06:38:54 np0005604215.novalocal kdumpctl[1137]: kdump: kexec: loaded kdump kernel
Feb 01 06:38:54 np0005604215.novalocal kdumpctl[1137]: kdump: Starting kdump: [OK]
Feb 01 06:38:54 np0005604215.novalocal systemd[1]: Finished Crash recovery kernel arming.
Feb 01 06:38:54 np0005604215.novalocal systemd[1]: Startup finished in 1.156s (kernel) + 1.808s (initrd) + 15.965s (userspace) = 18.930s.
Feb 01 06:39:07 np0005604215.novalocal sshd[4175]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:39:07 np0005604215.novalocal sshd[4175]: Accepted publickey for zuul from 38.102.83.114 port 40490 ssh2: RSA SHA256:zhs3MiW0JhxzckYcMHQES8SMYHj1iGcomnyzmbiwor8
Feb 01 06:39:07 np0005604215.novalocal systemd[1]: Created slice User Slice of UID 1000.
Feb 01 06:39:07 np0005604215.novalocal systemd[1]: Starting User Runtime Directory /run/user/1000...
Feb 01 06:39:07 np0005604215.novalocal systemd-logind[761]: New session 1 of user zuul.
Feb 01 06:39:07 np0005604215.novalocal systemd[1]: Finished User Runtime Directory /run/user/1000.
Feb 01 06:39:07 np0005604215.novalocal systemd[1]: Starting User Manager for UID 1000...
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: pam_unix(systemd-user:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Queued start job for default target Main User Target.
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Created slice User Application Slice.
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Started Daily Cleanup of User's Temporary Directories.
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Reached target Paths.
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Reached target Timers.
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Starting D-Bus User Message Bus Socket...
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Starting Create User's Volatile Files and Directories...
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Listening on D-Bus User Message Bus Socket.
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Finished Create User's Volatile Files and Directories.
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Reached target Sockets.
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Reached target Basic System.
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Reached target Main User Target.
Feb 01 06:39:07 np0005604215.novalocal systemd[4179]: Startup finished in 109ms.
Feb 01 06:39:07 np0005604215.novalocal systemd[1]: Started User Manager for UID 1000.
Feb 01 06:39:07 np0005604215.novalocal systemd[1]: Started Session 1 of User zuul.
Feb 01 06:39:07 np0005604215.novalocal sshd[4175]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 06:39:07 np0005604215.novalocal python3[4231]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 06:39:12 np0005604215.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 01 06:39:21 np0005604215.novalocal python3[4252]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 06:39:27 np0005604215.novalocal python3[4305]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 06:39:28 np0005604215.novalocal python3[4335]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Feb 01 06:39:31 np0005604215.novalocal python3[4351]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:39:32 np0005604215.novalocal python3[4365]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:39:33 np0005604215.novalocal python3[4424]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:39:34 np0005604215.novalocal python3[4465]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769927973.4524443-394-277321625407689/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=fade19abcb7148119bae13ccbb795d6e_id_rsa follow=False checksum=1450e921e2d17379ea725f99be2eea1fb6e75a52 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:39:35 np0005604215.novalocal python3[4538]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:39:35 np0005604215.novalocal python3[4579]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769927975.1587512-494-52220837057847/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=fade19abcb7148119bae13ccbb795d6e_id_rsa.pub follow=False checksum=ad19e951a009809a91d74da158b058ce7df88458 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:39:37 np0005604215.novalocal python3[4607]: ansible-ping Invoked with data=pong
Feb 01 06:39:39 np0005604215.novalocal python3[4621]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 06:39:43 np0005604215.novalocal python3[4674]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Feb 01 06:39:45 np0005604215.novalocal python3[4696]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:39:46 np0005604215.novalocal python3[4710]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:39:46 np0005604215.novalocal python3[4724]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:39:47 np0005604215.novalocal python3[4738]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:39:47 np0005604215.novalocal python3[4752]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:39:48 np0005604215.novalocal python3[4766]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:39:50 np0005604215.novalocal sudo[4780]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bamonspebefsmvgozizfaugdzyzbcbej ; /usr/bin/python3
Feb 01 06:39:50 np0005604215.novalocal sudo[4780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:39:50 np0005604215.novalocal python3[4782]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:39:50 np0005604215.novalocal sudo[4780]: pam_unix(sudo:session): session closed for user root
Feb 01 06:39:51 np0005604215.novalocal sudo[4828]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itggxbjhjzhlcsdtwquqxfbtfgyxutsk ; /usr/bin/python3
Feb 01 06:39:51 np0005604215.novalocal sudo[4828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:39:51 np0005604215.novalocal python3[4830]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:39:51 np0005604215.novalocal sudo[4828]: pam_unix(sudo:session): session closed for user root
Feb 01 06:39:52 np0005604215.novalocal sudo[4871]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kktxwxoxbzugdjrgajfoiredrkppzsza ; /usr/bin/python3
Feb 01 06:39:52 np0005604215.novalocal sudo[4871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:39:52 np0005604215.novalocal python3[4873]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1769927991.6463-104-6990630972916/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:39:52 np0005604215.novalocal sudo[4871]: pam_unix(sudo:session): session closed for user root
Feb 01 06:39:53 np0005604215.novalocal chronyd[767]: Selected source 138.197.164.54 (2.rhel.pool.ntp.org)
Feb 01 06:39:59 np0005604215.novalocal python3[4901]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:00 np0005604215.novalocal python3[4915]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:00 np0005604215.novalocal python3[4929]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:00 np0005604215.novalocal python3[4943]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:00 np0005604215.novalocal python3[4957]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:01 np0005604215.novalocal python3[4971]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:01 np0005604215.novalocal python3[4986]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:01 np0005604215.novalocal python3[5000]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:01 np0005604215.novalocal python3[5014]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:02 np0005604215.novalocal python3[5028]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:02 np0005604215.novalocal python3[5042]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:02 np0005604215.novalocal python3[5056]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:03 np0005604215.novalocal python3[5070]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:03 np0005604215.novalocal python3[5084]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:03 np0005604215.novalocal python3[5098]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:03 np0005604215.novalocal python3[5112]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:03 np0005604215.novalocal python3[5126]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:04 np0005604215.novalocal python3[5140]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:04 np0005604215.novalocal python3[5154]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:04 np0005604215.novalocal python3[5168]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:05 np0005604215.novalocal python3[5182]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:05 np0005604215.novalocal python3[5196]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:05 np0005604215.novalocal python3[5210]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:05 np0005604215.novalocal python3[5224]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:06 np0005604215.novalocal python3[5238]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:06 np0005604215.novalocal python3[5252]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:40:07 np0005604215.novalocal sudo[5266]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rzqocprsvbenectinoksztsndjxnsnwn ; /usr/bin/python3
Feb 01 06:40:07 np0005604215.novalocal sudo[5266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:40:07 np0005604215.novalocal python3[5268]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 01 06:40:07 np0005604215.novalocal systemd[1]: Starting Time & Date Service...
Feb 01 06:40:07 np0005604215.novalocal systemd[1]: Started Time & Date Service.
Feb 01 06:40:07 np0005604215.novalocal systemd-timedated[5270]: Changed time zone to 'UTC' (UTC).
Feb 01 06:40:07 np0005604215.novalocal sudo[5266]: pam_unix(sudo:session): session closed for user root
Feb 01 06:40:08 np0005604215.novalocal sudo[5287]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qezxhwjqknxxxepaspqknrslcnhqhfvl ; /usr/bin/python3
Feb 01 06:40:08 np0005604215.novalocal sudo[5287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:40:09 np0005604215.novalocal python3[5289]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:40:09 np0005604215.novalocal sudo[5287]: pam_unix(sudo:session): session closed for user root
Feb 01 06:40:10 np0005604215.novalocal python3[5335]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:40:10 np0005604215.novalocal python3[5376]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1769928009.977927-500-130948598554410/source _original_basename=tmp17zwbxb9 follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:40:11 np0005604215.novalocal python3[5436]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:40:11 np0005604215.novalocal python3[5477]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769928011.4460287-592-3203137487946/source _original_basename=tmpsaz2xzgk follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:40:13 np0005604215.novalocal sudo[5537]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suzpzokkryjbbqqcgvbhlutktzgklfuy ; /usr/bin/python3
Feb 01 06:40:13 np0005604215.novalocal sudo[5537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:40:13 np0005604215.novalocal python3[5539]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:40:13 np0005604215.novalocal sudo[5537]: pam_unix(sudo:session): session closed for user root
Feb 01 06:40:13 np0005604215.novalocal sudo[5580]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eujggwkauaaclwjgfscqpkmzontqinvb ; /usr/bin/python3
Feb 01 06:40:13 np0005604215.novalocal sudo[5580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:40:14 np0005604215.novalocal python3[5582]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1769928013.4437199-732-129222601682913/source _original_basename=tmpkb6xt0pw follow=False checksum=9313104c4584898a1afe992edc322b557e0f1f28 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:40:14 np0005604215.novalocal sudo[5580]: pam_unix(sudo:session): session closed for user root
Feb 01 06:40:15 np0005604215.novalocal python3[5610]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:40:15 np0005604215.novalocal python3[5626]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:40:16 np0005604215.novalocal sudo[5674]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfcomsmljtwnfnnolqhnccxjqfxaiftg ; /usr/bin/python3
Feb 01 06:40:16 np0005604215.novalocal sudo[5674]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:40:16 np0005604215.novalocal python3[5676]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:40:16 np0005604215.novalocal sudo[5674]: pam_unix(sudo:session): session closed for user root
Feb 01 06:40:16 np0005604215.novalocal sudo[5717]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hsldnkvzcnwuaphnmsxryujdfbrpddvj ; /usr/bin/python3
Feb 01 06:40:16 np0005604215.novalocal sudo[5717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:40:17 np0005604215.novalocal python3[5719]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1769928016.4272835-860-95594803417146/source _original_basename=tmpwilrtpym follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:40:17 np0005604215.novalocal sudo[5717]: pam_unix(sudo:session): session closed for user root
Feb 01 06:40:18 np0005604215.novalocal sudo[5748]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzaiiisyiyvbndnyttbtgyxeackuguye ; /usr/bin/python3
Feb 01 06:40:18 np0005604215.novalocal sudo[5748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:40:18 np0005604215.novalocal python3[5750]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ef9-e89a-3a26-ae0f-000000000023-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:40:18 np0005604215.novalocal sudo[5748]: pam_unix(sudo:session): session closed for user root
Feb 01 06:40:19 np0005604215.novalocal python3[5768]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-3a26-ae0f-000000000024-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Feb 01 06:40:21 np0005604215.novalocal python3[5786]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:40:37 np0005604215.novalocal systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 01 06:40:39 np0005604215.novalocal sudo[5802]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhobzudnngcnvklwolylochwmdfopgui ; /usr/bin/python3
Feb 01 06:40:39 np0005604215.novalocal sudo[5802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:40:39 np0005604215.novalocal python3[5804]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:40:39 np0005604215.novalocal sudo[5802]: pam_unix(sudo:session): session closed for user root
Feb 01 06:41:39 np0005604215.novalocal sshd[4188]: Received disconnect from 38.102.83.114 port 40490:11: disconnected by user
Feb 01 06:41:39 np0005604215.novalocal sshd[4188]: Disconnected from user zuul 38.102.83.114 port 40490
Feb 01 06:41:39 np0005604215.novalocal sshd[4175]: pam_unix(sshd:session): session closed for user zuul
Feb 01 06:41:39 np0005604215.novalocal systemd-logind[761]: Session 1 logged out. Waiting for processes to exit.
Feb 01 06:41:40 np0005604215.novalocal systemd[4179]: Starting Mark boot as successful...
Feb 01 06:41:40 np0005604215.novalocal systemd[4179]: Finished Mark boot as successful.
Feb 01 06:42:42 np0005604215.novalocal systemd[1]: Unmounting EFI System Partition Automount...
Feb 01 06:42:42 np0005604215.novalocal systemd[1]: efi.mount: Deactivated successfully.
Feb 01 06:42:42 np0005604215.novalocal systemd[1]: Unmounted EFI System Partition Automount.
Feb 01 06:44:40 np0005604215.novalocal systemd[4179]: Created slice User Background Tasks Slice.
Feb 01 06:44:40 np0005604215.novalocal systemd[4179]: Starting Cleanup of User's Temporary Files and Directories...
Feb 01 06:44:40 np0005604215.novalocal systemd[4179]: Finished Cleanup of User's Temporary Files and Directories.
Feb 01 06:44:44 np0005604215.novalocal kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000
Feb 01 06:44:44 np0005604215.novalocal kernel: pci 0000:00:07.0: reg 0x10: [io  0x0000-0x003f]
Feb 01 06:44:44 np0005604215.novalocal kernel: pci 0000:00:07.0: reg 0x14: [mem 0x00000000-0x00000fff]
Feb 01 06:44:44 np0005604215.novalocal kernel: pci 0000:00:07.0: reg 0x20: [mem 0x00000000-0x00003fff 64bit pref]
Feb 01 06:44:44 np0005604215.novalocal kernel: pci 0000:00:07.0: reg 0x30: [mem 0x00000000-0x0007ffff pref]
Feb 01 06:44:44 np0005604215.novalocal kernel: pci 0000:00:07.0: BAR 6: assigned [mem 0xc0000000-0xc007ffff pref]
Feb 01 06:44:44 np0005604215.novalocal kernel: pci 0000:00:07.0: BAR 4: assigned [mem 0x440000000-0x440003fff 64bit pref]
Feb 01 06:44:44 np0005604215.novalocal kernel: pci 0000:00:07.0: BAR 1: assigned [mem 0xc0080000-0xc0080fff]
Feb 01 06:44:44 np0005604215.novalocal kernel: pci 0000:00:07.0: BAR 0: assigned [io  0x1000-0x103f]
Feb 01 06:44:44 np0005604215.novalocal kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Feb 01 06:44:44 np0005604215.novalocal NetworkManager[790]: <info>  [1769928284.3073] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 01 06:44:44 np0005604215.novalocal systemd-udevd[5814]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 06:44:44 np0005604215.novalocal NetworkManager[790]: <info>  [1769928284.3224] device (eth1): state change: unmanaged -> unavailable (reason 'managed', sys-iface-state: 'external')
Feb 01 06:44:44 np0005604215.novalocal NetworkManager[790]: <info>  [1769928284.3262] settings: (eth1): created default wired connection 'Wired connection 1'
Feb 01 06:44:44 np0005604215.novalocal NetworkManager[790]: <info>  [1769928284.3267] device (eth1): carrier: link connected
Feb 01 06:44:44 np0005604215.novalocal NetworkManager[790]: <info>  [1769928284.3271] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', sys-iface-state: 'managed')
Feb 01 06:44:44 np0005604215.novalocal NetworkManager[790]: <info>  [1769928284.3277] policy: auto-activating connection 'Wired connection 1' (ba1ceec1-c224-34b6-a0a4-ea1192c7597e)
Feb 01 06:44:44 np0005604215.novalocal NetworkManager[790]: <info>  [1769928284.3284] device (eth1): Activation: starting connection 'Wired connection 1' (ba1ceec1-c224-34b6-a0a4-ea1192c7597e)
Feb 01 06:44:44 np0005604215.novalocal NetworkManager[790]: <info>  [1769928284.3286] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'managed')
Feb 01 06:44:44 np0005604215.novalocal NetworkManager[790]: <info>  [1769928284.3291] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'managed')
Feb 01 06:44:44 np0005604215.novalocal NetworkManager[790]: <info>  [1769928284.3298] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'managed')
Feb 01 06:44:44 np0005604215.novalocal NetworkManager[790]: <info>  [1769928284.3303] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 01 06:44:45 np0005604215.novalocal sshd[5817]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:44:45 np0005604215.novalocal kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth1: link becomes ready
Feb 01 06:44:45 np0005604215.novalocal sshd[5817]: Accepted publickey for zuul from 38.102.83.114 port 54914 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 06:44:45 np0005604215.novalocal systemd-logind[761]: New session 3 of user zuul.
Feb 01 06:44:45 np0005604215.novalocal systemd[1]: Started Session 3 of User zuul.
Feb 01 06:44:45 np0005604215.novalocal sshd[5817]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 06:44:45 np0005604215.novalocal python3[5834]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ef9-e89a-9afb-5883-000000000475-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:44:58 np0005604215.novalocal sudo[5882]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdzuehlssootexjoyswdkbklbpfgizgw ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 01 06:44:58 np0005604215.novalocal sudo[5882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:44:58 np0005604215.novalocal python3[5884]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:44:58 np0005604215.novalocal sudo[5882]: pam_unix(sudo:session): session closed for user root
Feb 01 06:44:58 np0005604215.novalocal sudo[5925]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyloyqoujqpnpegrwwfqfzzpfygmquxx ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 01 06:44:58 np0005604215.novalocal sudo[5925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:44:59 np0005604215.novalocal python3[5927]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769928298.4675562-537-185031792863134/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=c5ec26e43b1f8e7018b3bd3d9cbfeb38dd096269 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:44:59 np0005604215.novalocal sudo[5925]: pam_unix(sudo:session): session closed for user root
Feb 01 06:44:59 np0005604215.novalocal sudo[5955]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbysnlwmalczbwfxlanmzeearcjsftry ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 01 06:44:59 np0005604215.novalocal sudo[5955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:44:59 np0005604215.novalocal python3[5957]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: Stopped Network Manager Wait Online.
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: Stopping Network Manager Wait Online...
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: Stopping Network Manager...
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[790]: <info>  [1769928300.6829] caught SIGTERM, shutting down normally.
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[790]: <info>  [1769928300.6923] dhcp4 (eth0): canceled DHCP transaction
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[790]: <info>  [1769928300.6924] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[790]: <info>  [1769928300.6924] dhcp4 (eth0): state changed no lease
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[790]: <info>  [1769928300.6930] manager: NetworkManager state is now CONNECTING
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[790]: <info>  [1769928300.7113] dhcp4 (eth1): canceled DHCP transaction
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[790]: <info>  [1769928300.7115] dhcp4 (eth1): state changed no lease
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[790]: <info>  [1769928300.7210] exiting (success)
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: NetworkManager.service: Deactivated successfully.
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: Stopped Network Manager.
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: NetworkManager.service: Consumed 2.843s CPU time.
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: Starting Network Manager...
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.7756] NetworkManager (version 1.42.2-1.el9) is starting... (after a restart, boot:f77db588-715c-4e22-a8c7-41daa1528c92)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.7759] Read config: /etc/NetworkManager/NetworkManager.conf (run: 15-carrier-timeout.conf)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.7786] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: Started Network Manager.
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: Starting Network Manager Wait Online...
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.7852] manager[0x556748592090]: monitoring kernel firmware directory '/lib/firmware'.
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: Starting Hostname Service...
Feb 01 06:45:00 np0005604215.novalocal sudo[5955]: pam_unix(sudo:session): session closed for user root
Feb 01 06:45:00 np0005604215.novalocal systemd[1]: Started Hostname Service.
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8736] hostname: hostname: using hostnamed
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8738] hostname: static hostname changed from (none) to "np0005604215.novalocal"
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8747] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8755] manager[0x556748592090]: rfkill: Wi-Fi hardware radio set enabled
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8756] manager[0x556748592090]: rfkill: WWAN hardware radio set enabled
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8797] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-device-plugin-team.so)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8799] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8801] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8803] manager: Networking is enabled by state file
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8812] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.42.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8814] settings: Loaded settings plugin: keyfile (internal)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8864] dhcp: init: Using DHCP client 'internal'
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8869] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8879] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8889] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'external')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8906] device (lo): Activation: starting connection 'lo' (d27cc6ff-3b23-4411-8524-3e0f36165c06)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8917] device (eth0): carrier: link connected
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8925] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8934] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8936] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8947] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8959] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8970] device (eth1): carrier: link connected
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8977] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8988] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (ba1ceec1-c224-34b6-a0a4-ea1192c7597e) (indicated)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8990] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.8999] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9011] device (eth1): Activation: starting connection 'Wired connection 1' (ba1ceec1-c224-34b6-a0a4-ea1192c7597e)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9041] device (lo): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'external')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9057] device (lo): state change: prepare -> config (reason 'none', sys-iface-state: 'external')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9060] device (lo): state change: config -> ip-config (reason 'none', sys-iface-state: 'external')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9064] device (eth0): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9069] device (eth0): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9072] device (eth1): state change: disconnected -> prepare (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9076] device (eth1): state change: prepare -> config (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9080] device (lo): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'external')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9097] device (eth0): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9103] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9120] device (eth1): state change: config -> ip-config (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9125] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9175] device (lo): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'external')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9183] device (lo): state change: secondaries -> activated (reason 'none', sys-iface-state: 'external')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9190] device (lo): Activation: successful, device activated.
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9201] dhcp4 (eth0): state changed new lease, address=38.102.83.164
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9207] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9290] device (eth0): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9322] device (eth0): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9325] device (eth0): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9334] manager: NetworkManager state is now CONNECTED_SITE
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9338] device (eth0): Activation: successful, device activated.
Feb 01 06:45:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928300.9350] manager: NetworkManager state is now CONNECTED_GLOBAL
Feb 01 06:45:01 np0005604215.novalocal python3[6030]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ef9-e89a-9afb-5883-000000000136-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:45:11 np0005604215.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 01 06:45:30 np0005604215.novalocal systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 01 06:45:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928345.8218] device (eth1): state change: ip-config -> ip-check (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:45 np0005604215.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 01 06:45:45 np0005604215.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 01 06:45:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928345.8416] device (eth1): state change: ip-check -> secondaries (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928345.8420] device (eth1): state change: secondaries -> activated (reason 'none', sys-iface-state: 'assume')
Feb 01 06:45:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928345.8442] device (eth1): Activation: successful, device activated.
Feb 01 06:45:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769928345.8453] manager: startup complete
Feb 01 06:45:45 np0005604215.novalocal systemd[1]: Finished Network Manager Wait Online.
Feb 01 06:45:55 np0005604215.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 01 06:46:01 np0005604215.novalocal sshd[5820]: Received disconnect from 38.102.83.114 port 54914:11: disconnected by user
Feb 01 06:46:01 np0005604215.novalocal sshd[5820]: Disconnected from user zuul 38.102.83.114 port 54914
Feb 01 06:46:01 np0005604215.novalocal sshd[5817]: pam_unix(sshd:session): session closed for user zuul
Feb 01 06:46:01 np0005604215.novalocal systemd[1]: session-3.scope: Deactivated successfully.
Feb 01 06:46:01 np0005604215.novalocal systemd[1]: session-3.scope: Consumed 1.445s CPU time.
Feb 01 06:46:01 np0005604215.novalocal systemd-logind[761]: Session 3 logged out. Waiting for processes to exit.
Feb 01 06:46:01 np0005604215.novalocal systemd-logind[761]: Removed session 3.
Feb 01 06:46:25 np0005604215.novalocal sshd[6060]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:46:25 np0005604215.novalocal sshd[6060]: Accepted publickey for zuul from 38.102.83.114 port 41346 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 06:46:25 np0005604215.novalocal systemd-logind[761]: New session 4 of user zuul.
Feb 01 06:46:25 np0005604215.novalocal systemd[1]: Started Session 4 of User zuul.
Feb 01 06:46:25 np0005604215.novalocal sshd[6060]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 06:46:25 np0005604215.novalocal sudo[6109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmsjiphhnvexmnzmlnuiokstnotdzlyh ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 01 06:46:25 np0005604215.novalocal sudo[6109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:46:26 np0005604215.novalocal python3[6111]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:46:26 np0005604215.novalocal sudo[6109]: pam_unix(sudo:session): session closed for user root
Feb 01 06:46:26 np0005604215.novalocal sudo[6152]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akmjdsjxevspbyqsbdsmmbznfspnpwag ; OS_CLOUD=vexxhost /usr/bin/python3
Feb 01 06:46:26 np0005604215.novalocal sudo[6152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:46:26 np0005604215.novalocal python3[6154]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769928385.8376184-628-1439053193405/source _original_basename=tmplgaibzgz follow=False checksum=b662c6ad0fdede3f6b8f2737681b36760d23a74b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:46:26 np0005604215.novalocal sudo[6152]: pam_unix(sudo:session): session closed for user root
Feb 01 06:46:29 np0005604215.novalocal sshd[6060]: pam_unix(sshd:session): session closed for user zuul
Feb 01 06:46:29 np0005604215.novalocal systemd[1]: session-4.scope: Deactivated successfully.
Feb 01 06:46:29 np0005604215.novalocal systemd-logind[761]: Session 4 logged out. Waiting for processes to exit.
Feb 01 06:46:29 np0005604215.novalocal systemd-logind[761]: Removed session 4.
Feb 01 06:48:32 np0005604215.novalocal chronyd[767]: Selected source 209.227.173.244 (2.rhel.pool.ntp.org)
Feb 01 06:53:40 np0005604215.novalocal systemd[1]: Starting Cleanup of Temporary Directories...
Feb 01 06:53:40 np0005604215.novalocal systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Feb 01 06:53:40 np0005604215.novalocal systemd[1]: Finished Cleanup of Temporary Directories.
Feb 01 06:53:40 np0005604215.novalocal systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Feb 01 06:54:06 np0005604215.novalocal sshd[6176]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:54:06 np0005604215.novalocal sshd[6176]: Accepted publickey for zuul from 38.102.83.114 port 57624 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 06:54:06 np0005604215.novalocal systemd-logind[761]: New session 5 of user zuul.
Feb 01 06:54:06 np0005604215.novalocal systemd[1]: Started Session 5 of User zuul.
Feb 01 06:54:06 np0005604215.novalocal sshd[6176]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 06:54:06 np0005604215.novalocal sudo[6193]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kbvvksouykhmisepyqccsiznasufrkep ; /usr/bin/python3
Feb 01 06:54:06 np0005604215.novalocal sudo[6193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:06 np0005604215.novalocal python3[6195]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-ef6d-83b0-0000000021a5-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:54:06 np0005604215.novalocal sudo[6193]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:08 np0005604215.novalocal sudo[6212]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phqedqhgekuohdgscxifnpqppgwznpig ; /usr/bin/python3
Feb 01 06:54:08 np0005604215.novalocal sudo[6212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:08 np0005604215.novalocal python3[6214]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:54:08 np0005604215.novalocal sudo[6212]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:08 np0005604215.novalocal sudo[6228]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsgvsuerwbgdfwmmiepvvzxelodhdrek ; /usr/bin/python3
Feb 01 06:54:08 np0005604215.novalocal sudo[6228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:08 np0005604215.novalocal python3[6230]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:54:08 np0005604215.novalocal sudo[6228]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:08 np0005604215.novalocal sudo[6244]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtuehyaczdhsistlhmdmzqsgsawqlkqz ; /usr/bin/python3
Feb 01 06:54:08 np0005604215.novalocal sudo[6244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:08 np0005604215.novalocal python3[6246]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:54:08 np0005604215.novalocal sudo[6244]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:08 np0005604215.novalocal sudo[6260]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzrdhvyvhnvjdhdbzrffzpshsrlexytd ; /usr/bin/python3
Feb 01 06:54:08 np0005604215.novalocal sudo[6260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:08 np0005604215.novalocal python3[6262]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:54:09 np0005604215.novalocal sudo[6260]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:09 np0005604215.novalocal sudo[6276]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slavqzajnkmydksltyxorrvmbcikwzxs ; /usr/bin/python3
Feb 01 06:54:09 np0005604215.novalocal sudo[6276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:09 np0005604215.novalocal python3[6278]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:54:09 np0005604215.novalocal sudo[6276]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:10 np0005604215.novalocal sudo[6324]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-boypwizlabbwitekewdaqieiyggehmue ; /usr/bin/python3
Feb 01 06:54:10 np0005604215.novalocal sudo[6324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:10 np0005604215.novalocal python3[6326]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:54:10 np0005604215.novalocal sudo[6324]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:11 np0005604215.novalocal sudo[6367]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lezhifhuxsmklqykeasvganclyomxiox ; /usr/bin/python3
Feb 01 06:54:11 np0005604215.novalocal sudo[6367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:11 np0005604215.novalocal python3[6369]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769928850.6871905-672-272484530281361/source _original_basename=tmpo44v3sz0 follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:54:11 np0005604215.novalocal sudo[6367]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:12 np0005604215.novalocal sudo[6397]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nowhlvhobzooaeotejnwozkvcyirnkui ; /usr/bin/python3
Feb 01 06:54:12 np0005604215.novalocal sudo[6397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:12 np0005604215.novalocal python3[6399]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 06:54:12 np0005604215.novalocal systemd[1]: Reloading.
Feb 01 06:54:12 np0005604215.novalocal systemd-rc-local-generator[6417]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 06:54:12 np0005604215.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 06:54:13 np0005604215.novalocal sudo[6397]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:14 np0005604215.novalocal sudo[6443]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btkeomviwndixwymeyehjosqdstzastb ; /usr/bin/python3
Feb 01 06:54:14 np0005604215.novalocal sudo[6443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:14 np0005604215.novalocal python3[6445]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Feb 01 06:54:14 np0005604215.novalocal sudo[6443]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:15 np0005604215.novalocal sudo[6459]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxiuervcuetjcyvcvzjcnuqkvvefvyfh ; /usr/bin/python3
Feb 01 06:54:15 np0005604215.novalocal sudo[6459]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:15 np0005604215.novalocal python3[6461]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:54:15 np0005604215.novalocal sudo[6459]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:15 np0005604215.novalocal sudo[6477]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbnndxegepmznmuxbirxlciuxensnkcl ; /usr/bin/python3
Feb 01 06:54:15 np0005604215.novalocal sudo[6477]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:15 np0005604215.novalocal python3[6479]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:54:15 np0005604215.novalocal sudo[6477]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:16 np0005604215.novalocal sudo[6495]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqflaiegixwbohfftrvhtrdkushzzhqd ; /usr/bin/python3
Feb 01 06:54:16 np0005604215.novalocal sudo[6495]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:16 np0005604215.novalocal python3[6497]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:54:16 np0005604215.novalocal sudo[6495]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:16 np0005604215.novalocal sudo[6513]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-asixgfhitueynrunrrvvmebdcjrpjfri ; /usr/bin/python3
Feb 01 06:54:16 np0005604215.novalocal sudo[6513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:54:16 np0005604215.novalocal python3[6515]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max
                                                       _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:54:16 np0005604215.novalocal sudo[6513]: pam_unix(sudo:session): session closed for user root
Feb 01 06:54:17 np0005604215.novalocal python3[6532]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;
                                                       _uses_shell=True zuul_log_id=fa163ef9-e89a-ef6d-83b0-0000000021ac-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:54:18 np0005604215.novalocal python3[6552]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 06:54:21 np0005604215.novalocal sshd[6176]: pam_unix(sshd:session): session closed for user zuul
Feb 01 06:54:21 np0005604215.novalocal systemd[1]: session-5.scope: Deactivated successfully.
Feb 01 06:54:21 np0005604215.novalocal systemd[1]: session-5.scope: Consumed 3.853s CPU time.
Feb 01 06:54:21 np0005604215.novalocal systemd-logind[761]: Session 5 logged out. Waiting for processes to exit.
Feb 01 06:54:21 np0005604215.novalocal systemd-logind[761]: Removed session 5.
Feb 01 06:55:23 np0005604215.novalocal sshd[6559]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:55:23 np0005604215.novalocal sshd[6559]: Accepted publickey for zuul from 38.102.83.114 port 48070 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 06:55:23 np0005604215.novalocal systemd[1]: Started Session 6 of User zuul.
Feb 01 06:55:23 np0005604215.novalocal systemd-logind[761]: New session 6 of user zuul.
Feb 01 06:55:23 np0005604215.novalocal sshd[6559]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 06:55:23 np0005604215.novalocal sudo[6576]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewgkltgnkrgkvckgpwyqbjgcrwijfqwj ; /usr/bin/python3
Feb 01 06:55:23 np0005604215.novalocal sudo[6576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:55:24 np0005604215.novalocal systemd[1]: Starting RHSM dbus service...
Feb 01 06:55:24 np0005604215.novalocal systemd[1]: Started RHSM dbus service.
Feb 01 06:55:24 np0005604215.novalocal rhsm-service[6583]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 01 06:55:24 np0005604215.novalocal rhsm-service[6583]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 01 06:55:24 np0005604215.novalocal rhsm-service[6583]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 01 06:55:24 np0005604215.novalocal rhsm-service[6583]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 01 06:55:25 np0005604215.novalocal rhsm-service[6583]:  INFO [subscription_manager.managerlib:90] Consumer created: np0005604215.novalocal (228c691b-7b73-45e5-afcd-2aea3d003268)
Feb 01 06:55:25 np0005604215.novalocal subscription-manager[6583]: Registered system with identity: 228c691b-7b73-45e5-afcd-2aea3d003268
Feb 01 06:55:26 np0005604215.novalocal rhsm-service[6583]:  INFO [subscription_manager.entcertlib:131] certs updated:
Feb 01 06:55:26 np0005604215.novalocal rhsm-service[6583]: Total updates: 1
Feb 01 06:55:26 np0005604215.novalocal rhsm-service[6583]: Found (local) serial# []
Feb 01 06:55:26 np0005604215.novalocal rhsm-service[6583]: Expected (UEP) serial# [9104674843723702672]
Feb 01 06:55:26 np0005604215.novalocal rhsm-service[6583]: Added (new)
Feb 01 06:55:26 np0005604215.novalocal rhsm-service[6583]:   [sn:9104674843723702672 ( Content Access,) @ /etc/pki/entitlement/9104674843723702672.pem]
Feb 01 06:55:26 np0005604215.novalocal rhsm-service[6583]: Deleted (rogue):
Feb 01 06:55:26 np0005604215.novalocal rhsm-service[6583]:   <NONE>
Feb 01 06:55:26 np0005604215.novalocal subscription-manager[6583]: Added subscription for 'Content Access' contract 'None'
Feb 01 06:55:26 np0005604215.novalocal subscription-manager[6583]: Added subscription for product ' Content Access'
Feb 01 06:55:27 np0005604215.novalocal rhsm-service[6583]:  INFO [subscription_manager.i18n:169] Could not import locale for C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 01 06:55:27 np0005604215.novalocal rhsm-service[6583]:  INFO [subscription_manager.i18n:139] Could not import locale either for C_C: [Errno 2] No translation file found for domain: 'rhsm'
Feb 01 06:55:27 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 06:55:27 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 06:55:27 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 06:55:27 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 06:55:28 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 06:55:28 np0005604215.novalocal sudo[6576]: pam_unix(sudo:session): session closed for user root
Feb 01 06:55:31 np0005604215.novalocal python3[6674]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-1e29-84da-00000000000d-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 06:56:24 np0005604215.novalocal sudo[6691]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jowgtvhbosgwmzwbvjuncpuxnodmynpd ; /usr/bin/python3
Feb 01 06:56:24 np0005604215.novalocal sudo[6691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:56:24 np0005604215.novalocal systemd[1]: Starting dnf makecache...
Feb 01 06:56:24 np0005604215.novalocal python3[6694]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 06:56:25 np0005604215.novalocal dnf[6693]: Updating Subscription Management repositories.
Feb 01 06:56:26 np0005604215.novalocal dnf[6693]: Failed determining last makecache time.
Feb 01 06:56:27 np0005604215.novalocal dnf[6693]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   35 MB/s |  14 MB     00:00
Feb 01 06:56:29 np0005604215.novalocal dnf[6693]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  52 MB/s |  24 MB     00:00
Feb 01 06:56:34 np0005604215.novalocal dnf[6693]: Red Hat Enterprise Linux 9 for x86_64 - AppStre  70 MB/s |  42 MB     00:00
Feb 01 06:56:42 np0005604215.novalocal dnf[6693]: Red Hat Enterprise Linux 9 for x86_64 - BaseOS   71 MB/s |  44 MB     00:00
Feb 01 06:56:47 np0005604215.novalocal dnf[6693]: Last metadata expiration check: 0:00:02 ago on Sun Feb  1 06:56:42 2026.
Feb 01 06:56:49 np0005604215.novalocal dnf[6693]: Metadata cache created.
Feb 01 06:56:49 np0005604215.novalocal systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 01 06:56:49 np0005604215.novalocal systemd[1]: Finished dnf makecache.
Feb 01 06:56:49 np0005604215.novalocal systemd[1]: dnf-makecache.service: Consumed 23.169s CPU time.
Feb 01 06:56:53 np0005604215.novalocal setsebool[6769]: The virt_use_nfs policy boolean was changed to 1 by root
Feb 01 06:56:53 np0005604215.novalocal setsebool[6769]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Feb 01 06:57:01 np0005604215.novalocal kernel: SELinux:  Converting 406 SID table entries...
Feb 01 06:57:01 np0005604215.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 06:57:01 np0005604215.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 01 06:57:01 np0005604215.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 06:57:01 np0005604215.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 01 06:57:01 np0005604215.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 06:57:01 np0005604215.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 06:57:01 np0005604215.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 06:57:14 np0005604215.novalocal dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=3 res=1
Feb 01 06:57:14 np0005604215.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 06:57:14 np0005604215.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 01 06:57:14 np0005604215.novalocal systemd[1]: Reloading.
Feb 01 06:57:14 np0005604215.novalocal systemd-rc-local-generator[7668]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 06:57:14 np0005604215.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 06:57:15 np0005604215.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 06:57:16 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 06:57:16 np0005604215.novalocal sudo[6691]: pam_unix(sudo:session): session closed for user root
Feb 01 06:57:18 np0005604215.novalocal sudo[12565]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqoznmfbcbygcgpxvadrsbzjrcyaccqa ; /usr/bin/python3
Feb 01 06:57:18 np0005604215.novalocal sudo[12565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:57:18 np0005604215.novalocal systemd[1]: var-lib-containers-storage-overlay-opaque\x2dbug\x2dcheck3624694533-merged.mount: Deactivated successfully.
Feb 01 06:57:18 np0005604215.novalocal podman[12826]: 2026-02-01 06:57:18.863616734 +0000 UTC m=+0.107065104 system refresh
Feb 01 06:57:19 np0005604215.novalocal sudo[12565]: pam_unix(sudo:session): session closed for user root
Feb 01 06:57:19 np0005604215.novalocal systemd[4179]: Starting D-Bus User Message Bus...
Feb 01 06:57:19 np0005604215.novalocal dbus-broker-launch[14398]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 01 06:57:19 np0005604215.novalocal dbus-broker-launch[14398]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 01 06:57:19 np0005604215.novalocal systemd[4179]: Started D-Bus User Message Bus.
Feb 01 06:57:19 np0005604215.novalocal dbus-broker-lau[14398]: Ready
Feb 01 06:57:19 np0005604215.novalocal systemd[4179]: selinux: avc:  op=load_policy lsm=selinux seqno=3 res=1
Feb 01 06:57:19 np0005604215.novalocal systemd[4179]: Created slice Slice /user.
Feb 01 06:57:19 np0005604215.novalocal systemd[4179]: podman-14248.scope: unit configures an IP firewall, but not running as root.
Feb 01 06:57:19 np0005604215.novalocal systemd[4179]: (This warning is only shown for the first unit using IP firewalling.)
Feb 01 06:57:19 np0005604215.novalocal systemd[4179]: Started podman-14248.scope.
Feb 01 06:57:19 np0005604215.novalocal systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 06:57:20 np0005604215.novalocal systemd[4179]: Started podman-pause-ef96ed7a.scope.
Feb 01 06:57:20 np0005604215.novalocal sshd[6559]: pam_unix(sshd:session): session closed for user zuul
Feb 01 06:57:20 np0005604215.novalocal systemd[1]: session-6.scope: Deactivated successfully.
Feb 01 06:57:20 np0005604215.novalocal systemd[1]: session-6.scope: Consumed 30.800s CPU time.
Feb 01 06:57:20 np0005604215.novalocal systemd-logind[761]: Session 6 logged out. Waiting for processes to exit.
Feb 01 06:57:20 np0005604215.novalocal systemd-logind[761]: Removed session 6.
Feb 01 06:57:22 np0005604215.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 06:57:22 np0005604215.novalocal systemd[1]: Finished man-db-cache-update.service.
Feb 01 06:57:22 np0005604215.novalocal systemd[1]: man-db-cache-update.service: Consumed 9.127s CPU time.
Feb 01 06:57:22 np0005604215.novalocal systemd[1]: run-rd67729f7e47a46dd9723e7c61dc6c308.service: Deactivated successfully.
Feb 01 06:57:35 np0005604215.novalocal sshd[18427]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:57:35 np0005604215.novalocal sshd[18428]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:57:35 np0005604215.novalocal sshd[18430]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:57:35 np0005604215.novalocal sshd[18429]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:57:35 np0005604215.novalocal sshd[18431]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:57:35 np0005604215.novalocal sshd[18427]: Connection closed by 38.102.83.41 port 58564 [preauth]
Feb 01 06:57:35 np0005604215.novalocal sshd[18428]: Connection closed by 38.102.83.41 port 58578 [preauth]
Feb 01 06:57:35 np0005604215.novalocal sshd[18430]: Unable to negotiate with 38.102.83.41 port 58590: no matching host key type found. Their offer: ssh-ed25519 [preauth]
Feb 01 06:57:35 np0005604215.novalocal sshd[18431]: Unable to negotiate with 38.102.83.41 port 58602: no matching host key type found. Their offer: sk-ecdsa-sha2-nistp256@openssh.com [preauth]
Feb 01 06:57:35 np0005604215.novalocal sshd[18429]: Unable to negotiate with 38.102.83.41 port 58608: no matching host key type found. Their offer: sk-ssh-ed25519@openssh.com [preauth]
Feb 01 06:57:39 np0005604215.novalocal sshd[18437]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:57:39 np0005604215.novalocal sshd[18437]: Accepted publickey for zuul from 38.102.83.114 port 46314 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 06:57:39 np0005604215.novalocal systemd-logind[761]: New session 7 of user zuul.
Feb 01 06:57:39 np0005604215.novalocal systemd[1]: Started Session 7 of User zuul.
Feb 01 06:57:39 np0005604215.novalocal sshd[18437]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 06:57:39 np0005604215.novalocal python3[18454]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGBAgohDMlstWoPOrziVyT3cq7c4YoWvTNp64hcksvV2VrQsWD6YrTZBaXHL0twL/A8QbTt5cQ7NNpUOjUCI5d4= zuul@np0005604206.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:57:40 np0005604215.novalocal sudo[18468]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nocdahncnojlmwzkcowplajjzevozena ; /usr/bin/python3
Feb 01 06:57:40 np0005604215.novalocal sudo[18468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:57:40 np0005604215.novalocal python3[18470]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBGBAgohDMlstWoPOrziVyT3cq7c4YoWvTNp64hcksvV2VrQsWD6YrTZBaXHL0twL/A8QbTt5cQ7NNpUOjUCI5d4= zuul@np0005604206.novalocal
                                                        manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:57:40 np0005604215.novalocal sudo[18468]: pam_unix(sudo:session): session closed for user root
Feb 01 06:57:42 np0005604215.novalocal sshd[18437]: pam_unix(sshd:session): session closed for user zuul
Feb 01 06:57:42 np0005604215.novalocal systemd[1]: session-7.scope: Deactivated successfully.
Feb 01 06:57:42 np0005604215.novalocal systemd-logind[761]: Session 7 logged out. Waiting for processes to exit.
Feb 01 06:57:42 np0005604215.novalocal systemd-logind[761]: Removed session 7.
Feb 01 06:59:00 np0005604215.novalocal sshd[18472]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 06:59:00 np0005604215.novalocal sshd[18472]: Accepted publickey for zuul from 38.102.83.114 port 51388 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 06:59:00 np0005604215.novalocal systemd-logind[761]: New session 8 of user zuul.
Feb 01 06:59:00 np0005604215.novalocal systemd[1]: Started Session 8 of User zuul.
Feb 01 06:59:00 np0005604215.novalocal sshd[18472]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 06:59:00 np0005604215.novalocal sudo[18489]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgpkfyrtvmnlcfrdlsjwfdgormeyjwrt ; /usr/bin/python3
Feb 01 06:59:00 np0005604215.novalocal sudo[18489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:59:00 np0005604215.novalocal python3[18491]: ansible-authorized_key Invoked with user=root manage_dir=True key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 06:59:00 np0005604215.novalocal sudo[18489]: pam_unix(sudo:session): session closed for user root
Feb 01 06:59:01 np0005604215.novalocal sudo[18505]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mukberkvxkfmkbvcciibylbapxfgvryd ; /usr/bin/python3
Feb 01 06:59:01 np0005604215.novalocal sudo[18505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:59:02 np0005604215.novalocal python3[18507]: ansible-user Invoked with name=root state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604215.novalocal update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 01 06:59:02 np0005604215.novalocal sudo[18505]: pam_unix(sudo:session): session closed for user root
Feb 01 06:59:04 np0005604215.novalocal sudo[18555]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yloajshziopeuxcnhautsbohsporlhfh ; /usr/bin/python3
Feb 01 06:59:04 np0005604215.novalocal sudo[18555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:59:04 np0005604215.novalocal python3[18557]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:59:04 np0005604215.novalocal sudo[18555]: pam_unix(sudo:session): session closed for user root
Feb 01 06:59:04 np0005604215.novalocal sudo[18598]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eibqjkbfnccpwtkuykxrsprbpkgqpydp ; /usr/bin/python3
Feb 01 06:59:04 np0005604215.novalocal sudo[18598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:59:04 np0005604215.novalocal python3[18600]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769929144.0698407-136-80285915147853/source dest=/root/.ssh/id_rsa mode=384 owner=root force=False _original_basename=fade19abcb7148119bae13ccbb795d6e_id_rsa follow=False checksum=1450e921e2d17379ea725f99be2eea1fb6e75a52 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:59:04 np0005604215.novalocal sudo[18598]: pam_unix(sudo:session): session closed for user root
Feb 01 06:59:05 np0005604215.novalocal sudo[18660]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwitigkcywfuqdlsbtlcfrwxtrncydzo ; /usr/bin/python3
Feb 01 06:59:05 np0005604215.novalocal sudo[18660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:59:06 np0005604215.novalocal python3[18662]: ansible-ansible.legacy.stat Invoked with path=/root/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:59:06 np0005604215.novalocal sudo[18660]: pam_unix(sudo:session): session closed for user root
Feb 01 06:59:06 np0005604215.novalocal sudo[18703]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blzfivnzvjoovdtxjijgzouykakfhixg ; /usr/bin/python3
Feb 01 06:59:06 np0005604215.novalocal sudo[18703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:59:06 np0005604215.novalocal python3[18705]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769929145.7265825-226-260773617685089/source dest=/root/.ssh/id_rsa.pub mode=420 owner=root force=False _original_basename=fade19abcb7148119bae13ccbb795d6e_id_rsa.pub follow=False checksum=ad19e951a009809a91d74da158b058ce7df88458 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:59:06 np0005604215.novalocal sudo[18703]: pam_unix(sudo:session): session closed for user root
Feb 01 06:59:08 np0005604215.novalocal sudo[18733]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izgeasqypyrvtvwfvzsfelpnvjmauinb ; /usr/bin/python3
Feb 01 06:59:08 np0005604215.novalocal sudo[18733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 06:59:08 np0005604215.novalocal python3[18735]: ansible-ansible.builtin.file Invoked with path=/etc/nodepool state=directory mode=0777 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:59:08 np0005604215.novalocal sudo[18733]: pam_unix(sudo:session): session closed for user root
Feb 01 06:59:09 np0005604215.novalocal python3[18781]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:59:09 np0005604215.novalocal python3[18797]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes _original_basename=tmp14i9g_e6 recurse=False state=file path=/etc/nodepool/sub_nodes force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:59:10 np0005604215.novalocal python3[18857]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:59:11 np0005604215.novalocal python3[18873]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/sub_nodes_private _original_basename=tmpypscrmh9 recurse=False state=file path=/etc/nodepool/sub_nodes_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:59:12 np0005604215.novalocal python3[18933]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 06:59:12 np0005604215.novalocal python3[18949]: ansible-ansible.legacy.file Invoked with dest=/etc/nodepool/node_private _original_basename=tmptx2asf_z recurse=False state=file path=/etc/nodepool/node_private force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 06:59:13 np0005604215.novalocal sshd[18472]: pam_unix(sshd:session): session closed for user zuul
Feb 01 06:59:13 np0005604215.novalocal systemd[1]: session-8.scope: Deactivated successfully.
Feb 01 06:59:13 np0005604215.novalocal systemd[1]: session-8.scope: Consumed 3.475s CPU time.
Feb 01 06:59:13 np0005604215.novalocal systemd-logind[761]: Session 8 logged out. Waiting for processes to exit.
Feb 01 06:59:13 np0005604215.novalocal systemd-logind[761]: Removed session 8.
Feb 01 07:01:01 np0005604215.novalocal CROND[18966]: (root) CMD (run-parts /etc/cron.hourly)
Feb 01 07:01:01 np0005604215.novalocal run-parts[18969]: (/etc/cron.hourly) starting 0anacron
Feb 01 07:01:01 np0005604215.novalocal anacron[18977]: Anacron started on 2026-02-01
Feb 01 07:01:01 np0005604215.novalocal anacron[18977]: Will run job `cron.daily' in 40 min.
Feb 01 07:01:01 np0005604215.novalocal anacron[18977]: Will run job `cron.weekly' in 60 min.
Feb 01 07:01:01 np0005604215.novalocal anacron[18977]: Will run job `cron.monthly' in 80 min.
Feb 01 07:01:01 np0005604215.novalocal anacron[18977]: Jobs will be executed sequentially
Feb 01 07:01:01 np0005604215.novalocal run-parts[18979]: (/etc/cron.hourly) finished 0anacron
Feb 01 07:01:01 np0005604215.novalocal CROND[18965]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 01 07:01:14 np0005604215.novalocal sshd[18980]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:15 np0005604215.novalocal sshd[18980]: Accepted publickey for zuul from 38.102.83.41 port 46328 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 07:01:15 np0005604215.novalocal systemd-logind[761]: New session 9 of user zuul.
Feb 01 07:01:15 np0005604215.novalocal systemd[1]: Started Session 9 of User zuul.
Feb 01 07:01:15 np0005604215.novalocal sshd[18980]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 07:01:15 np0005604215.novalocal python3[19026]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:01:19 np0005604215.novalocal sshd[19028]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:22 np0005604215.novalocal sshd[19028]: error: maximum authentication attempts exceeded for root from 218.102.181.149 port 48768 ssh2 [preauth]
Feb 01 07:01:22 np0005604215.novalocal sshd[19028]: Disconnecting authenticating user root 218.102.181.149 port 48768: Too many authentication failures [preauth]
Feb 01 07:01:22 np0005604215.novalocal sshd[19030]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:27 np0005604215.novalocal sshd[19030]: error: maximum authentication attempts exceeded for root from 218.102.181.149 port 49286 ssh2 [preauth]
Feb 01 07:01:27 np0005604215.novalocal sshd[19030]: Disconnecting authenticating user root 218.102.181.149 port 49286: Too many authentication failures [preauth]
Feb 01 07:01:27 np0005604215.novalocal sshd[19032]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:30 np0005604215.novalocal sshd[19032]: error: maximum authentication attempts exceeded for root from 218.102.181.149 port 49948 ssh2 [preauth]
Feb 01 07:01:30 np0005604215.novalocal sshd[19032]: Disconnecting authenticating user root 218.102.181.149 port 49948: Too many authentication failures [preauth]
Feb 01 07:01:31 np0005604215.novalocal sshd[19034]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:34 np0005604215.novalocal sshd[19034]: Received disconnect from 218.102.181.149 port 50480:11: disconnected by user [preauth]
Feb 01 07:01:34 np0005604215.novalocal sshd[19034]: Disconnected from authenticating user root 218.102.181.149 port 50480 [preauth]
Feb 01 07:01:34 np0005604215.novalocal sshd[19036]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:36 np0005604215.novalocal sshd[19036]: Invalid user admin from 218.102.181.149 port 50876
Feb 01 07:01:37 np0005604215.novalocal sshd[19036]: error: maximum authentication attempts exceeded for invalid user admin from 218.102.181.149 port 50876 ssh2 [preauth]
Feb 01 07:01:37 np0005604215.novalocal sshd[19036]: Disconnecting invalid user admin 218.102.181.149 port 50876: Too many authentication failures [preauth]
Feb 01 07:01:38 np0005604215.novalocal sshd[19038]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:41 np0005604215.novalocal sshd[19038]: Invalid user admin from 218.102.181.149 port 51496
Feb 01 07:01:42 np0005604215.novalocal sshd[19038]: error: maximum authentication attempts exceeded for invalid user admin from 218.102.181.149 port 51496 ssh2 [preauth]
Feb 01 07:01:42 np0005604215.novalocal sshd[19038]: Disconnecting invalid user admin 218.102.181.149 port 51496: Too many authentication failures [preauth]
Feb 01 07:01:42 np0005604215.novalocal sshd[19040]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:44 np0005604215.novalocal sshd[19040]: Invalid user admin from 218.102.181.149 port 52062
Feb 01 07:01:45 np0005604215.novalocal sshd[19040]: Received disconnect from 218.102.181.149 port 52062:11: disconnected by user [preauth]
Feb 01 07:01:45 np0005604215.novalocal sshd[19040]: Disconnected from invalid user admin 218.102.181.149 port 52062 [preauth]
Feb 01 07:01:45 np0005604215.novalocal sshd[19042]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:48 np0005604215.novalocal sshd[19042]: Invalid user oracle from 218.102.181.149 port 52484
Feb 01 07:01:49 np0005604215.novalocal sshd[19042]: error: maximum authentication attempts exceeded for invalid user oracle from 218.102.181.149 port 52484 ssh2 [preauth]
Feb 01 07:01:49 np0005604215.novalocal sshd[19042]: Disconnecting invalid user oracle 218.102.181.149 port 52484: Too many authentication failures [preauth]
Feb 01 07:01:49 np0005604215.novalocal sshd[19044]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:52 np0005604215.novalocal sshd[19044]: Invalid user oracle from 218.102.181.149 port 53076
Feb 01 07:01:53 np0005604215.novalocal sshd[19044]: error: maximum authentication attempts exceeded for invalid user oracle from 218.102.181.149 port 53076 ssh2 [preauth]
Feb 01 07:01:53 np0005604215.novalocal sshd[19044]: Disconnecting invalid user oracle 218.102.181.149 port 53076: Too many authentication failures [preauth]
Feb 01 07:01:53 np0005604215.novalocal sshd[19046]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:55 np0005604215.novalocal sshd[19046]: Invalid user oracle from 218.102.181.149 port 53578
Feb 01 07:01:56 np0005604215.novalocal sshd[19046]: Received disconnect from 218.102.181.149 port 53578:11: disconnected by user [preauth]
Feb 01 07:01:56 np0005604215.novalocal sshd[19046]: Disconnected from invalid user oracle 218.102.181.149 port 53578 [preauth]
Feb 01 07:01:56 np0005604215.novalocal sshd[19048]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:01:59 np0005604215.novalocal sshd[19048]: Invalid user usuario from 218.102.181.149 port 53998
Feb 01 07:02:00 np0005604215.novalocal sshd[19048]: error: maximum authentication attempts exceeded for invalid user usuario from 218.102.181.149 port 53998 ssh2 [preauth]
Feb 01 07:02:00 np0005604215.novalocal sshd[19048]: Disconnecting invalid user usuario 218.102.181.149 port 53998: Too many authentication failures [preauth]
Feb 01 07:02:00 np0005604215.novalocal sshd[19050]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:03 np0005604215.novalocal sshd[19050]: Invalid user usuario from 218.102.181.149 port 54584
Feb 01 07:02:04 np0005604215.novalocal sshd[19050]: error: maximum authentication attempts exceeded for invalid user usuario from 218.102.181.149 port 54584 ssh2 [preauth]
Feb 01 07:02:04 np0005604215.novalocal sshd[19050]: Disconnecting invalid user usuario 218.102.181.149 port 54584: Too many authentication failures [preauth]
Feb 01 07:02:04 np0005604215.novalocal sshd[19052]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:07 np0005604215.novalocal sshd[19052]: Invalid user usuario from 218.102.181.149 port 55112
Feb 01 07:02:07 np0005604215.novalocal sshd[19052]: Received disconnect from 218.102.181.149 port 55112:11: disconnected by user [preauth]
Feb 01 07:02:07 np0005604215.novalocal sshd[19052]: Disconnected from invalid user usuario 218.102.181.149 port 55112 [preauth]
Feb 01 07:02:07 np0005604215.novalocal sshd[19054]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:09 np0005604215.novalocal sshd[19054]: Invalid user test from 218.102.181.149 port 55562
Feb 01 07:02:11 np0005604215.novalocal sshd[19054]: error: maximum authentication attempts exceeded for invalid user test from 218.102.181.149 port 55562 ssh2 [preauth]
Feb 01 07:02:11 np0005604215.novalocal sshd[19054]: Disconnecting invalid user test 218.102.181.149 port 55562: Too many authentication failures [preauth]
Feb 01 07:02:11 np0005604215.novalocal sshd[19056]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:14 np0005604215.novalocal sshd[19056]: Invalid user test from 218.102.181.149 port 56078
Feb 01 07:02:15 np0005604215.novalocal sshd[19056]: error: maximum authentication attempts exceeded for invalid user test from 218.102.181.149 port 56078 ssh2 [preauth]
Feb 01 07:02:15 np0005604215.novalocal sshd[19056]: Disconnecting invalid user test 218.102.181.149 port 56078: Too many authentication failures [preauth]
Feb 01 07:02:15 np0005604215.novalocal sshd[19058]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:17 np0005604215.novalocal sshd[19058]: Invalid user test from 218.102.181.149 port 56656
Feb 01 07:02:18 np0005604215.novalocal sshd[19058]: Received disconnect from 218.102.181.149 port 56656:11: disconnected by user [preauth]
Feb 01 07:02:18 np0005604215.novalocal sshd[19058]: Disconnected from invalid user test 218.102.181.149 port 56656 [preauth]
Feb 01 07:02:18 np0005604215.novalocal sshd[19060]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:20 np0005604215.novalocal sshd[19060]: Invalid user user from 218.102.181.149 port 57024
Feb 01 07:02:21 np0005604215.novalocal sshd[19060]: error: maximum authentication attempts exceeded for invalid user user from 218.102.181.149 port 57024 ssh2 [preauth]
Feb 01 07:02:21 np0005604215.novalocal sshd[19060]: Disconnecting invalid user user 218.102.181.149 port 57024: Too many authentication failures [preauth]
Feb 01 07:02:22 np0005604215.novalocal sshd[19062]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:25 np0005604215.novalocal sshd[19062]: Invalid user user from 218.102.181.149 port 57594
Feb 01 07:02:26 np0005604215.novalocal sshd[19062]: error: maximum authentication attempts exceeded for invalid user user from 218.102.181.149 port 57594 ssh2 [preauth]
Feb 01 07:02:26 np0005604215.novalocal sshd[19062]: Disconnecting invalid user user 218.102.181.149 port 57594: Too many authentication failures [preauth]
Feb 01 07:02:27 np0005604215.novalocal sshd[19064]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:29 np0005604215.novalocal sshd[19064]: Invalid user user from 218.102.181.149 port 58288
Feb 01 07:02:30 np0005604215.novalocal sshd[19064]: Received disconnect from 218.102.181.149 port 58288:11: disconnected by user [preauth]
Feb 01 07:02:30 np0005604215.novalocal sshd[19064]: Disconnected from invalid user user 218.102.181.149 port 58288 [preauth]
Feb 01 07:02:30 np0005604215.novalocal sshd[19066]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:33 np0005604215.novalocal sshd[19066]: Invalid user ftpuser from 218.102.181.149 port 58804
Feb 01 07:02:34 np0005604215.novalocal sshd[19066]: error: maximum authentication attempts exceeded for invalid user ftpuser from 218.102.181.149 port 58804 ssh2 [preauth]
Feb 01 07:02:34 np0005604215.novalocal sshd[19066]: Disconnecting invalid user ftpuser 218.102.181.149 port 58804: Too many authentication failures [preauth]
Feb 01 07:02:34 np0005604215.novalocal sshd[19068]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:37 np0005604215.novalocal sshd[19068]: Invalid user ftpuser from 218.102.181.149 port 59380
Feb 01 07:02:38 np0005604215.novalocal sshd[19068]: error: maximum authentication attempts exceeded for invalid user ftpuser from 218.102.181.149 port 59380 ssh2 [preauth]
Feb 01 07:02:38 np0005604215.novalocal sshd[19068]: Disconnecting invalid user ftpuser 218.102.181.149 port 59380: Too many authentication failures [preauth]
Feb 01 07:02:39 np0005604215.novalocal sshd[19070]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:39 np0005604215.novalocal sshd[19072]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:41 np0005604215.novalocal sshd[19070]: Invalid user ftpuser from 218.102.181.149 port 59986
Feb 01 07:02:42 np0005604215.novalocal sshd[19070]: Received disconnect from 218.102.181.149 port 59986:11: disconnected by user [preauth]
Feb 01 07:02:42 np0005604215.novalocal sshd[19070]: Disconnected from invalid user ftpuser 218.102.181.149 port 59986 [preauth]
Feb 01 07:02:42 np0005604215.novalocal sshd[19074]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:44 np0005604215.novalocal sshd[19074]: Invalid user test1 from 218.102.181.149 port 60470
Feb 01 07:02:46 np0005604215.novalocal sshd[19074]: error: maximum authentication attempts exceeded for invalid user test1 from 218.102.181.149 port 60470 ssh2 [preauth]
Feb 01 07:02:46 np0005604215.novalocal sshd[19074]: Disconnecting invalid user test1 218.102.181.149 port 60470: Too many authentication failures [preauth]
Feb 01 07:02:46 np0005604215.novalocal sshd[19076]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:49 np0005604215.novalocal sshd[19076]: Invalid user test1 from 218.102.181.149 port 32796
Feb 01 07:02:50 np0005604215.novalocal sshd[19076]: error: maximum authentication attempts exceeded for invalid user test1 from 218.102.181.149 port 32796 ssh2 [preauth]
Feb 01 07:02:50 np0005604215.novalocal sshd[19076]: Disconnecting invalid user test1 218.102.181.149 port 32796: Too many authentication failures [preauth]
Feb 01 07:02:50 np0005604215.novalocal sshd[19079]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:52 np0005604215.novalocal sshd[19079]: Invalid user test1 from 218.102.181.149 port 33372
Feb 01 07:02:53 np0005604215.novalocal sshd[19079]: Received disconnect from 218.102.181.149 port 33372:11: disconnected by user [preauth]
Feb 01 07:02:53 np0005604215.novalocal sshd[19079]: Disconnected from invalid user test1 218.102.181.149 port 33372 [preauth]
Feb 01 07:02:53 np0005604215.novalocal sshd[19081]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:02:55 np0005604215.novalocal sshd[19072]: Connection closed by 206.168.34.209 port 25828 [preauth]
Feb 01 07:02:57 np0005604215.novalocal sshd[19081]: Invalid user test2 from 218.102.181.149 port 33752
Feb 01 07:02:58 np0005604215.novalocal sshd[19081]: error: maximum authentication attempts exceeded for invalid user test2 from 218.102.181.149 port 33752 ssh2 [preauth]
Feb 01 07:02:58 np0005604215.novalocal sshd[19081]: Disconnecting invalid user test2 218.102.181.149 port 33752: Too many authentication failures [preauth]
Feb 01 07:02:58 np0005604215.novalocal sshd[19083]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:03:01 np0005604215.novalocal sshd[19083]: Invalid user test2 from 218.102.181.149 port 34456
Feb 01 07:03:03 np0005604215.novalocal sshd[19083]: error: maximum authentication attempts exceeded for invalid user test2 from 218.102.181.149 port 34456 ssh2 [preauth]
Feb 01 07:03:03 np0005604215.novalocal sshd[19083]: Disconnecting invalid user test2 218.102.181.149 port 34456: Too many authentication failures [preauth]
Feb 01 07:03:03 np0005604215.novalocal sshd[19085]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:03:06 np0005604215.novalocal sshd[19085]: Invalid user test2 from 218.102.181.149 port 35132
Feb 01 07:03:06 np0005604215.novalocal sshd[19085]: Received disconnect from 218.102.181.149 port 35132:11: disconnected by user [preauth]
Feb 01 07:03:06 np0005604215.novalocal sshd[19085]: Disconnected from invalid user test2 218.102.181.149 port 35132 [preauth]
Feb 01 07:03:06 np0005604215.novalocal sshd[19087]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:03:09 np0005604215.novalocal sshd[19087]: Invalid user ubuntu from 218.102.181.149 port 35632
Feb 01 07:03:10 np0005604215.novalocal sshd[19087]: error: maximum authentication attempts exceeded for invalid user ubuntu from 218.102.181.149 port 35632 ssh2 [preauth]
Feb 01 07:03:10 np0005604215.novalocal sshd[19087]: Disconnecting invalid user ubuntu 218.102.181.149 port 35632: Too many authentication failures [preauth]
Feb 01 07:03:11 np0005604215.novalocal sshd[19089]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:03:13 np0005604215.novalocal sshd[19089]: Invalid user ubuntu from 218.102.181.149 port 36256
Feb 01 07:03:15 np0005604215.novalocal sshd[19089]: error: maximum authentication attempts exceeded for invalid user ubuntu from 218.102.181.149 port 36256 ssh2 [preauth]
Feb 01 07:03:15 np0005604215.novalocal sshd[19089]: Disconnecting invalid user ubuntu 218.102.181.149 port 36256: Too many authentication failures [preauth]
Feb 01 07:03:15 np0005604215.novalocal sshd[19091]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:03:18 np0005604215.novalocal sshd[19091]: Invalid user ubuntu from 218.102.181.149 port 36868
Feb 01 07:03:19 np0005604215.novalocal sshd[19091]: Received disconnect from 218.102.181.149 port 36868:11: disconnected by user [preauth]
Feb 01 07:03:19 np0005604215.novalocal sshd[19091]: Disconnected from invalid user ubuntu 218.102.181.149 port 36868 [preauth]
Feb 01 07:03:19 np0005604215.novalocal sshd[19093]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:03:21 np0005604215.novalocal sshd[19093]: Invalid user pi from 218.102.181.149 port 37414
Feb 01 07:03:22 np0005604215.novalocal sshd[19093]: Received disconnect from 218.102.181.149 port 37414:11: disconnected by user [preauth]
Feb 01 07:03:22 np0005604215.novalocal sshd[19093]: Disconnected from invalid user pi 218.102.181.149 port 37414 [preauth]
Feb 01 07:03:23 np0005604215.novalocal sshd[19095]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:03:25 np0005604215.novalocal sshd[19095]: Invalid user baikal from 218.102.181.149 port 37930
Feb 01 07:03:26 np0005604215.novalocal sshd[19095]: Received disconnect from 218.102.181.149 port 37930:11: disconnected by user [preauth]
Feb 01 07:03:26 np0005604215.novalocal sshd[19095]: Disconnected from invalid user baikal 218.102.181.149 port 37930 [preauth]
Feb 01 07:06:14 np0005604215.novalocal sshd[18983]: Received disconnect from 38.102.83.41 port 46328:11: disconnected by user
Feb 01 07:06:14 np0005604215.novalocal sshd[18983]: Disconnected from user zuul 38.102.83.41 port 46328
Feb 01 07:06:14 np0005604215.novalocal sshd[18980]: pam_unix(sshd:session): session closed for user zuul
Feb 01 07:06:14 np0005604215.novalocal systemd[1]: session-9.scope: Deactivated successfully.
Feb 01 07:06:14 np0005604215.novalocal systemd-logind[761]: Session 9 logged out. Waiting for processes to exit.
Feb 01 07:06:14 np0005604215.novalocal systemd-logind[761]: Removed session 9.
Feb 01 07:12:50 np0005604215.novalocal sshd[19101]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:12:50 np0005604215.novalocal sshd[19101]: Accepted publickey for zuul from 38.102.83.114 port 57340 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 07:12:50 np0005604215.novalocal systemd-logind[761]: New session 10 of user zuul.
Feb 01 07:12:50 np0005604215.novalocal systemd[1]: Started Session 10 of User zuul.
Feb 01 07:12:50 np0005604215.novalocal sshd[19101]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 07:12:50 np0005604215.novalocal python3[19118]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/redhat-release zuul_log_id=fa163ef9-e89a-37fb-68e8-00000000000c-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:12:53 np0005604215.novalocal sudo[19136]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjgwpcvwbvwdwhdrsasawvlcmndmpjna ; /usr/bin/python3
Feb 01 07:12:53 np0005604215.novalocal sudo[19136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:12:53 np0005604215.novalocal python3[19138]: ansible-ansible.legacy.command Invoked with _raw_params=yum clean all zuul_log_id=fa163ef9-e89a-37fb-68e8-00000000000d-1-overcloudnovacompute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:12:55 np0005604215.novalocal sudo[19136]: pam_unix(sudo:session): session closed for user root
Feb 01 07:13:23 np0005604215.novalocal sudo[19155]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msdmxezzesvbzkrliypbflsncijnnqhl ; /usr/bin/python3
Feb 01 07:13:23 np0005604215.novalocal sudo[19155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:13:24 np0005604215.novalocal python3[19157]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-baseos-eus-rpms'] state=enabled purge=False
Feb 01 07:13:26 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:13:27 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:13:50 np0005604215.novalocal sudo[19155]: pam_unix(sudo:session): session closed for user root
Feb 01 07:13:57 np0005604215.novalocal sudo[19313]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbxneanemxhduicgwfhddoxgvzwnirqy ; /usr/bin/python3
Feb 01 07:13:57 np0005604215.novalocal sudo[19313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:13:57 np0005604215.novalocal python3[19315]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-appstream-eus-rpms'] state=enabled purge=False
Feb 01 07:13:59 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:14:00 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:14:01 np0005604215.novalocal sudo[19313]: pam_unix(sudo:session): session closed for user root
Feb 01 07:14:16 np0005604215.novalocal sudo[19454]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztdzkhhupfdgqcvmjpfnnwmiwozuzajo ; /usr/bin/python3
Feb 01 07:14:16 np0005604215.novalocal sudo[19454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:14:16 np0005604215.novalocal python3[19456]: ansible-community.general.rhsm_repository Invoked with name=['rhel-9-for-x86_64-highavailability-eus-rpms'] state=enabled purge=False
Feb 01 07:14:18 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:14:18 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:14:23 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:14:23 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:14:29 np0005604215.novalocal sudo[19454]: pam_unix(sudo:session): session closed for user root
Feb 01 07:14:47 np0005604215.novalocal sudo[19731]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovdwtjanhvnmnvwyeyodernshnjscuza ; /usr/bin/python3
Feb 01 07:14:47 np0005604215.novalocal sudo[19731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:14:47 np0005604215.novalocal python3[19733]: ansible-community.general.rhsm_repository Invoked with name=['fast-datapath-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Feb 01 07:14:49 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:14:49 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:14:54 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:14:54 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:15:01 np0005604215.novalocal sudo[19731]: pam_unix(sudo:session): session closed for user root
Feb 01 07:15:16 np0005604215.novalocal sudo[20067]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-viidwmmgnryymjrkoeqeqdlfshdtthyw ; /usr/bin/python3
Feb 01 07:15:16 np0005604215.novalocal sudo[20067]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:15:16 np0005604215.novalocal python3[20069]: ansible-community.general.rhsm_repository Invoked with name=['openstack-17.1-for-rhel-9-x86_64-rpms'] state=enabled purge=False
Feb 01 07:15:19 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:15:24 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:15:24 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:15:31 np0005604215.novalocal sudo[20067]: pam_unix(sudo:session): session closed for user root
Feb 01 07:15:34 np0005604215.novalocal sudo[20463]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwnpopxieajwpxtwucgugnoqadmwupnq ; /usr/bin/python3
Feb 01 07:15:34 np0005604215.novalocal sudo[20463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:15:34 np0005604215.novalocal python3[20465]: ansible-ansible.legacy.command Invoked with _raw_params=yum repolist --enabled
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-000000000013-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:15:36 np0005604215.novalocal sudo[20463]: pam_unix(sudo:session): session closed for user root
Feb 01 07:16:04 np0005604215.novalocal sudo[20482]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aydhypoixsmjwtkkcfnwitpdnbunvsdq ; /usr/bin/python3
Feb 01 07:16:04 np0005604215.novalocal sudo[20482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:16:05 np0005604215.novalocal python3[20484]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch', 'os-net-config', 'ansible-core'] state=present update_cache=True allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:16:26 np0005604215.novalocal kernel: SELinux:  Converting 487 SID table entries...
Feb 01 07:16:26 np0005604215.novalocal kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 07:16:26 np0005604215.novalocal kernel: SELinux:  policy capability open_perms=1
Feb 01 07:16:26 np0005604215.novalocal kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 07:16:26 np0005604215.novalocal kernel: SELinux:  policy capability always_check_network=0
Feb 01 07:16:26 np0005604215.novalocal kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 07:16:26 np0005604215.novalocal kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 07:16:26 np0005604215.novalocal kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 07:16:26 np0005604215.novalocal groupadd[20583]: group added to /etc/group: name=unbound, GID=987
Feb 01 07:16:26 np0005604215.novalocal groupadd[20583]: group added to /etc/gshadow: name=unbound
Feb 01 07:16:26 np0005604215.novalocal groupadd[20583]: new group: name=unbound, GID=987
Feb 01 07:16:26 np0005604215.novalocal useradd[20590]: new user: name=unbound, UID=987, GID=987, home=/etc/unbound, shell=/sbin/nologin, from=none
Feb 01 07:16:26 np0005604215.novalocal dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=4 res=1
Feb 01 07:16:26 np0005604215.novalocal systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Feb 01 07:16:26 np0005604215.novalocal groupadd[20603]: group added to /etc/group: name=openvswitch, GID=986
Feb 01 07:16:26 np0005604215.novalocal groupadd[20603]: group added to /etc/gshadow: name=openvswitch
Feb 01 07:16:26 np0005604215.novalocal groupadd[20603]: new group: name=openvswitch, GID=986
Feb 01 07:16:26 np0005604215.novalocal useradd[20610]: new user: name=openvswitch, UID=986, GID=986, home=/, shell=/sbin/nologin, from=none
Feb 01 07:16:27 np0005604215.novalocal groupadd[20618]: group added to /etc/group: name=hugetlbfs, GID=985
Feb 01 07:16:27 np0005604215.novalocal groupadd[20618]: group added to /etc/gshadow: name=hugetlbfs
Feb 01 07:16:27 np0005604215.novalocal groupadd[20618]: new group: name=hugetlbfs, GID=985
Feb 01 07:16:27 np0005604215.novalocal usermod[20626]: add 'openvswitch' to group 'hugetlbfs'
Feb 01 07:16:27 np0005604215.novalocal usermod[20626]: add 'openvswitch' to shadow group 'hugetlbfs'
Feb 01 07:16:29 np0005604215.novalocal systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 07:16:29 np0005604215.novalocal systemd[1]: Starting man-db-cache-update.service...
Feb 01 07:16:29 np0005604215.novalocal systemd[1]: Reloading.
Feb 01 07:16:29 np0005604215.novalocal systemd-rc-local-generator[21148]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:16:29 np0005604215.novalocal systemd-sysv-generator[21153]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:16:29 np0005604215.novalocal systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:16:30 np0005604215.novalocal systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 07:16:30 np0005604215.novalocal systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 07:16:30 np0005604215.novalocal systemd[1]: Finished man-db-cache-update.service.
Feb 01 07:16:30 np0005604215.novalocal systemd[1]: run-r6a24df3bb2174e4892da2442aa4c6682.service: Deactivated successfully.
Feb 01 07:16:31 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:16:31 np0005604215.novalocal sudo[20482]: pam_unix(sudo:session): session closed for user root
Feb 01 07:16:31 np0005604215.novalocal rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:16:43 np0005604215.novalocal sudo[21815]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xywbrawqmysdvnqnxpvwczpnkvainbdh ; /usr/bin/python3
Feb 01 07:16:43 np0005604215.novalocal sudo[21815]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:16:43 np0005604215.novalocal python3[21817]: ansible-ansible.legacy.command Invoked with _raw_params=ansible-galaxy collection install ansible.posix
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-000000000015-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:16:54 np0005604215.novalocal sudo[21815]: pam_unix(sudo:session): session closed for user root
Feb 01 07:16:55 np0005604215.novalocal sudo[21836]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewyyegcakmnxpyknhlnzhfqhfgitubtb ; /usr/bin/python3
Feb 01 07:16:55 np0005604215.novalocal sudo[21836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:16:55 np0005604215.novalocal python3[21838]: ansible-ansible.builtin.file Invoked with path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:16:55 np0005604215.novalocal sudo[21836]: pam_unix(sudo:session): session closed for user root
Feb 01 07:16:57 np0005604215.novalocal sudo[21884]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uqtqsijhqbvlimcrffoqwvyvcepozrwa ; /usr/bin/python3
Feb 01 07:16:57 np0005604215.novalocal sudo[21884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:16:57 np0005604215.novalocal python3[21886]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/tripleo_config.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:16:57 np0005604215.novalocal sudo[21884]: pam_unix(sudo:session): session closed for user root
Feb 01 07:16:57 np0005604215.novalocal sudo[21927]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltuiwoejjnjxgkxsvwfehwtiokwaziqu ; /usr/bin/python3
Feb 01 07:16:57 np0005604215.novalocal sudo[21927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:16:57 np0005604215.novalocal python3[21929]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769930217.093162-334-141113391887851/source dest=/etc/os-net-config/tripleo_config.yaml mode=None follow=False _original_basename=overcloud_net_config.j2 checksum=91bc45728dd9738fc644e3ada9d8642294da29ff backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:16:57 np0005604215.novalocal sudo[21927]: pam_unix(sudo:session): session closed for user root
Feb 01 07:16:59 np0005604215.novalocal sudo[21957]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkesicbeicyaqmgshavvizjcovfsnsze ; /usr/bin/python3
Feb 01 07:16:59 np0005604215.novalocal sudo[21957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:16:59 np0005604215.novalocal python3[21959]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network  state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 01 07:16:59 np0005604215.novalocal sudo[21957]: pam_unix(sudo:session): session closed for user root
Feb 01 07:16:59 np0005604215.novalocal systemd-journald[619]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 89.2 (297 of 333 items), suggesting rotation.
Feb 01 07:16:59 np0005604215.novalocal systemd-journald[619]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 01 07:16:59 np0005604215.novalocal rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 07:16:59 np0005604215.novalocal rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 07:16:59 np0005604215.novalocal sudo[21978]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekccaofkhjwnywhggclmepatejkejdcj ; /usr/bin/python3
Feb 01 07:16:59 np0005604215.novalocal sudo[21978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:16:59 np0005604215.novalocal python3[21980]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-20 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 01 07:17:00 np0005604215.novalocal sudo[21978]: pam_unix(sudo:session): session closed for user root
Feb 01 07:17:00 np0005604215.novalocal sudo[21998]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjnfwhxqgvqqrqupjywyrevzavohnohe ; /usr/bin/python3
Feb 01 07:17:00 np0005604215.novalocal sudo[21998]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:17:00 np0005604215.novalocal python3[22000]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-21 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 01 07:17:00 np0005604215.novalocal sudo[21998]: pam_unix(sudo:session): session closed for user root
Feb 01 07:17:00 np0005604215.novalocal sudo[22018]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wipuwgqwvpglcudfsauekmctuzdlhgyh ; /usr/bin/python3
Feb 01 07:17:00 np0005604215.novalocal sudo[22018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:17:00 np0005604215.novalocal python3[22020]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-22 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 01 07:17:00 np0005604215.novalocal sudo[22018]: pam_unix(sudo:session): session closed for user root
Feb 01 07:17:00 np0005604215.novalocal sudo[22038]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfqkpujlehdrumxmirtvobqitwgehtqe ; /usr/bin/python3
Feb 01 07:17:00 np0005604215.novalocal sudo[22038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:17:00 np0005604215.novalocal python3[22040]: ansible-community.general.nmcli Invoked with conn_name=ci-private-network-23 state=absent ignore_unsupported_suboptions=False autoconnect=True gw4_ignore_auto=False never_default4=False dns4_ignore_auto=False may_fail4=True gw6_ignore_auto=False dns6_ignore_auto=False mode=balance-rr stp=True priority=128 slavepriority=32 forwarddelay=15 hellotime=2 maxage=20 ageingtime=300 hairpin=False path_cost=100 runner=roundrobin master=None slave_type=None ifname=None type=None ip4=None gw4=None routes4=None routes4_extended=None route_metric4=None routing_rules4=None dns4=None dns4_search=None dns4_options=None method4=None dhcp_client_id=None ip6=None gw6=None dns6=None dns6_search=None dns6_options=None routes6=None routes6_extended=None route_metric6=None method6=None ip_privacy6=None addr_gen_mode6=None miimon=None downdelay=None updelay=None xmit_hash_policy=None arp_interval=None arp_ip_target=None primary=None mtu=None mac=None zone=None runner_hwaddr_policy=None runner_fast_rate=None vlanid=None vlandev=None flags=None ingress=None egress=None vxlan_id=None vxlan_local=None vxlan_remote=None ip_tunnel_dev=None ip_tunnel_local=None ip_tunnel_remote=None ip_tunnel_input_key=NOT_LOGGING_PARAMETER ip_tunnel_output_key=NOT_LOGGING_PARAMETER ssid=None wifi=None wifi_sec=NOT_LOGGING_PARAMETER gsm=None macvlan=None wireguard=None vpn=None transport_mode=None
Feb 01 07:17:00 np0005604215.novalocal sudo[22038]: pam_unix(sudo:session): session closed for user root
Feb 01 07:17:03 np0005604215.novalocal sudo[22058]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvoqhwfcykortzkgpolrotnyxprcjgps ; /usr/bin/python3
Feb 01 07:17:03 np0005604215.novalocal sudo[22058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:17:03 np0005604215.novalocal python3[22060]: ansible-ansible.builtin.systemd Invoked with name=network state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 07:17:03 np0005604215.novalocal systemd[1]: Starting LSB: Bring up/down networking...
Feb 01 07:17:03 np0005604215.novalocal network[22063]: WARN      : [network] You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 07:17:03 np0005604215.novalocal network[22074]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 07:17:03 np0005604215.novalocal network[22063]: WARN      : [network] 'network-scripts' will be removed from distribution in near future.
Feb 01 07:17:03 np0005604215.novalocal network[22075]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:17:03 np0005604215.novalocal network[22063]: WARN      : [network] It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 07:17:03 np0005604215.novalocal network[22076]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 07:17:03 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930223.9815] audit: op="connections-reload" pid=22104 uid=0 result="success"
Feb 01 07:17:04 np0005604215.novalocal network[22063]: Bringing up loopback interface:  [  OK  ]
Feb 01 07:17:04 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930224.1976] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth0" pid=22192 uid=0 result="success"
Feb 01 07:17:04 np0005604215.novalocal network[22063]: Bringing up interface eth0:  [  OK  ]
Feb 01 07:17:04 np0005604215.novalocal systemd[1]: Started LSB: Bring up/down networking.
Feb 01 07:17:04 np0005604215.novalocal sudo[22058]: pam_unix(sudo:session): session closed for user root
Feb 01 07:17:04 np0005604215.novalocal sudo[22231]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cihwwjsmoknycfqjwhkeiioownexhqso ; /usr/bin/python3
Feb 01 07:17:04 np0005604215.novalocal sudo[22231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:17:04 np0005604215.novalocal python3[22233]: ansible-ansible.builtin.systemd Invoked with name=openvswitch state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 07:17:04 np0005604215.novalocal systemd[1]: Starting Open vSwitch Database Unit...
Feb 01 07:17:04 np0005604215.novalocal chown[22237]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Feb 01 07:17:04 np0005604215.novalocal ovs-ctl[22242]: /etc/openvswitch/conf.db does not exist ... (warning).
Feb 01 07:17:04 np0005604215.novalocal ovs-ctl[22242]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Feb 01 07:17:04 np0005604215.novalocal ovs-ctl[22242]: Starting ovsdb-server [  OK  ]
Feb 01 07:17:04 np0005604215.novalocal ovs-vsctl[22291]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Feb 01 07:17:05 np0005604215.novalocal ovs-vsctl[22311]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.6-141.el9fdp "external-ids:system-id=\"f18e6148-4a7e-452d-80cb-72c86b59e439\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"rhel\"" "system-version=\"9.2\""
Feb 01 07:17:05 np0005604215.novalocal ovs-ctl[22242]: Configuring Open vSwitch system IDs [  OK  ]
Feb 01 07:17:05 np0005604215.novalocal ovs-ctl[22242]: Enabling remote OVSDB managers [  OK  ]
Feb 01 07:17:05 np0005604215.novalocal ovs-vsctl[22317]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005604215.novalocal
Feb 01 07:17:05 np0005604215.novalocal systemd[1]: Started Open vSwitch Database Unit.
Feb 01 07:17:05 np0005604215.novalocal systemd[1]: Starting Open vSwitch Delete Transient Ports...
Feb 01 07:17:05 np0005604215.novalocal systemd[1]: Finished Open vSwitch Delete Transient Ports.
Feb 01 07:17:05 np0005604215.novalocal systemd[1]: Starting Open vSwitch Forwarding Unit...
Feb 01 07:17:05 np0005604215.novalocal kernel: openvswitch: Open vSwitch switching datapath
Feb 01 07:17:05 np0005604215.novalocal ovs-ctl[22361]: Inserting openvswitch module [  OK  ]
Feb 01 07:17:05 np0005604215.novalocal ovs-ctl[22330]: Starting ovs-vswitchd [  OK  ]
Feb 01 07:17:05 np0005604215.novalocal ovs-vsctl[22380]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=np0005604215.novalocal
Feb 01 07:17:05 np0005604215.novalocal ovs-ctl[22330]: Enabling remote OVSDB managers [  OK  ]
Feb 01 07:17:05 np0005604215.novalocal systemd[1]: Started Open vSwitch Forwarding Unit.
Feb 01 07:17:05 np0005604215.novalocal systemd[1]: Starting Open vSwitch...
Feb 01 07:17:05 np0005604215.novalocal systemd[1]: Finished Open vSwitch.
Feb 01 07:17:05 np0005604215.novalocal sudo[22231]: pam_unix(sudo:session): session closed for user root
Feb 01 07:17:35 np0005604215.novalocal sudo[22397]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvnpmaepcvhayzeifgghudbbzamllpwe ; /usr/bin/python3
Feb 01 07:17:35 np0005604215.novalocal sudo[22397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:17:35 np0005604215.novalocal python3[22399]: ansible-ansible.legacy.command Invoked with _raw_params=os-net-config -c /etc/os-net-config/tripleo_config.yaml _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-00000000001a-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:17:36 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930256.7370] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22596 uid=0 result="success"
Feb 01 07:17:36 np0005604215.novalocal ifup[22597]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:17:36 np0005604215.novalocal ifup[22598]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:17:36 np0005604215.novalocal ifup[22599]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:17:36 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930256.7679] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22605 uid=0 result="success"
Feb 01 07:17:36 np0005604215.novalocal ovs-vsctl[22607]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --may-exist add-br br-ex -- set bridge br-ex other-config:mac-table-size=50000 -- set bridge br-ex other-config:hwaddr=fa:16:3e:17:45:d1 -- set bridge br-ex fail_mode=standalone -- del-controller br-ex
Feb 01 07:17:36 np0005604215.novalocal kernel: device ovs-system entered promiscuous mode
Feb 01 07:17:36 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930256.7955] manager: (ovs-system): new Generic device (/org/freedesktop/NetworkManager/Devices/4)
Feb 01 07:17:36 np0005604215.novalocal kernel: Timeout policy base is empty
Feb 01 07:17:36 np0005604215.novalocal kernel: Failed to associated timeout policy `ovs_test_tp'
Feb 01 07:17:36 np0005604215.novalocal systemd-udevd[22608]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 07:17:36 np0005604215.novalocal kernel: device br-ex entered promiscuous mode
Feb 01 07:17:36 np0005604215.novalocal systemd-udevd[22622]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 07:17:36 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930256.8385] manager: (br-ex): new Generic device (/org/freedesktop/NetworkManager/Devices/5)
Feb 01 07:17:36 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930256.8663] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22633 uid=0 result="success"
Feb 01 07:17:36 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930256.8869] device (br-ex): carrier: link connected
Feb 01 07:17:39 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930259.9429] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22662 uid=0 result="success"
Feb 01 07:17:39 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930259.9895] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22677 uid=0 result="success"
Feb 01 07:17:40 np0005604215.novalocal NET[22702]: /etc/sysconfig/network-scripts/ifup-post : updated /etc/resolv.conf
Feb 01 07:17:40 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930260.0770] device (eth1): state change: activated -> unmanaged (reason 'unmanaged', sys-iface-state: 'managed')
Feb 01 07:17:40 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930260.0883] dhcp4 (eth1): canceled DHCP transaction
Feb 01 07:17:40 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930260.0883] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Feb 01 07:17:40 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930260.0883] dhcp4 (eth1): state changed no lease
Feb 01 07:17:40 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930260.0916] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22711 uid=0 result="success"
Feb 01 07:17:40 np0005604215.novalocal ifup[22712]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:17:40 np0005604215.novalocal ifup[22713]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:17:40 np0005604215.novalocal systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 01 07:17:40 np0005604215.novalocal ifup[22714]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:17:40 np0005604215.novalocal systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 01 07:17:40 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930260.1273] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22728 uid=0 result="success"
Feb 01 07:17:40 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930260.2110] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22739 uid=0 result="success"
Feb 01 07:17:40 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930260.2191] device (eth1): carrier: link connected
Feb 01 07:17:40 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930260.2416] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22748 uid=0 result="success"
Feb 01 07:17:40 np0005604215.novalocal ipv6_wait_tentative[22760]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Feb 01 07:17:41 np0005604215.novalocal ipv6_wait_tentative[22765]: Waiting for interface eth1 IPv6 address(es) to leave the 'tentative' state
Feb 01 07:17:42 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930262.3067] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-eth1" pid=22774 uid=0 result="success"
Feb 01 07:17:42 np0005604215.novalocal ovs-vsctl[22789]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex eth1 -- add-port br-ex eth1
Feb 01 07:17:42 np0005604215.novalocal kernel: device eth1 entered promiscuous mode
Feb 01 07:17:42 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930262.3766] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22797 uid=0 result="success"
Feb 01 07:17:42 np0005604215.novalocal ifup[22798]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:17:42 np0005604215.novalocal ifup[22799]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:17:42 np0005604215.novalocal ifup[22800]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:17:42 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930262.4075] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-br-ex" pid=22806 uid=0 result="success"
Feb 01 07:17:42 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930262.4498] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22816 uid=0 result="success"
Feb 01 07:17:42 np0005604215.novalocal ifup[22817]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:17:42 np0005604215.novalocal ifup[22818]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:17:42 np0005604215.novalocal ifup[22819]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:17:42 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930262.4805] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22825 uid=0 result="success"
Feb 01 07:17:42 np0005604215.novalocal ovs-vsctl[22828]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Feb 01 07:17:42 np0005604215.novalocal kernel: device vlan22 entered promiscuous mode
Feb 01 07:17:42 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930262.5199] manager: (vlan22): new Generic device (/org/freedesktop/NetworkManager/Devices/6)
Feb 01 07:17:42 np0005604215.novalocal systemd-udevd[22830]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 07:17:42 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930262.5431] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22839 uid=0 result="success"
Feb 01 07:17:42 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930262.5639] device (vlan22): carrier: link connected
Feb 01 07:17:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930265.6325] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22868 uid=0 result="success"
Feb 01 07:17:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930265.6737] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=22883 uid=0 result="success"
Feb 01 07:17:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930265.7306] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22904 uid=0 result="success"
Feb 01 07:17:45 np0005604215.novalocal ifup[22905]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:17:45 np0005604215.novalocal ifup[22906]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:17:45 np0005604215.novalocal ifup[22907]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:17:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930265.7616] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22913 uid=0 result="success"
Feb 01 07:17:45 np0005604215.novalocal ovs-vsctl[22916]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Feb 01 07:17:45 np0005604215.novalocal systemd-udevd[22918]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 07:17:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930265.8032] manager: (vlan20): new Generic device (/org/freedesktop/NetworkManager/Devices/7)
Feb 01 07:17:45 np0005604215.novalocal kernel: device vlan20 entered promiscuous mode
Feb 01 07:17:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930265.8297] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22928 uid=0 result="success"
Feb 01 07:17:45 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930265.8496] device (vlan20): carrier: link connected
Feb 01 07:17:48 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930268.9072] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22958 uid=0 result="success"
Feb 01 07:17:48 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930268.9531] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=22973 uid=0 result="success"
Feb 01 07:17:49 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930269.0062] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=22994 uid=0 result="success"
Feb 01 07:17:49 np0005604215.novalocal ifup[22995]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:17:49 np0005604215.novalocal ifup[22996]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:17:49 np0005604215.novalocal ifup[22997]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:17:49 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930269.0363] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23003 uid=0 result="success"
Feb 01 07:17:49 np0005604215.novalocal ovs-vsctl[23006]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Feb 01 07:17:49 np0005604215.novalocal kernel: device vlan44 entered promiscuous mode
Feb 01 07:17:49 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930269.1099] manager: (vlan44): new Generic device (/org/freedesktop/NetworkManager/Devices/8)
Feb 01 07:17:49 np0005604215.novalocal systemd-udevd[23009]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 07:17:49 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930269.1273] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23018 uid=0 result="success"
Feb 01 07:17:49 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930269.1441] device (vlan44): carrier: link connected
Feb 01 07:17:50 np0005604215.novalocal systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 01 07:17:52 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930272.1978] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23049 uid=0 result="success"
Feb 01 07:17:52 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930272.2397] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23064 uid=0 result="success"
Feb 01 07:17:52 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930272.2964] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23085 uid=0 result="success"
Feb 01 07:17:52 np0005604215.novalocal ifup[23086]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:17:52 np0005604215.novalocal ifup[23087]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:17:52 np0005604215.novalocal ifup[23088]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:17:52 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930272.3256] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23094 uid=0 result="success"
Feb 01 07:17:52 np0005604215.novalocal ovs-vsctl[23097]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Feb 01 07:17:52 np0005604215.novalocal kernel: device vlan23 entered promiscuous mode
Feb 01 07:17:52 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930272.3636] manager: (vlan23): new Generic device (/org/freedesktop/NetworkManager/Devices/9)
Feb 01 07:17:52 np0005604215.novalocal systemd-udevd[23099]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 07:17:52 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930272.3909] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23109 uid=0 result="success"
Feb 01 07:17:52 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930272.4115] device (vlan23): carrier: link connected
Feb 01 07:17:55 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930275.4716] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23139 uid=0 result="success"
Feb 01 07:17:55 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930275.5206] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23154 uid=0 result="success"
Feb 01 07:17:55 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930275.5838] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23175 uid=0 result="success"
Feb 01 07:17:55 np0005604215.novalocal ifup[23176]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:17:55 np0005604215.novalocal ifup[23177]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:17:55 np0005604215.novalocal ifup[23178]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:17:55 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930275.6153] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23184 uid=0 result="success"
Feb 01 07:17:55 np0005604215.novalocal ovs-vsctl[23187]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Feb 01 07:17:55 np0005604215.novalocal systemd-udevd[23189]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 07:17:55 np0005604215.novalocal kernel: device vlan21 entered promiscuous mode
Feb 01 07:17:55 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930275.6576] manager: (vlan21): new Generic device (/org/freedesktop/NetworkManager/Devices/10)
Feb 01 07:17:55 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930275.6840] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23199 uid=0 result="success"
Feb 01 07:17:55 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930275.7063] device (vlan21): carrier: link connected
Feb 01 07:17:58 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930278.7584] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23229 uid=0 result="success"
Feb 01 07:17:58 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930278.8053] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23244 uid=0 result="success"
Feb 01 07:17:58 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930278.8658] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23265 uid=0 result="success"
Feb 01 07:17:58 np0005604215.novalocal ifup[23266]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:17:58 np0005604215.novalocal ifup[23267]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:17:58 np0005604215.novalocal ifup[23268]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:17:58 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930278.8986] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23274 uid=0 result="success"
Feb 01 07:17:58 np0005604215.novalocal ovs-vsctl[23277]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan44 -- add-port br-ex vlan44 tag=44 -- set Interface vlan44 type=internal
Feb 01 07:17:58 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930278.9620] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23284 uid=0 result="success"
Feb 01 07:18:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930280.0213] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23311 uid=0 result="success"
Feb 01 07:18:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930280.0686] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan44" pid=23326 uid=0 result="success"
Feb 01 07:18:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930280.1280] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23347 uid=0 result="success"
Feb 01 07:18:00 np0005604215.novalocal ifup[23348]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:18:00 np0005604215.novalocal ifup[23349]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:18:00 np0005604215.novalocal ifup[23350]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:18:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930280.1597] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23356 uid=0 result="success"
Feb 01 07:18:00 np0005604215.novalocal ovs-vsctl[23359]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan20 -- add-port br-ex vlan20 tag=20 -- set Interface vlan20 type=internal
Feb 01 07:18:00 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930280.2148] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23366 uid=0 result="success"
Feb 01 07:18:01 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930281.2793] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23394 uid=0 result="success"
Feb 01 07:18:01 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930281.3262] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan20" pid=23409 uid=0 result="success"
Feb 01 07:18:01 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930281.3843] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23430 uid=0 result="success"
Feb 01 07:18:01 np0005604215.novalocal ifup[23431]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:18:01 np0005604215.novalocal ifup[23432]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:18:01 np0005604215.novalocal ifup[23433]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:18:01 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930281.4156] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23439 uid=0 result="success"
Feb 01 07:18:01 np0005604215.novalocal ovs-vsctl[23442]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan21 -- add-port br-ex vlan21 tag=21 -- set Interface vlan21 type=internal
Feb 01 07:18:01 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930281.4751] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23449 uid=0 result="success"
Feb 01 07:18:02 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930282.5333] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23477 uid=0 result="success"
Feb 01 07:18:02 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930282.5773] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan21" pid=23492 uid=0 result="success"
Feb 01 07:18:02 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930282.6291] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23513 uid=0 result="success"
Feb 01 07:18:02 np0005604215.novalocal ifup[23514]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:18:02 np0005604215.novalocal ifup[23515]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:18:02 np0005604215.novalocal ifup[23516]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:18:02 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930282.6587] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23522 uid=0 result="success"
Feb 01 07:18:02 np0005604215.novalocal ovs-vsctl[23525]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan23 -- add-port br-ex vlan23 tag=23 -- set Interface vlan23 type=internal
Feb 01 07:18:02 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930282.7420] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23532 uid=0 result="success"
Feb 01 07:18:03 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930283.7972] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23560 uid=0 result="success"
Feb 01 07:18:03 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930283.8419] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan23" pid=23575 uid=0 result="success"
Feb 01 07:18:03 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930283.9013] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23596 uid=0 result="success"
Feb 01 07:18:03 np0005604215.novalocal ifup[23597]: You are using 'ifup' script provided by 'network-scripts', which are now deprecated.
Feb 01 07:18:03 np0005604215.novalocal ifup[23598]: 'network-scripts' will be removed from distribution in near future.
Feb 01 07:18:03 np0005604215.novalocal ifup[23599]: It is advised to switch to 'NetworkManager' instead - it provides 'ifup/ifdown' scripts as well.
Feb 01 07:18:03 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930283.9294] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23605 uid=0 result="success"
Feb 01 07:18:03 np0005604215.novalocal ovs-vsctl[23608]: ovs|00001|vsctl|INFO|Called as ovs-vsctl -t 10 -- --if-exists del-port br-ex vlan22 -- add-port br-ex vlan22 tag=22 -- set Interface vlan22 type=internal
Feb 01 07:18:04 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930284.0133] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23615 uid=0 result="success"
Feb 01 07:18:05 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930285.0725] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23643 uid=0 result="success"
Feb 01 07:18:05 np0005604215.novalocal NetworkManager[5972]: <info>  [1769930285.1184] audit: op="connections-load" args="/etc/sysconfig/network-scripts/ifcfg-vlan22" pid=23658 uid=0 result="success"
Feb 01 07:18:05 np0005604215.novalocal sudo[22397]: pam_unix(sudo:session): session closed for user root
Feb 01 07:18:30 np0005604215.novalocal python3[23690]: ansible-ansible.legacy.command Invoked with _raw_params=ip a
                                                       ping -c 2 -W 2 192.168.122.10
                                                       ping -c 2 -W 2 192.168.122.11
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-00000000001b-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:18:34 np0005604215.novalocal python3[23709]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 07:18:34 np0005604215.novalocal sudo[23723]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhtfkanmwcviqsumswfmoxcbynopzzog ; /usr/bin/python3
Feb 01 07:18:34 np0005604215.novalocal sudo[23723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:18:35 np0005604215.novalocal python3[23725]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 07:18:35 np0005604215.novalocal sudo[23723]: pam_unix(sudo:session): session closed for user root
Feb 01 07:18:36 np0005604215.novalocal python3[23739]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 07:18:36 np0005604215.novalocal sudo[23753]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-foilrbmmooljinickkzddimjndmmlugj ; /usr/bin/python3
Feb 01 07:18:36 np0005604215.novalocal sudo[23753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:18:37 np0005604215.novalocal python3[23755]: ansible-ansible.posix.authorized_key Invoked with user=root key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Feb 01 07:18:37 np0005604215.novalocal sudo[23753]: pam_unix(sudo:session): session closed for user root
Feb 01 07:18:37 np0005604215.novalocal python3[23769]: ansible-ansible.builtin.slurp Invoked with path=/etc/hostname src=/etc/hostname
Feb 01 07:18:38 np0005604215.novalocal python3[23784]: ansible-ansible.legacy.command Invoked with _raw_params=hostname="np0005604215.novalocal"
                                                       hostname_str_array=(${hostname//./ })
                                                       echo ${hostname_str_array[0]} > /home/zuul/ansible_hostname
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-000000000022-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:18:39 np0005604215.novalocal sudo[23802]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftvwfhjjofmtqrbtzbblidagkbyftsfk ; /usr/bin/python3
Feb 01 07:18:39 np0005604215.novalocal sudo[23802]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:18:39 np0005604215.novalocal python3[23804]: ansible-ansible.legacy.command Invoked with _raw_params=hostname=$(cat /home/zuul/ansible_hostname)
                                                       hostnamectl hostname "$hostname.localdomain"
                                                        _uses_shell=True zuul_log_id=fa163ef9-e89a-37fb-68e8-000000000023-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:18:39 np0005604215.novalocal systemd[1]: Starting Hostname Service...
Feb 01 07:18:39 np0005604215.novalocal systemd[1]: Started Hostname Service.
Feb 01 07:18:39 np0005604215.localdomain systemd-hostnamed[23808]: Hostname set to <np0005604215.localdomain> (static)
Feb 01 07:18:39 np0005604215.localdomain NetworkManager[5972]: <info>  [1769930319.5568] hostname: static hostname changed from "np0005604215.novalocal" to "np0005604215.localdomain"
Feb 01 07:18:39 np0005604215.localdomain systemd[1]: Starting Network Manager Script Dispatcher Service...
Feb 01 07:18:39 np0005604215.localdomain systemd[1]: Started Network Manager Script Dispatcher Service.
Feb 01 07:18:39 np0005604215.localdomain sudo[23802]: pam_unix(sudo:session): session closed for user root
Feb 01 07:18:40 np0005604215.localdomain sshd[19101]: pam_unix(sshd:session): session closed for user zuul
Feb 01 07:18:40 np0005604215.localdomain systemd[1]: session-10.scope: Deactivated successfully.
Feb 01 07:18:40 np0005604215.localdomain systemd[1]: session-10.scope: Consumed 1min 43.599s CPU time.
Feb 01 07:18:40 np0005604215.localdomain systemd-logind[761]: Session 10 logged out. Waiting for processes to exit.
Feb 01 07:18:40 np0005604215.localdomain systemd-logind[761]: Removed session 10.
Feb 01 07:18:43 np0005604215.localdomain sshd[23819]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:18:43 np0005604215.localdomain sshd[23819]: Accepted publickey for zuul from 38.102.83.114 port 55800 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 07:18:43 np0005604215.localdomain systemd-logind[761]: New session 11 of user zuul.
Feb 01 07:18:43 np0005604215.localdomain systemd[1]: Started Session 11 of User zuul.
Feb 01 07:18:43 np0005604215.localdomain sshd[23819]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 07:18:44 np0005604215.localdomain python3[23836]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Feb 01 07:18:45 np0005604215.localdomain sshd[23819]: pam_unix(sshd:session): session closed for user zuul
Feb 01 07:18:45 np0005604215.localdomain systemd[1]: session-11.scope: Deactivated successfully.
Feb 01 07:18:45 np0005604215.localdomain systemd-logind[761]: Session 11 logged out. Waiting for processes to exit.
Feb 01 07:18:45 np0005604215.localdomain systemd-logind[761]: Removed session 11.
Feb 01 07:18:49 np0005604215.localdomain systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Feb 01 07:19:09 np0005604215.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 01 07:19:37 np0005604215.localdomain sshd[23840]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:19:38 np0005604215.localdomain sshd[23840]: Accepted publickey for zuul from 38.102.83.114 port 43030 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 07:19:38 np0005604215.localdomain systemd-logind[761]: New session 12 of user zuul.
Feb 01 07:19:38 np0005604215.localdomain systemd[1]: Started Session 12 of User zuul.
Feb 01 07:19:38 np0005604215.localdomain sshd[23840]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 07:19:38 np0005604215.localdomain sudo[23857]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrabxgkbqpsdvhmntzlxciswjvhvifrj ; /usr/bin/python3
Feb 01 07:19:38 np0005604215.localdomain sudo[23857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:19:38 np0005604215.localdomain python3[23859]: ansible-ansible.legacy.dnf Invoked with name=['lvm2', 'jq'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:19:42 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:19:42 np0005604215.localdomain systemd-rc-local-generator[23901]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:19:42 np0005604215.localdomain systemd-sysv-generator[23906]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:19:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:19:42 np0005604215.localdomain systemd[1]: Listening on Device-mapper event daemon FIFOs.
Feb 01 07:19:42 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:19:42 np0005604215.localdomain systemd-rc-local-generator[23941]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:19:42 np0005604215.localdomain systemd-sysv-generator[23946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:19:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:19:42 np0005604215.localdomain systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Feb 01 07:19:42 np0005604215.localdomain systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Feb 01 07:19:42 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:19:42 np0005604215.localdomain systemd-rc-local-generator[23984]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:19:42 np0005604215.localdomain systemd-sysv-generator[23987]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:19:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:19:43 np0005604215.localdomain systemd[1]: Listening on LVM2 poll daemon socket.
Feb 01 07:19:43 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 07:19:43 np0005604215.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 01 07:19:43 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:19:43 np0005604215.localdomain systemd-rc-local-generator[24041]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:19:43 np0005604215.localdomain systemd-sysv-generator[24046]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:19:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:19:43 np0005604215.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 07:19:43 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 07:19:44 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 07:19:44 np0005604215.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 01 07:19:44 np0005604215.localdomain systemd[1]: run-ra9f44e288ab74c8fad21591d94c1d7d8.service: Deactivated successfully.
Feb 01 07:19:44 np0005604215.localdomain systemd[1]: run-r80b7093d53e24c90b03922db9ba7f157.service: Deactivated successfully.
Feb 01 07:19:44 np0005604215.localdomain sudo[23857]: pam_unix(sudo:session): session closed for user root
Feb 01 07:20:44 np0005604215.localdomain sshd[23843]: Received disconnect from 38.102.83.114 port 43030:11: disconnected by user
Feb 01 07:20:44 np0005604215.localdomain sshd[23843]: Disconnected from user zuul 38.102.83.114 port 43030
Feb 01 07:20:44 np0005604215.localdomain sshd[23840]: pam_unix(sshd:session): session closed for user zuul
Feb 01 07:20:44 np0005604215.localdomain systemd[1]: session-12.scope: Deactivated successfully.
Feb 01 07:20:44 np0005604215.localdomain systemd[1]: session-12.scope: Consumed 4.721s CPU time.
Feb 01 07:20:44 np0005604215.localdomain systemd-logind[761]: Session 12 logged out. Waiting for processes to exit.
Feb 01 07:20:44 np0005604215.localdomain systemd-logind[761]: Removed session 12.
Feb 01 07:21:36 np0005604215.localdomain sshd[24631]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:21:36 np0005604215.localdomain sshd[24631]: Invalid user  from 209.38.226.254 port 58984
Feb 01 07:21:44 np0005604215.localdomain sshd[24631]: Connection closed by invalid user  209.38.226.254 port 58984 [preauth]
Feb 01 07:36:01 np0005604215.localdomain sshd[24637]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:36:01 np0005604215.localdomain sshd[24637]: Accepted publickey for zuul from 192.168.122.100 port 41002 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 07:36:01 np0005604215.localdomain systemd-logind[761]: New session 13 of user zuul.
Feb 01 07:36:01 np0005604215.localdomain systemd[1]: Started Session 13 of User zuul.
Feb 01 07:36:01 np0005604215.localdomain sshd[24637]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 07:36:01 np0005604215.localdomain sudo[24683]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbcmmvcbkqdtrdumlnjpaofrfyjnursx ; /usr/bin/python3
Feb 01 07:36:01 np0005604215.localdomain sudo[24683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:01 np0005604215.localdomain python3[24685]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 07:36:02 np0005604215.localdomain sudo[24683]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:03 np0005604215.localdomain sudo[24770]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xbjnwsgimndjnzstgpfcpjfzxoppfvig ; /usr/bin/python3
Feb 01 07:36:03 np0005604215.localdomain sudo[24770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:03 np0005604215.localdomain python3[24772]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:36:06 np0005604215.localdomain sudo[24770]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:06 np0005604215.localdomain sudo[24787]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzgyuqblrdasstbiycpyohvwwdpecuhn ; /usr/bin/python3
Feb 01 07:36:06 np0005604215.localdomain sudo[24787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:06 np0005604215.localdomain python3[24789]: ansible-ansible.builtin.stat Invoked with path=/dev/loop3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:36:06 np0005604215.localdomain sudo[24787]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:07 np0005604215.localdomain sudo[24803]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngweruohsgkamumpjjpkjxmdeqlbyjpp ; /usr/bin/python3
Feb 01 07:36:07 np0005604215.localdomain sudo[24803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:07 np0005604215.localdomain python3[24805]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-0.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop3 /var/lib/ceph-osd-0.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:36:07 np0005604215.localdomain kernel: loop: module loaded
Feb 01 07:36:07 np0005604215.localdomain kernel: loop3: detected capacity change from 0 to 14680064
Feb 01 07:36:07 np0005604215.localdomain sudo[24803]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:07 np0005604215.localdomain sudo[24828]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwpllxjrkzstcmtsdhvzjgosqwhdagrq ; /usr/bin/python3
Feb 01 07:36:07 np0005604215.localdomain sudo[24828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:07 np0005604215.localdomain python3[24830]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop3
                                                         vgcreate ceph_vg0 /dev/loop3
                                                         lvcreate -n ceph_lv0 -l +100%FREE ceph_vg0
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:36:07 np0005604215.localdomain lvm[24833]: PV /dev/loop3 not used.
Feb 01 07:36:08 np0005604215.localdomain lvm[24835]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 01 07:36:08 np0005604215.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg0.
Feb 01 07:36:08 np0005604215.localdomain lvm[24844]:   1 logical volume(s) in volume group "ceph_vg0" now active
Feb 01 07:36:08 np0005604215.localdomain systemd[1]: lvm-activate-ceph_vg0.service: Deactivated successfully.
Feb 01 07:36:08 np0005604215.localdomain sudo[24828]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:08 np0005604215.localdomain sudo[24890]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gzartdzdgikioqlzdiwgmjpsrttaduxr ; /usr/bin/python3
Feb 01 07:36:08 np0005604215.localdomain sudo[24890]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:08 np0005604215.localdomain python3[24892]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-0.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:36:08 np0005604215.localdomain sudo[24890]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:09 np0005604215.localdomain sudo[24933]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjlwsaecyuudykubkzmyszqytsryzceo ; /usr/bin/python3
Feb 01 07:36:09 np0005604215.localdomain sudo[24933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:09 np0005604215.localdomain python3[24935]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769931368.4223719-54355-269898621379971/source dest=/etc/systemd/system/ceph-osd-losetup-0.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=427b1db064a970126b729b07acf99fa7d0eecb9c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:36:09 np0005604215.localdomain sudo[24933]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:09 np0005604215.localdomain sudo[24963]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-snfihawgmusrsuiaerkyefvolhogrraf ; /usr/bin/python3
Feb 01 07:36:09 np0005604215.localdomain sudo[24963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:10 np0005604215.localdomain python3[24965]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-0.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:36:10 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:36:10 np0005604215.localdomain systemd-rc-local-generator[24990]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:36:10 np0005604215.localdomain systemd-sysv-generator[24993]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:36:10 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:36:10 np0005604215.localdomain systemd[1]: Starting Ceph OSD losetup...
Feb 01 07:36:10 np0005604215.localdomain bash[25007]: /dev/loop3: [64516]:8399529 (/var/lib/ceph-osd-0.img)
Feb 01 07:36:10 np0005604215.localdomain systemd[1]: Finished Ceph OSD losetup.
Feb 01 07:36:10 np0005604215.localdomain lvm[25008]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 01 07:36:10 np0005604215.localdomain lvm[25008]: VG ceph_vg0 finished
Feb 01 07:36:10 np0005604215.localdomain sudo[24963]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:10 np0005604215.localdomain sudo[25022]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isuvqhmfrynacrjmnutjzsqzbnswtpjm ; /usr/bin/python3
Feb 01 07:36:10 np0005604215.localdomain sudo[25022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:10 np0005604215.localdomain python3[25024]: ansible-ansible.builtin.dnf Invoked with name=['util-linux', 'lvm2', 'jq', 'podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:36:13 np0005604215.localdomain sudo[25022]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:13 np0005604215.localdomain sudo[25039]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hchrsvrvfoljxcnbtasmkpavthqrogtd ; /usr/bin/python3
Feb 01 07:36:13 np0005604215.localdomain sudo[25039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:13 np0005604215.localdomain python3[25041]: ansible-ansible.builtin.stat Invoked with path=/dev/loop4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:36:13 np0005604215.localdomain sudo[25039]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:14 np0005604215.localdomain sudo[25055]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czzxofedynnllhnkzowjorfeyibvecju ; /usr/bin/python3
Feb 01 07:36:14 np0005604215.localdomain sudo[25055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:14 np0005604215.localdomain python3[25057]: ansible-ansible.legacy.command Invoked with _raw_params=dd if=/dev/zero of=/var/lib/ceph-osd-1.img bs=1 count=0 seek=7G
                                                         losetup /dev/loop4 /var/lib/ceph-osd-1.img
                                                         lsblk _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:36:14 np0005604215.localdomain kernel: loop4: detected capacity change from 0 to 14680064
Feb 01 07:36:14 np0005604215.localdomain sudo[25055]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:14 np0005604215.localdomain sudo[25077]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwvyaygfdywsymtfmexhikcvawijbnmn ; /usr/bin/python3
Feb 01 07:36:14 np0005604215.localdomain sudo[25077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:14 np0005604215.localdomain python3[25079]: ansible-ansible.legacy.command Invoked with _raw_params=pvcreate /dev/loop4
                                                         vgcreate ceph_vg1 /dev/loop4
                                                         lvcreate -n ceph_lv1 -l +100%FREE ceph_vg1
                                                         lvs _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:36:15 np0005604215.localdomain lvm[25082]: PV /dev/loop4 not used.
Feb 01 07:36:15 np0005604215.localdomain lvm[25092]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 01 07:36:15 np0005604215.localdomain systemd[1]: Started /usr/sbin/lvm vgchange -aay --autoactivation event ceph_vg1.
Feb 01 07:36:15 np0005604215.localdomain lvm[25094]:   1 logical volume(s) in volume group "ceph_vg1" now active
Feb 01 07:36:15 np0005604215.localdomain sudo[25077]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:15 np0005604215.localdomain systemd[1]: lvm-activate-ceph_vg1.service: Deactivated successfully.
Feb 01 07:36:15 np0005604215.localdomain sudo[25140]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtlpjrttltrsamsrrzqbpfopdurvvrgb ; /usr/bin/python3
Feb 01 07:36:15 np0005604215.localdomain sudo[25140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:15 np0005604215.localdomain python3[25142]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/ceph-osd-losetup-1.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:36:15 np0005604215.localdomain sudo[25140]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:15 np0005604215.localdomain sudo[25183]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmfgyxevaqhcracqezzgibvrlsqsvolo ; /usr/bin/python3
Feb 01 07:36:15 np0005604215.localdomain sudo[25183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:16 np0005604215.localdomain python3[25185]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769931375.4517767-54529-197489223700253/source dest=/etc/systemd/system/ceph-osd-losetup-1.service mode=0644 force=True follow=False _original_basename=ceph-osd-losetup.service.j2 checksum=19612168ea279db4171b94ee1f8625de1ec44b58 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:36:16 np0005604215.localdomain sudo[25183]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:16 np0005604215.localdomain sudo[25213]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfwrlzaehoyckxbqtxajztfzoeijkpek ; /usr/bin/python3
Feb 01 07:36:16 np0005604215.localdomain sudo[25213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:16 np0005604215.localdomain python3[25215]: ansible-ansible.builtin.systemd Invoked with state=started enabled=True name=ceph-osd-losetup-1.service daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:36:17 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:36:17 np0005604215.localdomain systemd-sysv-generator[25247]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:36:17 np0005604215.localdomain systemd-rc-local-generator[25243]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:36:17 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:36:18 np0005604215.localdomain systemd[1]: Starting Ceph OSD losetup...
Feb 01 07:36:18 np0005604215.localdomain bash[25256]: /dev/loop4: [64516]:9171997 (/var/lib/ceph-osd-1.img)
Feb 01 07:36:18 np0005604215.localdomain systemd[1]: Finished Ceph OSD losetup.
Feb 01 07:36:18 np0005604215.localdomain sudo[25213]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:18 np0005604215.localdomain lvm[25257]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 01 07:36:18 np0005604215.localdomain lvm[25257]: VG ceph_vg1 finished
Feb 01 07:36:26 np0005604215.localdomain sudo[25300]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpktgwgoqtvohivlwaxhnunyznummswu ; /usr/bin/python3
Feb 01 07:36:26 np0005604215.localdomain sudo[25300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:26 np0005604215.localdomain python3[25302]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 07:36:26 np0005604215.localdomain sudo[25300]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:27 np0005604215.localdomain sudo[25320]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntorkngnkirilemyuvbyafupidvrxhmn ; /usr/bin/python3
Feb 01 07:36:27 np0005604215.localdomain sudo[25320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:27 np0005604215.localdomain python3[25322]: ansible-hostname Invoked with name=np0005604215.localdomain use=None
Feb 01 07:36:27 np0005604215.localdomain systemd[1]: Starting Hostname Service...
Feb 01 07:36:27 np0005604215.localdomain systemd[1]: Started Hostname Service.
Feb 01 07:36:27 np0005604215.localdomain sudo[25320]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:29 np0005604215.localdomain sudo[25343]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecgnrrrsyxgqaushtwnhzmiowwwnwwwm ; /usr/bin/python3
Feb 01 07:36:29 np0005604215.localdomain sudo[25343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:29 np0005604215.localdomain python3[25345]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Feb 01 07:36:29 np0005604215.localdomain sudo[25343]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:30 np0005604215.localdomain sudo[25391]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upjqezrhludnzlzigbdtdoolrklfppum ; /usr/bin/python3
Feb 01 07:36:30 np0005604215.localdomain sudo[25391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:30 np0005604215.localdomain python3[25393]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.6b1777aptmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:36:30 np0005604215.localdomain sudo[25391]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:30 np0005604215.localdomain sudo[25421]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynpewaqfaqxdmbfrycmhvgwtjgdhytxx ; /usr/bin/python3
Feb 01 07:36:30 np0005604215.localdomain sudo[25421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:30 np0005604215.localdomain python3[25423]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.6b1777aptmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:36:30 np0005604215.localdomain sudo[25421]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:31 np0005604215.localdomain sudo[25437]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yujrgppngzkwjcasxbblvteylouakdqh ; /usr/bin/python3
Feb 01 07:36:31 np0005604215.localdomain sudo[25437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:31 np0005604215.localdomain python3[25439]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.6b1777aptmphosts insertbefore=BOF block=192.168.122.106 np0005604212.localdomain np0005604212
                                                         192.168.122.106 np0005604212.ctlplane.localdomain np0005604212.ctlplane
                                                         192.168.122.107 np0005604213.localdomain np0005604213
                                                         192.168.122.107 np0005604213.ctlplane.localdomain np0005604213.ctlplane
                                                         192.168.122.108 np0005604215.localdomain np0005604215
                                                         192.168.122.108 np0005604215.ctlplane.localdomain np0005604215.ctlplane
                                                         192.168.122.103 np0005604209.localdomain np0005604209
                                                         192.168.122.103 np0005604209.ctlplane.localdomain np0005604209.ctlplane
                                                         192.168.122.104 np0005604210.localdomain np0005604210
                                                         192.168.122.104 np0005604210.ctlplane.localdomain np0005604210.ctlplane
                                                         192.168.122.105 np0005604211.localdomain np0005604211
                                                         192.168.122.105 np0005604211.ctlplane.localdomain np0005604211.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:36:31 np0005604215.localdomain sudo[25437]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:31 np0005604215.localdomain sudo[25453]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scwkcwbumeicsthktfiybtdbwkcswpdz ; /usr/bin/python3
Feb 01 07:36:31 np0005604215.localdomain sudo[25453]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:32 np0005604215.localdomain python3[25455]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.6b1777aptmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:36:32 np0005604215.localdomain sudo[25453]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:32 np0005604215.localdomain sudo[25470]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ypybchphgfazccztavcngolcxykcstgv ; /usr/bin/python3
Feb 01 07:36:32 np0005604215.localdomain sudo[25470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:32 np0005604215.localdomain python3[25472]: ansible-file Invoked with path=/tmp/ansible.6b1777aptmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:36:32 np0005604215.localdomain sudo[25470]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:34 np0005604215.localdomain sudo[25486]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eislixpccashckxujqwcmzsrcdyajjss ; /usr/bin/python3
Feb 01 07:36:34 np0005604215.localdomain sudo[25486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:34 np0005604215.localdomain python3[25488]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:36:34 np0005604215.localdomain sudo[25486]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:35 np0005604215.localdomain sudo[25504]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjonxjhaxiazpotqajzkechczqoqjarm ; /usr/bin/python3
Feb 01 07:36:35 np0005604215.localdomain sudo[25504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:35 np0005604215.localdomain python3[25506]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:36:38 np0005604215.localdomain sudo[25504]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:39 np0005604215.localdomain sudo[25553]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuuvoofyyitbyamsgxsgydrqbefxpdqj ; /usr/bin/python3
Feb 01 07:36:39 np0005604215.localdomain sudo[25553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:39 np0005604215.localdomain python3[25555]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:36:39 np0005604215.localdomain sudo[25553]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:40 np0005604215.localdomain sudo[25598]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-neemtkdiylrnlpevpfyposctflwaknkx ; /usr/bin/python3
Feb 01 07:36:40 np0005604215.localdomain sudo[25598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:40 np0005604215.localdomain python3[25600]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769931399.478967-55383-266207153660309/source dest=/etc/chrony.conf owner=root group=root mode=420 follow=False _original_basename=chrony.conf.j2 checksum=4fd4fbbb2de00c70a54478b7feb8ef8adf6a3362 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:36:40 np0005604215.localdomain sudo[25598]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:41 np0005604215.localdomain sudo[25628]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xilbsaskrzsgtbdwobfvubowcwbnhypt ; /usr/bin/python3
Feb 01 07:36:41 np0005604215.localdomain sudo[25628]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:41 np0005604215.localdomain python3[25630]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:36:41 np0005604215.localdomain sudo[25628]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:42 np0005604215.localdomain sudo[25646]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yohjiyhjiqpvnotaqvhlpqecuempocyw ; /usr/bin/python3
Feb 01 07:36:42 np0005604215.localdomain sudo[25646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:43 np0005604215.localdomain python3[25648]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 07:36:43 np0005604215.localdomain chronyd[767]: chronyd exiting
Feb 01 07:36:43 np0005604215.localdomain systemd[1]: Stopping NTP client/server...
Feb 01 07:36:43 np0005604215.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Feb 01 07:36:43 np0005604215.localdomain systemd[1]: Stopped NTP client/server.
Feb 01 07:36:43 np0005604215.localdomain systemd[1]: chronyd.service: Consumed 80ms CPU time, read 1.9M from disk, written 0B to disk.
Feb 01 07:36:43 np0005604215.localdomain systemd[1]: Starting NTP client/server...
Feb 01 07:36:43 np0005604215.localdomain chronyd[25656]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 01 07:36:43 np0005604215.localdomain chronyd[25656]: Frequency -30.640 +/- 0.062 ppm read from /var/lib/chrony/drift
Feb 01 07:36:43 np0005604215.localdomain chronyd[25656]: Loaded seccomp filter (level 2)
Feb 01 07:36:43 np0005604215.localdomain systemd[1]: Started NTP client/server.
Feb 01 07:36:43 np0005604215.localdomain sudo[25646]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:44 np0005604215.localdomain sudo[25703]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkgiyqemahrogmebheekunryenxlaolg ; /usr/bin/python3
Feb 01 07:36:44 np0005604215.localdomain sudo[25703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:45 np0005604215.localdomain python3[25705]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:36:45 np0005604215.localdomain sudo[25703]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:45 np0005604215.localdomain sudo[25746]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wunuiupudtdaikmpecqtkztyrbmypqtg ; /usr/bin/python3
Feb 01 07:36:45 np0005604215.localdomain sudo[25746]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:45 np0005604215.localdomain python3[25748]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769931404.7074661-55614-206577621577627/source dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service follow=False checksum=d4d85e046d61f558ac7ec8178c6d529d893e81e1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:36:45 np0005604215.localdomain sudo[25746]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:45 np0005604215.localdomain sudo[25776]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kiaozkxhacltahzdstbbfnrfchwpmhbu ; /usr/bin/python3
Feb 01 07:36:45 np0005604215.localdomain sudo[25776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:46 np0005604215.localdomain python3[25778]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:36:46 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:36:46 np0005604215.localdomain systemd-rc-local-generator[25801]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:36:46 np0005604215.localdomain systemd-sysv-generator[25806]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:36:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:36:46 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:36:46 np0005604215.localdomain systemd-sysv-generator[25844]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:36:46 np0005604215.localdomain systemd-rc-local-generator[25841]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:36:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:36:46 np0005604215.localdomain systemd[1]: Starting chronyd online sources service...
Feb 01 07:36:46 np0005604215.localdomain chronyc[25853]: 200 OK
Feb 01 07:36:46 np0005604215.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Feb 01 07:36:46 np0005604215.localdomain systemd[1]: Finished chronyd online sources service.
Feb 01 07:36:46 np0005604215.localdomain sudo[25776]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:47 np0005604215.localdomain sudo[25867]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uswoszihikwkapwncjkrbvtsxkcthfza ; /usr/bin/python3
Feb 01 07:36:47 np0005604215.localdomain sudo[25867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:47 np0005604215.localdomain python3[25869]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:36:47 np0005604215.localdomain chronyd[25656]: System clock was stepped by 0.000000 seconds
Feb 01 07:36:47 np0005604215.localdomain sudo[25867]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:47 np0005604215.localdomain sudo[25884]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uaoktuipkdyscwxigoexrswiybglpkkw ; /usr/bin/python3
Feb 01 07:36:47 np0005604215.localdomain sudo[25884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:47 np0005604215.localdomain chronyd[25656]: Selected source 216.197.156.83 (pool.ntp.org)
Feb 01 07:36:47 np0005604215.localdomain python3[25886]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:36:47 np0005604215.localdomain sudo[25884]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:57 np0005604215.localdomain systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Feb 01 07:36:58 np0005604215.localdomain sudo[25904]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozdshcpngbkxklypkamgsedgghdxoduz ; /usr/bin/python3
Feb 01 07:36:58 np0005604215.localdomain sudo[25904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:58 np0005604215.localdomain python3[25906]: ansible-timezone Invoked with name=UTC hwclock=None
Feb 01 07:36:58 np0005604215.localdomain systemd[1]: Starting Time & Date Service...
Feb 01 07:36:58 np0005604215.localdomain systemd[1]: Started Time & Date Service.
Feb 01 07:36:58 np0005604215.localdomain sudo[25904]: pam_unix(sudo:session): session closed for user root
Feb 01 07:36:58 np0005604215.localdomain sudo[25924]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihdqanwemvvzlzvcdzeptrvamigllcgv ; /usr/bin/python3
Feb 01 07:36:58 np0005604215.localdomain sudo[25924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:36:59 np0005604215.localdomain python3[25926]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 07:36:59 np0005604215.localdomain systemd[1]: Stopping NTP client/server...
Feb 01 07:36:59 np0005604215.localdomain chronyd[25656]: chronyd exiting
Feb 01 07:36:59 np0005604215.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Feb 01 07:36:59 np0005604215.localdomain systemd[1]: Stopped NTP client/server.
Feb 01 07:36:59 np0005604215.localdomain systemd[1]: Starting NTP client/server...
Feb 01 07:36:59 np0005604215.localdomain chronyd[25933]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 01 07:36:59 np0005604215.localdomain chronyd[25933]: Frequency -30.640 +/- 0.062 ppm read from /var/lib/chrony/drift
Feb 01 07:36:59 np0005604215.localdomain chronyd[25933]: Loaded seccomp filter (level 2)
Feb 01 07:36:59 np0005604215.localdomain systemd[1]: Started NTP client/server.
Feb 01 07:36:59 np0005604215.localdomain sudo[25924]: pam_unix(sudo:session): session closed for user root
Feb 01 07:37:03 np0005604215.localdomain chronyd[25933]: Selected source 216.197.156.83 (pool.ntp.org)
Feb 01 07:37:15 np0005604215.localdomain sudo[25948]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bylhhsaxwwyiiwbczcvqpazcewjowstu ; /usr/bin/python3
Feb 01 07:37:15 np0005604215.localdomain sudo[25948]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:37:15 np0005604215.localdomain useradd[25952]: new group: name=ceph-admin, GID=1002
Feb 01 07:37:15 np0005604215.localdomain useradd[25952]: new user: name=ceph-admin, UID=1002, GID=1002, home=/home/ceph-admin, shell=/bin/bash, from=none
Feb 01 07:37:15 np0005604215.localdomain sudo[25948]: pam_unix(sudo:session): session closed for user root
Feb 01 07:37:15 np0005604215.localdomain sudo[26004]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iobteomplvcwryddqdhtnbdyjeasajfm ; /usr/bin/python3
Feb 01 07:37:15 np0005604215.localdomain sudo[26004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:37:15 np0005604215.localdomain sudo[26004]: pam_unix(sudo:session): session closed for user root
Feb 01 07:37:16 np0005604215.localdomain sudo[26047]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xafpushtjztpjvnughdocayblwytieqw ; /usr/bin/python3
Feb 01 07:37:16 np0005604215.localdomain sudo[26047]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:37:16 np0005604215.localdomain sudo[26047]: pam_unix(sudo:session): session closed for user root
Feb 01 07:37:16 np0005604215.localdomain sudo[26077]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cndupgqdzleqdghafziiucohlisbpmle ; /usr/bin/python3
Feb 01 07:37:16 np0005604215.localdomain sudo[26077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:37:16 np0005604215.localdomain sudo[26077]: pam_unix(sudo:session): session closed for user root
Feb 01 07:37:17 np0005604215.localdomain sudo[26093]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krjmxjbnolbpwotttaooabwmmwsuauiw ; /usr/bin/python3
Feb 01 07:37:17 np0005604215.localdomain sudo[26093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:37:17 np0005604215.localdomain sudo[26093]: pam_unix(sudo:session): session closed for user root
Feb 01 07:37:17 np0005604215.localdomain sudo[26109]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oszxingpenzrkuxjhpxskaiaplthyuxy ; /usr/bin/python3
Feb 01 07:37:17 np0005604215.localdomain sudo[26109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:37:17 np0005604215.localdomain sudo[26109]: pam_unix(sudo:session): session closed for user root
Feb 01 07:37:18 np0005604215.localdomain sudo[26125]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfytpdeznfyanmuynhlxdqrbesgivaxy ; /usr/bin/python3
Feb 01 07:37:18 np0005604215.localdomain sudo[26125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:37:18 np0005604215.localdomain sudo[26125]: pam_unix(sudo:session): session closed for user root
Feb 01 07:37:28 np0005604215.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 01 07:39:07 np0005604215.localdomain sshd[26130]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:07 np0005604215.localdomain sshd[26130]: Accepted publickey for ceph-admin from 192.168.122.103 port 56690 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:07 np0005604215.localdomain systemd-logind[761]: New session 14 of user ceph-admin.
Feb 01 07:39:07 np0005604215.localdomain systemd[1]: Created slice User Slice of UID 1002.
Feb 01 07:39:07 np0005604215.localdomain systemd[1]: Starting User Runtime Directory /run/user/1002...
Feb 01 07:39:07 np0005604215.localdomain systemd[1]: Finished User Runtime Directory /run/user/1002.
Feb 01 07:39:07 np0005604215.localdomain systemd[1]: Starting User Manager for UID 1002...
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: pam_unix(systemd-user:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:07 np0005604215.localdomain sshd[26148]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Queued start job for default target Main User Target.
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Created slice User Application Slice.
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Started Daily Cleanup of User's Temporary Directories.
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Reached target Paths.
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Reached target Timers.
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Starting D-Bus User Message Bus Socket...
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Starting Create User's Volatile Files and Directories...
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Finished Create User's Volatile Files and Directories.
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Listening on D-Bus User Message Bus Socket.
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Reached target Sockets.
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Reached target Basic System.
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Reached target Main User Target.
Feb 01 07:39:07 np0005604215.localdomain systemd[26134]: Startup finished in 113ms.
Feb 01 07:39:07 np0005604215.localdomain systemd[1]: Started User Manager for UID 1002.
Feb 01 07:39:07 np0005604215.localdomain systemd[1]: Started Session 14 of User ceph-admin.
Feb 01 07:39:07 np0005604215.localdomain sshd[26130]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:07 np0005604215.localdomain sshd[26148]: Accepted publickey for ceph-admin from 192.168.122.103 port 56696 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:07 np0005604215.localdomain systemd-logind[761]: New session 16 of user ceph-admin.
Feb 01 07:39:07 np0005604215.localdomain systemd[1]: Started Session 16 of User ceph-admin.
Feb 01 07:39:07 np0005604215.localdomain sshd[26148]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:07 np0005604215.localdomain sudo[26155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:39:07 np0005604215.localdomain sudo[26155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:07 np0005604215.localdomain sudo[26155]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:07 np0005604215.localdomain sshd[26170]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:07 np0005604215.localdomain sshd[26170]: Accepted publickey for ceph-admin from 192.168.122.103 port 56704 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:07 np0005604215.localdomain systemd-logind[761]: New session 17 of user ceph-admin.
Feb 01 07:39:07 np0005604215.localdomain systemd[1]: Started Session 17 of User ceph-admin.
Feb 01 07:39:07 np0005604215.localdomain sshd[26170]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:07 np0005604215.localdomain sudo[26174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005604215.localdomain
Feb 01 07:39:07 np0005604215.localdomain sudo[26174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:07 np0005604215.localdomain sudo[26174]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:08 np0005604215.localdomain sshd[26189]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:08 np0005604215.localdomain sshd[26189]: Accepted publickey for ceph-admin from 192.168.122.103 port 56718 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:08 np0005604215.localdomain systemd-logind[761]: New session 18 of user ceph-admin.
Feb 01 07:39:08 np0005604215.localdomain systemd[1]: Started Session 18 of User ceph-admin.
Feb 01 07:39:08 np0005604215.localdomain sshd[26189]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:08 np0005604215.localdomain sudo[26193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Feb 01 07:39:08 np0005604215.localdomain sudo[26193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:08 np0005604215.localdomain sudo[26193]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:08 np0005604215.localdomain sshd[26208]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:08 np0005604215.localdomain sshd[26208]: Accepted publickey for ceph-admin from 192.168.122.103 port 56720 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:08 np0005604215.localdomain systemd-logind[761]: New session 19 of user ceph-admin.
Feb 01 07:39:08 np0005604215.localdomain systemd[1]: Started Session 19 of User ceph-admin.
Feb 01 07:39:08 np0005604215.localdomain sshd[26208]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:08 np0005604215.localdomain sudo[26212]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 07:39:08 np0005604215.localdomain sudo[26212]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:08 np0005604215.localdomain sudo[26212]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:08 np0005604215.localdomain sshd[26227]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:08 np0005604215.localdomain sshd[26227]: Accepted publickey for ceph-admin from 192.168.122.103 port 56736 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:08 np0005604215.localdomain systemd-logind[761]: New session 20 of user ceph-admin.
Feb 01 07:39:08 np0005604215.localdomain systemd[1]: Started Session 20 of User ceph-admin.
Feb 01 07:39:08 np0005604215.localdomain sshd[26227]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:09 np0005604215.localdomain sudo[26231]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 07:39:09 np0005604215.localdomain sudo[26231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:09 np0005604215.localdomain sudo[26231]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:09 np0005604215.localdomain sshd[26246]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:09 np0005604215.localdomain sshd[26246]: Accepted publickey for ceph-admin from 192.168.122.103 port 56744 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:09 np0005604215.localdomain systemd-logind[761]: New session 21 of user ceph-admin.
Feb 01 07:39:09 np0005604215.localdomain systemd[1]: Started Session 21 of User ceph-admin.
Feb 01 07:39:09 np0005604215.localdomain sshd[26246]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:09 np0005604215.localdomain sudo[26250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Feb 01 07:39:09 np0005604215.localdomain sudo[26250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:09 np0005604215.localdomain sudo[26250]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:09 np0005604215.localdomain sshd[26265]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:09 np0005604215.localdomain sshd[26265]: Accepted publickey for ceph-admin from 192.168.122.103 port 56758 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:09 np0005604215.localdomain systemd-logind[761]: New session 22 of user ceph-admin.
Feb 01 07:39:09 np0005604215.localdomain systemd[1]: Started Session 22 of User ceph-admin.
Feb 01 07:39:09 np0005604215.localdomain sshd[26265]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:09 np0005604215.localdomain sudo[26269]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 07:39:09 np0005604215.localdomain sudo[26269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:09 np0005604215.localdomain sudo[26269]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:10 np0005604215.localdomain sshd[26284]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:10 np0005604215.localdomain sshd[26284]: Accepted publickey for ceph-admin from 192.168.122.103 port 56762 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:10 np0005604215.localdomain systemd-logind[761]: New session 23 of user ceph-admin.
Feb 01 07:39:10 np0005604215.localdomain systemd[1]: Started Session 23 of User ceph-admin.
Feb 01 07:39:10 np0005604215.localdomain sshd[26284]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:10 np0005604215.localdomain sudo[26288]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new
Feb 01 07:39:10 np0005604215.localdomain sudo[26288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:10 np0005604215.localdomain sudo[26288]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:10 np0005604215.localdomain sshd[26303]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:10 np0005604215.localdomain sshd[26303]: Accepted publickey for ceph-admin from 192.168.122.103 port 56772 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:10 np0005604215.localdomain systemd-logind[761]: New session 24 of user ceph-admin.
Feb 01 07:39:10 np0005604215.localdomain systemd[1]: Started Session 24 of User ceph-admin.
Feb 01 07:39:10 np0005604215.localdomain sshd[26303]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:11 np0005604215.localdomain sshd[26320]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:11 np0005604215.localdomain sshd[26320]: Accepted publickey for ceph-admin from 192.168.122.103 port 56780 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:11 np0005604215.localdomain systemd-logind[761]: New session 25 of user ceph-admin.
Feb 01 07:39:11 np0005604215.localdomain systemd[1]: Started Session 25 of User ceph-admin.
Feb 01 07:39:11 np0005604215.localdomain sshd[26320]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:11 np0005604215.localdomain sudo[26324]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3
Feb 01 07:39:11 np0005604215.localdomain sudo[26324]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:11 np0005604215.localdomain sudo[26324]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:11 np0005604215.localdomain sshd[26339]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:39:11 np0005604215.localdomain sshd[26339]: Accepted publickey for ceph-admin from 192.168.122.103 port 56782 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 07:39:11 np0005604215.localdomain systemd-logind[761]: New session 26 of user ceph-admin.
Feb 01 07:39:11 np0005604215.localdomain systemd[1]: Started Session 26 of User ceph-admin.
Feb 01 07:39:11 np0005604215.localdomain sshd[26339]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 07:39:11 np0005604215.localdomain sudo[26343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host --expect-hostname np0005604215.localdomain
Feb 01 07:39:11 np0005604215.localdomain sudo[26343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:11 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:39:12 np0005604215.localdomain sudo[26343]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:25 np0005604215.localdomain sudo[26380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:39:25 np0005604215.localdomain sudo[26380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:25 np0005604215.localdomain sudo[26380]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:25 np0005604215.localdomain sudo[26395]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:39:25 np0005604215.localdomain sudo[26395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:25 np0005604215.localdomain sudo[26395]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:25 np0005604215.localdomain sudo[26410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 07:39:25 np0005604215.localdomain sudo[26410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:39:26 np0005604215.localdomain sudo[26410]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:26 np0005604215.localdomain sudo[26445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:39:26 np0005604215.localdomain sudo[26445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:26 np0005604215.localdomain sudo[26445]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:26 np0005604215.localdomain sudo[26460]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 07:39:26 np0005604215.localdomain sudo[26460]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:39:26 np0005604215.localdomain sudo[26460]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:26 np0005604215.localdomain sudo[26511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:39:26 np0005604215.localdomain sudo[26511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:26 np0005604215.localdomain sudo[26511]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:26 np0005604215.localdomain sudo[26526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:39:26 np0005604215.localdomain sudo[26526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:27 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:39:27 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:39:27 np0005604215.localdomain systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 26553 (sysctl)
Feb 01 07:39:27 np0005604215.localdomain systemd[1]: Mounting Arbitrary Executable File Formats File System...
Feb 01 07:39:27 np0005604215.localdomain systemd[1]: Mounted Arbitrary Executable File Formats File System.
Feb 01 07:39:27 np0005604215.localdomain sudo[26526]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:27 np0005604215.localdomain sudo[26576]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:39:27 np0005604215.localdomain sudo[26576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:27 np0005604215.localdomain sudo[26576]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:27 np0005604215.localdomain sudo[26591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 07:39:27 np0005604215.localdomain sudo[26591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:28 np0005604215.localdomain sudo[26591]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:39:28 np0005604215.localdomain sudo[26624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:39:28 np0005604215.localdomain sudo[26624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:28 np0005604215.localdomain sudo[26624]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:28 np0005604215.localdomain sudo[26639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e -- inventory --format=json-pretty --filter-for-batch
Feb 01 07:39:28 np0005604215.localdomain sudo[26639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:39:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:39:31 np0005604215.localdomain kernel: VFS: idmapped mount is not enabled.
Feb 01 07:39:49 np0005604215.localdomain podman[26692]: 2026-02-01 07:39:28.801276696 +0000 UTC m=+0.040513841 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:39:49 np0005604215.localdomain podman[26692]: 
Feb 01 07:39:50 np0005604215.localdomain podman[26692]: 2026-02-01 07:39:50.432319543 +0000 UTC m=+21.671556608 container create 26cd8696c19d2fe4af77843d622f2c78f40c90e03746036000841490c3881723 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_austin, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, name=rhceph, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, description=Red Hat Ceph Storage 7)
Feb 01 07:39:50 np0005604215.localdomain systemd[1]: Created slice Slice /machine.
Feb 01 07:39:50 np0005604215.localdomain systemd[1]: Started libpod-conmon-26cd8696c19d2fe4af77843d622f2c78f40c90e03746036000841490c3881723.scope.
Feb 01 07:39:50 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:39:50 np0005604215.localdomain podman[26692]: 2026-02-01 07:39:50.548509657 +0000 UTC m=+21.787746722 container init 26cd8696c19d2fe4af77843d622f2c78f40c90e03746036000841490c3881723 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_austin, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, description=Red Hat Ceph Storage 7, release=1764794109, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 07:39:50 np0005604215.localdomain systemd[1]: tmp-crun.4WsTGQ.mount: Deactivated successfully.
Feb 01 07:39:50 np0005604215.localdomain podman[26692]: 2026-02-01 07:39:50.562202303 +0000 UTC m=+21.801439368 container start 26cd8696c19d2fe4af77843d622f2c78f40c90e03746036000841490c3881723 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_austin, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, name=rhceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.buildah.version=1.41.4, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, vcs-type=git, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 07:39:50 np0005604215.localdomain podman[26692]: 2026-02-01 07:39:50.562564264 +0000 UTC m=+21.801801369 container attach 26cd8696c19d2fe4af77843d622f2c78f40c90e03746036000841490c3881723 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_austin, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., io.openshift.expose-services=, RELEASE=main, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True)
Feb 01 07:39:50 np0005604215.localdomain systemd[1]: libpod-26cd8696c19d2fe4af77843d622f2c78f40c90e03746036000841490c3881723.scope: Deactivated successfully.
Feb 01 07:39:50 np0005604215.localdomain admiring_austin[26788]: 167 167
Feb 01 07:39:50 np0005604215.localdomain podman[26692]: 2026-02-01 07:39:50.567275151 +0000 UTC m=+21.806512216 container died 26cd8696c19d2fe4af77843d622f2c78f40c90e03746036000841490c3881723 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_austin, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., release=1764794109, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.openshift.tags=rhceph ceph)
Feb 01 07:39:50 np0005604215.localdomain podman[26793]: 2026-02-01 07:39:50.650408556 +0000 UTC m=+0.074104505 container remove 26cd8696c19d2fe4af77843d622f2c78f40c90e03746036000841490c3881723 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=admiring_austin, com.redhat.component=rhceph-container, ceph=True, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, GIT_CLEAN=True, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 07:39:50 np0005604215.localdomain systemd[1]: libpod-conmon-26cd8696c19d2fe4af77843d622f2c78f40c90e03746036000841490c3881723.scope: Deactivated successfully.
Feb 01 07:39:50 np0005604215.localdomain podman[26815]: 
Feb 01 07:39:50 np0005604215.localdomain podman[26815]: 2026-02-01 07:39:50.87171003 +0000 UTC m=+0.071327809 container create 9c8d3ecf4b7691ff3f6e17386e9b700fc03ce3ac68bc634818365f1c2bcf4c0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_noether, RELEASE=main, GIT_CLEAN=True, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, distribution-scope=public, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4)
Feb 01 07:39:50 np0005604215.localdomain systemd[1]: Started libpod-conmon-9c8d3ecf4b7691ff3f6e17386e9b700fc03ce3ac68bc634818365f1c2bcf4c0e.scope.
Feb 01 07:39:50 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:39:50 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fec8d8a73a794d0b0aed5d06b9a5c8b7e21c597c0eef0f1ee2709d3c9c171f6/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 07:39:50 np0005604215.localdomain podman[26815]: 2026-02-01 07:39:50.845290549 +0000 UTC m=+0.044908328 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:39:50 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fec8d8a73a794d0b0aed5d06b9a5c8b7e21c597c0eef0f1ee2709d3c9c171f6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:39:50 np0005604215.localdomain podman[26815]: 2026-02-01 07:39:50.96553987 +0000 UTC m=+0.165157669 container init 9c8d3ecf4b7691ff3f6e17386e9b700fc03ce3ac68bc634818365f1c2bcf4c0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_noether, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2025-12-08T17:28:53Z, architecture=x86_64, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 07:39:50 np0005604215.localdomain podman[26815]: 2026-02-01 07:39:50.976825051 +0000 UTC m=+0.176442840 container start 9c8d3ecf4b7691ff3f6e17386e9b700fc03ce3ac68bc634818365f1c2bcf4c0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_noether, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vcs-type=git, release=1764794109, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, name=rhceph)
Feb 01 07:39:50 np0005604215.localdomain podman[26815]: 2026-02-01 07:39:50.977104949 +0000 UTC m=+0.176722778 container attach 9c8d3ecf4b7691ff3f6e17386e9b700fc03ce3ac68bc634818365f1c2bcf4c0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_noether, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-type=git, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, release=1764794109)
Feb 01 07:39:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-7503a611394e34ea27df147f3929c32dcbe9bb686ca920a835ff327e8ebae175-merged.mount: Deactivated successfully.
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]: [
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:     {
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:         "available": false,
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:         "ceph_device": false,
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:         "lsm_data": {},
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:         "lvs": [],
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:         "path": "/dev/sr0",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:         "rejected_reasons": [
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "Insufficient space (<5GB)",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "Has a FileSystem"
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:         ],
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:         "sys_api": {
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "actuators": null,
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "device_nodes": "sr0",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "human_readable_size": "482.00 KB",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "id_bus": "ata",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "model": "QEMU DVD-ROM",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "nr_requests": "2",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "partitions": {},
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "path": "/dev/sr0",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "removable": "1",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "rev": "2.5+",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "ro": "0",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "rotational": "1",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "sas_address": "",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "sas_device_handle": "",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "scheduler_mode": "mq-deadline",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "sectors": 0,
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "sectorsize": "2048",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "size": 493568.0,
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "support_discard": "0",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "type": "disk",
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:             "vendor": "QEMU"
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:         }
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]:     }
Feb 01 07:39:51 np0005604215.localdomain flamboyant_noether[26831]: ]
Feb 01 07:39:51 np0005604215.localdomain systemd[1]: libpod-9c8d3ecf4b7691ff3f6e17386e9b700fc03ce3ac68bc634818365f1c2bcf4c0e.scope: Deactivated successfully.
Feb 01 07:39:51 np0005604215.localdomain podman[26815]: 2026-02-01 07:39:51.758557548 +0000 UTC m=+0.958175337 container died 9c8d3ecf4b7691ff3f6e17386e9b700fc03ce3ac68bc634818365f1c2bcf4c0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_noether, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., ceph=True, description=Red Hat Ceph Storage 7, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, architecture=x86_64, name=rhceph, version=7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 07:39:51 np0005604215.localdomain systemd[1]: tmp-crun.q1Wjcr.mount: Deactivated successfully.
Feb 01 07:39:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4fec8d8a73a794d0b0aed5d06b9a5c8b7e21c597c0eef0f1ee2709d3c9c171f6-merged.mount: Deactivated successfully.
Feb 01 07:39:51 np0005604215.localdomain podman[27960]: 2026-02-01 07:39:51.85861105 +0000 UTC m=+0.085285514 container remove 9c8d3ecf4b7691ff3f6e17386e9b700fc03ce3ac68bc634818365f1c2bcf4c0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_noether, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, name=rhceph, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, release=1764794109, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 07:39:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:39:51 np0005604215.localdomain systemd[1]: libpod-conmon-9c8d3ecf4b7691ff3f6e17386e9b700fc03ce3ac68bc634818365f1c2bcf4c0e.scope: Deactivated successfully.
Feb 01 07:39:51 np0005604215.localdomain sudo[26639]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:52 np0005604215.localdomain sudo[27975]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:39:52 np0005604215.localdomain sudo[27975]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:52 np0005604215.localdomain sudo[27975]: pam_unix(sudo:session): session closed for user root
Feb 01 07:39:52 np0005604215.localdomain sudo[27990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 _orch set-coredump-overrides --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e --coredump-max-size=32G
Feb 01 07:39:52 np0005604215.localdomain sudo[27990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:39:52 np0005604215.localdomain systemd[1]: systemd-coredump.socket: Deactivated successfully.
Feb 01 07:39:52 np0005604215.localdomain systemd[1]: Closed Process Core Dump Socket.
Feb 01 07:39:52 np0005604215.localdomain systemd[1]: Stopping Process Core Dump Socket...
Feb 01 07:39:52 np0005604215.localdomain systemd[1]: Listening on Process Core Dump Socket.
Feb 01 07:39:52 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:39:52 np0005604215.localdomain systemd-sysv-generator[28047]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:39:52 np0005604215.localdomain systemd-rc-local-generator[28041]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:39:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:39:52 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:39:52 np0005604215.localdomain systemd-rc-local-generator[28082]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:39:52 np0005604215.localdomain systemd-sysv-generator[28085]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:39:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:39:52 np0005604215.localdomain sudo[27990]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:22 np0005604215.localdomain sudo[28092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:40:22 np0005604215.localdomain sudo[28092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:22 np0005604215.localdomain sudo[28092]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:22 np0005604215.localdomain sudo[28107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 07:40:22 np0005604215.localdomain sudo[28107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:40:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:40:22 np0005604215.localdomain podman[28298]: 
Feb 01 07:40:23 np0005604215.localdomain podman[28298]: 2026-02-01 07:40:22.933368897 +0000 UTC m=+0.043589909 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:24 np0005604215.localdomain podman[28298]: 2026-02-01 07:40:24.10950001 +0000 UTC m=+1.219721042 container create 89517299267c8f4f95cca75ab569851b82ed78796056d4787362eb51168d6ef4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_taussig, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, ceph=True)
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: Started libpod-conmon-89517299267c8f4f95cca75ab569851b82ed78796056d4787362eb51168d6ef4.scope.
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:24 np0005604215.localdomain podman[28298]: 2026-02-01 07:40:24.18695156 +0000 UTC m=+1.297172542 container init 89517299267c8f4f95cca75ab569851b82ed78796056d4787362eb51168d6ef4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_taussig, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, build-date=2025-12-08T17:28:53Z, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vendor=Red Hat, Inc., com.redhat.component=rhceph-container)
Feb 01 07:40:24 np0005604215.localdomain podman[28298]: 2026-02-01 07:40:24.197260389 +0000 UTC m=+1.307481361 container start 89517299267c8f4f95cca75ab569851b82ed78796056d4787362eb51168d6ef4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_taussig, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., release=1764794109)
Feb 01 07:40:24 np0005604215.localdomain podman[28298]: 2026-02-01 07:40:24.197463544 +0000 UTC m=+1.307684526 container attach 89517299267c8f4f95cca75ab569851b82ed78796056d4787362eb51168d6ef4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_taussig, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.expose-services=, release=1764794109, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 07:40:24 np0005604215.localdomain awesome_taussig[28388]: 167 167
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: libpod-89517299267c8f4f95cca75ab569851b82ed78796056d4787362eb51168d6ef4.scope: Deactivated successfully.
Feb 01 07:40:24 np0005604215.localdomain podman[28298]: 2026-02-01 07:40:24.20199713 +0000 UTC m=+1.312218172 container died 89517299267c8f4f95cca75ab569851b82ed78796056d4787362eb51168d6ef4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_taussig, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 07:40:24 np0005604215.localdomain podman[28393]: 2026-02-01 07:40:24.323390266 +0000 UTC m=+0.109056354 container remove 89517299267c8f4f95cca75ab569851b82ed78796056d4787362eb51168d6ef4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=awesome_taussig, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_BRANCH=main, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, release=1764794109, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, distribution-scope=public, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: libpod-conmon-89517299267c8f4f95cca75ab569851b82ed78796056d4787362eb51168d6ef4.scope: Deactivated successfully.
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:40:24 np0005604215.localdomain systemd-rc-local-generator[28428]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:40:24 np0005604215.localdomain systemd-sysv-generator[28435]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-2ade20519be1e549b1bf6510f51794d7b717c89e8519849b0c372d7baa735f12-merged.mount: Deactivated successfully.
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:40:24 np0005604215.localdomain systemd-sysv-generator[28474]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:40:24 np0005604215.localdomain systemd-rc-local-generator[28469]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: Reached target All Ceph clusters and services.
Feb 01 07:40:24 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:40:24 np0005604215.localdomain systemd-rc-local-generator[28507]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:40:24 np0005604215.localdomain systemd-sysv-generator[28513]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: Reached target Ceph cluster 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:40:25 np0005604215.localdomain systemd-rc-local-generator[28545]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:40:25 np0005604215.localdomain systemd-sysv-generator[28552]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:40:25 np0005604215.localdomain systemd-sysv-generator[28592]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:40:25 np0005604215.localdomain systemd-rc-local-generator[28588]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: Created slice Slice /system/ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: Reached target System Time Set.
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: Reached target System Time Synchronized.
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: Starting Ceph crash.np0005604215 for 33fac0b9-80c7-560f-918a-c92d3021ca1e...
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:40:25 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Feb 01 07:40:25 np0005604215.localdomain podman[28651]: 
Feb 01 07:40:25 np0005604215.localdomain podman[28651]: 2026-02-01 07:40:25.924768183 +0000 UTC m=+0.074047167 container create 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.buildah.version=1.41.4, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, ceph=True, com.redhat.component=rhceph-container)
Feb 01 07:40:25 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9833aab783ba439bc7c83ce1c86d58b1573d474a692c843b2890ed6efebe973d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:25 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9833aab783ba439bc7c83ce1c86d58b1573d474a692c843b2890ed6efebe973d/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:25 np0005604215.localdomain podman[28651]: 2026-02-01 07:40:25.893168971 +0000 UTC m=+0.042447945 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:26 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9833aab783ba439bc7c83ce1c86d58b1573d474a692c843b2890ed6efebe973d/merged/etc/ceph/ceph.client.crash.np0005604215.keyring supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:26 np0005604215.localdomain podman[28651]: 2026-02-01 07:40:26.021597525 +0000 UTC m=+0.170876500 container init 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, vendor=Red Hat, Inc., GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, name=rhceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, ceph=True, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 07:40:26 np0005604215.localdomain podman[28651]: 2026-02-01 07:40:26.03119594 +0000 UTC m=+0.180474924 container start 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, version=7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, RELEASE=main, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, release=1764794109, ceph=True, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 07:40:26 np0005604215.localdomain bash[28651]: 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a
Feb 01 07:40:26 np0005604215.localdomain systemd[1]: Started Ceph crash.np0005604215 for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 01 07:40:26 np0005604215.localdomain sudo[28107]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215[28665]: INFO:ceph-crash:pinging cluster to exercise our key
Feb 01 07:40:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215[28665]: 2026-02-01T07:40:26.212+0000 7f67d327d640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Feb 01 07:40:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215[28665]: 2026-02-01T07:40:26.212+0000 7f67d327d640 -1 AuthRegistry(0x7f67cc0680d0) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Feb 01 07:40:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215[28665]: 2026-02-01T07:40:26.213+0000 7f67d327d640 -1 auth: unable to find a keyring on /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin: (2) No such file or directory
Feb 01 07:40:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215[28665]: 2026-02-01T07:40:26.213+0000 7f67d327d640 -1 AuthRegistry(0x7f67d327c000) no keyring found at /etc/ceph/ceph.client.admin.keyring,/etc/ceph/ceph.keyring,/etc/ceph/keyring,/etc/ceph/keyring.bin, disabling cephx
Feb 01 07:40:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215[28665]: 2026-02-01T07:40:26.219+0000 7f67d17f3640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Feb 01 07:40:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215[28665]: 2026-02-01T07:40:26.222+0000 7f67cbfff640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Feb 01 07:40:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215[28665]: 2026-02-01T07:40:26.222+0000 7f67d0ff2640 -1 monclient(hunting): handle_auth_bad_method server allowed_methods [2] but i only support [1]
Feb 01 07:40:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215[28665]: 2026-02-01T07:40:26.222+0000 7f67d327d640 -1 monclient: authenticate NOTE: no keyring found; disabled cephx authentication
Feb 01 07:40:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215[28665]: [errno 13] RADOS permission denied (error connecting to the cluster)
Feb 01 07:40:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215[28665]: INFO:ceph-crash:monitoring path /var/lib/ceph/crash, delay 600s
Feb 01 07:40:28 np0005604215.localdomain sudo[28684]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:40:28 np0005604215.localdomain sudo[28684]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:28 np0005604215.localdomain sudo[28684]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:29 np0005604215.localdomain sudo[28699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --env CEPH_VOLUME_OSDSPEC_AFFINITY=default_drive_group --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e --config-json - -- lvm batch --no-auto /dev/ceph_vg0/ceph_lv0 /dev/ceph_vg1/ceph_lv1 --yes --no-systemd
Feb 01 07:40:29 np0005604215.localdomain sudo[28699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:29 np0005604215.localdomain podman[28751]: 
Feb 01 07:40:29 np0005604215.localdomain podman[28751]: 2026-02-01 07:40:29.612532199 +0000 UTC m=+0.069893679 container create 51b8b6028dfc37f3aae75102ad31d8ce8ca85383535d7e588666e17198122082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_robinson, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, distribution-scope=public, architecture=x86_64, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, vcs-type=git, release=1764794109, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 07:40:29 np0005604215.localdomain systemd[1]: Started libpod-conmon-51b8b6028dfc37f3aae75102ad31d8ce8ca85383535d7e588666e17198122082.scope.
Feb 01 07:40:29 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:29 np0005604215.localdomain podman[28751]: 2026-02-01 07:40:29.582780845 +0000 UTC m=+0.040142315 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:29 np0005604215.localdomain podman[28751]: 2026-02-01 07:40:29.693266647 +0000 UTC m=+0.150628097 container init 51b8b6028dfc37f3aae75102ad31d8ce8ca85383535d7e588666e17198122082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_robinson, architecture=x86_64, io.buildah.version=1.41.4, name=rhceph, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, version=7, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.openshift.expose-services=, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 07:40:29 np0005604215.localdomain podman[28751]: 2026-02-01 07:40:29.701595735 +0000 UTC m=+0.158957205 container start 51b8b6028dfc37f3aae75102ad31d8ce8ca85383535d7e588666e17198122082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_robinson, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, ceph=True, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 07:40:29 np0005604215.localdomain podman[28751]: 2026-02-01 07:40:29.701978673 +0000 UTC m=+0.159340123 container attach 51b8b6028dfc37f3aae75102ad31d8ce8ca85383535d7e588666e17198122082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_robinson, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, io.buildah.version=1.41.4, name=rhceph, version=7, architecture=x86_64)
Feb 01 07:40:29 np0005604215.localdomain systemd[1]: libpod-51b8b6028dfc37f3aae75102ad31d8ce8ca85383535d7e588666e17198122082.scope: Deactivated successfully.
Feb 01 07:40:29 np0005604215.localdomain trusting_robinson[28766]: 167 167
Feb 01 07:40:29 np0005604215.localdomain podman[28751]: 2026-02-01 07:40:29.707154403 +0000 UTC m=+0.164515883 container died 51b8b6028dfc37f3aae75102ad31d8ce8ca85383535d7e588666e17198122082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_robinson, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, architecture=x86_64, io.buildah.version=1.41.4, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=)
Feb 01 07:40:29 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bf96dff0a303ec2bd9e9f1029a08b37c06f63982b37c93871fb58c48e1052245-merged.mount: Deactivated successfully.
Feb 01 07:40:29 np0005604215.localdomain podman[28771]: 2026-02-01 07:40:29.795918964 +0000 UTC m=+0.075747904 container remove 51b8b6028dfc37f3aae75102ad31d8ce8ca85383535d7e588666e17198122082 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=trusting_robinson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, com.redhat.component=rhceph-container, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 07:40:29 np0005604215.localdomain systemd[1]: libpod-conmon-51b8b6028dfc37f3aae75102ad31d8ce8ca85383535d7e588666e17198122082.scope: Deactivated successfully.
Feb 01 07:40:30 np0005604215.localdomain podman[28790]: 
Feb 01 07:40:30 np0005604215.localdomain podman[28790]: 2026-02-01 07:40:30.020898804 +0000 UTC m=+0.079912662 container create 25e11df276a7e1b9b21a679a6d97c52bf7e99972a6b2ce88a484818c6ee875b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_hawking, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, ceph=True)
Feb 01 07:40:30 np0005604215.localdomain systemd[1]: Started libpod-conmon-25e11df276a7e1b9b21a679a6d97c52bf7e99972a6b2ce88a484818c6ee875b5.scope.
Feb 01 07:40:30 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:30 np0005604215.localdomain podman[28790]: 2026-02-01 07:40:29.988940104 +0000 UTC m=+0.047953982 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:30 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4e0fc0a50119b5077adde57cc05c20d078f6b94a4ff9d6045f6165f5cc91c2c/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:30 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4e0fc0a50119b5077adde57cc05c20d078f6b94a4ff9d6045f6165f5cc91c2c/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:30 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4e0fc0a50119b5077adde57cc05c20d078f6b94a4ff9d6045f6165f5cc91c2c/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:30 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4e0fc0a50119b5077adde57cc05c20d078f6b94a4ff9d6045f6165f5cc91c2c/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:30 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b4e0fc0a50119b5077adde57cc05c20d078f6b94a4ff9d6045f6165f5cc91c2c/merged/var/lib/ceph/bootstrap-osd/ceph.keyring supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:30 np0005604215.localdomain podman[28790]: 2026-02-01 07:40:30.148310517 +0000 UTC m=+0.207324365 container init 25e11df276a7e1b9b21a679a6d97c52bf7e99972a6b2ce88a484818c6ee875b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_hawking, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, GIT_CLEAN=True, io.openshift.expose-services=, vendor=Red Hat, Inc., version=7, name=rhceph, vcs-type=git, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public)
Feb 01 07:40:30 np0005604215.localdomain podman[28790]: 2026-02-01 07:40:30.1606584 +0000 UTC m=+0.219672248 container start 25e11df276a7e1b9b21a679a6d97c52bf7e99972a6b2ce88a484818c6ee875b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_hawking, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.expose-services=, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, RELEASE=main, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, version=7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 07:40:30 np0005604215.localdomain podman[28790]: 2026-02-01 07:40:30.161053898 +0000 UTC m=+0.220067796 container attach 25e11df276a7e1b9b21a679a6d97c52bf7e99972a6b2ce88a484818c6ee875b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_hawking, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-12-08T17:28:53Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, name=rhceph, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_CLEAN=True)
Feb 01 07:40:30 np0005604215.localdomain infallible_hawking[28806]: --> passed data devices: 0 physical, 2 LVM
Feb 01 07:40:30 np0005604215.localdomain infallible_hawking[28806]: --> relative data size: 1.0
Feb 01 07:40:30 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 01 07:40:30 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new 91738c8a-fd02-4668-b2ac-8ebbd36126da
Feb 01 07:40:31 np0005604215.localdomain lvm[28860]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 01 07:40:31 np0005604215.localdomain lvm[28860]: VG ceph_vg0 finished
Feb 01 07:40:31 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 01 07:40:31 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-2
Feb 01 07:40:31 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg0/ceph_lv0
Feb 01 07:40:31 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 01 07:40:31 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ln -s /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:31 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-2/activate.monmap
Feb 01 07:40:31 np0005604215.localdomain infallible_hawking[28806]:  stderr: got monmap epoch 3
Feb 01 07:40:31 np0005604215.localdomain infallible_hawking[28806]: --> Creating keyring file for osd.2
Feb 01 07:40:31 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/keyring
Feb 01 07:40:31 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2/
Feb 01 07:40:31 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 2 --monmap /var/lib/ceph/osd/ceph-2/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-2/ --osd-uuid 91738c8a-fd02-4668-b2ac-8ebbd36126da --setuser ceph --setgroup ceph
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]:  stderr: 2026-02-01T07:40:31.855+0000 7f7654b99a80 -1 bluestore(/var/lib/ceph/osd/ceph-2//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]:  stderr: 2026-02-01T07:40:31.855+0000 7f7654b99a80 -1 bluestore(/var/lib/ceph/osd/ceph-2/) _read_fsid unparsable uuid
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]: --> ceph-volume lvm prepare successful for: ceph_vg0/ceph_lv0
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg0/ceph_lv0 --path /var/lib/ceph/osd/ceph-2 --no-mon-config
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ln -snf /dev/ceph_vg0/ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]: --> ceph-volume lvm activate successful for osd ID: 2
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]: --> ceph-volume lvm create successful for: ceph_vg0/ceph_lv0
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 01 07:40:34 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring -i - osd new dc0298a4-c2cb-4512-baf8-45dcc8aa1439
Feb 01 07:40:35 np0005604215.localdomain lvm[29789]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 01 07:40:35 np0005604215.localdomain lvm[29789]: VG ceph_vg1 finished
Feb 01 07:40:35 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph-authtool --gen-print-key
Feb 01 07:40:35 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/mount -t tmpfs tmpfs /var/lib/ceph/osd/ceph-5
Feb 01 07:40:35 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -h ceph:ceph /dev/ceph_vg1/ceph_lv1
Feb 01 07:40:35 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 01 07:40:35 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ln -s /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:35 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph --cluster ceph --name client.bootstrap-osd --keyring /var/lib/ceph/bootstrap-osd/ceph.keyring mon getmap -o /var/lib/ceph/osd/ceph-5/activate.monmap
Feb 01 07:40:35 np0005604215.localdomain infallible_hawking[28806]:  stderr: got monmap epoch 3
Feb 01 07:40:35 np0005604215.localdomain infallible_hawking[28806]: --> Creating keyring file for osd.5
Feb 01 07:40:35 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/keyring
Feb 01 07:40:35 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5/
Feb 01 07:40:35 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph-osd --cluster ceph --osd-objectstore bluestore --mkfs -i 5 --monmap /var/lib/ceph/osd/ceph-5/activate.monmap --keyfile - --osdspec-affinity default_drive_group --osd-data /var/lib/ceph/osd/ceph-5/ --osd-uuid dc0298a4-c2cb-4512-baf8-45dcc8aa1439 --setuser ceph --setgroup ceph
Feb 01 07:40:38 np0005604215.localdomain infallible_hawking[28806]:  stderr: 2026-02-01T07:40:35.585+0000 7f6719e42a80 -1 bluestore(/var/lib/ceph/osd/ceph-5//block) _read_bdev_label unable to decode label at offset 102: void bluestore_bdev_label_t::decode(ceph::buffer::v15_2_0::list::const_iterator&) decode past end of struct encoding: Malformed input [buffer:3]
Feb 01 07:40:38 np0005604215.localdomain infallible_hawking[28806]:  stderr: 2026-02-01T07:40:35.585+0000 7f6719e42a80 -1 bluestore(/var/lib/ceph/osd/ceph-5/) _read_fsid unparsable uuid
Feb 01 07:40:38 np0005604215.localdomain infallible_hawking[28806]: --> ceph-volume lvm prepare successful for: ceph_vg1/ceph_lv1
Feb 01 07:40:38 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 01 07:40:38 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ceph-bluestore-tool --cluster=ceph prime-osd-dir --dev /dev/ceph_vg1/ceph_lv1 --path /var/lib/ceph/osd/ceph-5 --no-mon-config
Feb 01 07:40:38 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/ln -snf /dev/ceph_vg1/ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:38 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -h ceph:ceph /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:38 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 01 07:40:38 np0005604215.localdomain infallible_hawking[28806]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 01 07:40:38 np0005604215.localdomain infallible_hawking[28806]: --> ceph-volume lvm activate successful for osd ID: 5
Feb 01 07:40:38 np0005604215.localdomain infallible_hawking[28806]: --> ceph-volume lvm create successful for: ceph_vg1/ceph_lv1
Feb 01 07:40:38 np0005604215.localdomain systemd[1]: libpod-25e11df276a7e1b9b21a679a6d97c52bf7e99972a6b2ce88a484818c6ee875b5.scope: Deactivated successfully.
Feb 01 07:40:38 np0005604215.localdomain systemd[1]: libpod-25e11df276a7e1b9b21a679a6d97c52bf7e99972a6b2ce88a484818c6ee875b5.scope: Consumed 3.593s CPU time.
Feb 01 07:40:38 np0005604215.localdomain podman[30688]: 2026-02-01 07:40:38.273698581 +0000 UTC m=+0.036971098 container died 25e11df276a7e1b9b21a679a6d97c52bf7e99972a6b2ce88a484818c6ee875b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_hawking, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vendor=Red Hat, Inc., vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, io.buildah.version=1.41.4, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, version=7, release=1764794109, description=Red Hat Ceph Storage 7)
Feb 01 07:40:38 np0005604215.localdomain systemd[1]: tmp-crun.9mlC9o.mount: Deactivated successfully.
Feb 01 07:40:38 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b4e0fc0a50119b5077adde57cc05c20d078f6b94a4ff9d6045f6165f5cc91c2c-merged.mount: Deactivated successfully.
Feb 01 07:40:38 np0005604215.localdomain podman[30688]: 2026-02-01 07:40:38.307953171 +0000 UTC m=+0.071225658 container remove 25e11df276a7e1b9b21a679a6d97c52bf7e99972a6b2ce88a484818c6ee875b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_hawking, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, distribution-scope=public, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, GIT_CLEAN=True, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, architecture=x86_64, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 07:40:38 np0005604215.localdomain systemd[1]: libpod-conmon-25e11df276a7e1b9b21a679a6d97c52bf7e99972a6b2ce88a484818c6ee875b5.scope: Deactivated successfully.
Feb 01 07:40:38 np0005604215.localdomain sudo[28699]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:38 np0005604215.localdomain sudo[30703]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:40:38 np0005604215.localdomain sudo[30703]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:38 np0005604215.localdomain sudo[30703]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:38 np0005604215.localdomain sudo[30718]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e -- lvm list --format json
Feb 01 07:40:38 np0005604215.localdomain sudo[30718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:39 np0005604215.localdomain podman[30771]: 
Feb 01 07:40:39 np0005604215.localdomain podman[30771]: 2026-02-01 07:40:39.027922512 +0000 UTC m=+0.065601599 container create 4147ab26c27bf4a62f9072d7edda6626c8fb328ea7cb47a3dc70d4c6a114d664 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_moore, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 07:40:39 np0005604215.localdomain systemd[1]: Started libpod-conmon-4147ab26c27bf4a62f9072d7edda6626c8fb328ea7cb47a3dc70d4c6a114d664.scope.
Feb 01 07:40:39 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:39 np0005604215.localdomain podman[30771]: 2026-02-01 07:40:39.097544763 +0000 UTC m=+0.135223850 container init 4147ab26c27bf4a62f9072d7edda6626c8fb328ea7cb47a3dc70d4c6a114d664 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_moore, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, release=1764794109, io.buildah.version=1.41.4, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, version=7)
Feb 01 07:40:39 np0005604215.localdomain podman[30771]: 2026-02-01 07:40:39.000228162 +0000 UTC m=+0.037907229 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:39 np0005604215.localdomain podman[30771]: 2026-02-01 07:40:39.106419073 +0000 UTC m=+0.144098150 container start 4147ab26c27bf4a62f9072d7edda6626c8fb328ea7cb47a3dc70d4c6a114d664 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_moore, GIT_BRANCH=main, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public)
Feb 01 07:40:39 np0005604215.localdomain podman[30771]: 2026-02-01 07:40:39.106667128 +0000 UTC m=+0.144346205 container attach 4147ab26c27bf4a62f9072d7edda6626c8fb328ea7cb47a3dc70d4c6a114d664 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_moore, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_BRANCH=main, architecture=x86_64, com.redhat.component=rhceph-container, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, name=rhceph, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True)
Feb 01 07:40:39 np0005604215.localdomain charming_moore[30787]: 167 167
Feb 01 07:40:39 np0005604215.localdomain systemd[1]: libpod-4147ab26c27bf4a62f9072d7edda6626c8fb328ea7cb47a3dc70d4c6a114d664.scope: Deactivated successfully.
Feb 01 07:40:39 np0005604215.localdomain podman[30771]: 2026-02-01 07:40:39.109462558 +0000 UTC m=+0.147141665 container died 4147ab26c27bf4a62f9072d7edda6626c8fb328ea7cb47a3dc70d4c6a114d664 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_moore, io.buildah.version=1.41.4, architecture=x86_64, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, version=7, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, RELEASE=main, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 07:40:39 np0005604215.localdomain podman[30792]: 2026-02-01 07:40:39.193221001 +0000 UTC m=+0.075138331 container remove 4147ab26c27bf4a62f9072d7edda6626c8fb328ea7cb47a3dc70d4c6a114d664 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=charming_moore, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, name=rhceph, release=1764794109, ceph=True, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 07:40:39 np0005604215.localdomain systemd[1]: libpod-conmon-4147ab26c27bf4a62f9072d7edda6626c8fb328ea7cb47a3dc70d4c6a114d664.scope: Deactivated successfully.
Feb 01 07:40:39 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e00815c0359ba2efc37bc5e28b0a90648ba1a9b71983ed0b85748cf45629ce98-merged.mount: Deactivated successfully.
Feb 01 07:40:39 np0005604215.localdomain podman[30813]: 
Feb 01 07:40:39 np0005604215.localdomain podman[30813]: 2026-02-01 07:40:39.384612306 +0000 UTC m=+0.062254366 container create f99117263fe13839efa3ef74150794d4d78abae92cf8bc9d820d384b46e7d069 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_banach, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, version=7, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 07:40:39 np0005604215.localdomain systemd[1]: Started libpod-conmon-f99117263fe13839efa3ef74150794d4d78abae92cf8bc9d820d384b46e7d069.scope.
Feb 01 07:40:39 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:39 np0005604215.localdomain podman[30813]: 2026-02-01 07:40:39.363867324 +0000 UTC m=+0.041509374 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a094506de0ce8b7e95177f71b78b01da7419612a29aac5bd02f320116ad65da/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a094506de0ce8b7e95177f71b78b01da7419612a29aac5bd02f320116ad65da/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6a094506de0ce8b7e95177f71b78b01da7419612a29aac5bd02f320116ad65da/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:39 np0005604215.localdomain podman[30813]: 2026-02-01 07:40:39.505282136 +0000 UTC m=+0.182924196 container init f99117263fe13839efa3ef74150794d4d78abae92cf8bc9d820d384b46e7d069 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_banach, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, distribution-scope=public, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vendor=Red Hat, Inc., name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, release=1764794109, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 07:40:39 np0005604215.localdomain podman[30813]: 2026-02-01 07:40:39.517887003 +0000 UTC m=+0.195529063 container start f99117263fe13839efa3ef74150794d4d78abae92cf8bc9d820d384b46e7d069 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_banach, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, ceph=True, name=rhceph, GIT_BRANCH=main, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, CEPH_POINT_RELEASE=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 07:40:39 np0005604215.localdomain podman[30813]: 2026-02-01 07:40:39.518221942 +0000 UTC m=+0.195864002 container attach f99117263fe13839efa3ef74150794d4d78abae92cf8bc9d820d384b46e7d069 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_banach, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, name=rhceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, RELEASE=main, distribution-scope=public, ceph=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-type=git, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc.)
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]: {
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:     "2": [
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:         {
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "devices": [
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "/dev/loop3"
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             ],
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "lv_name": "ceph_lv0",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "lv_path": "/dev/ceph_vg0/ceph_lv0",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "lv_size": "7511998464",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "lv_tags": "ceph.block_device=/dev/ceph_vg0/ceph_lv0,ceph.block_uuid=hPFg2o-8f7Z-SgAH-W30q-gPcL-92rt-G22z3d,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=33fac0b9-80c7-560f-918a-c92d3021ca1e,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=91738c8a-fd02-4668-b2ac-8ebbd36126da,ceph.osd_id=2,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "lv_uuid": "hPFg2o-8f7Z-SgAH-W30q-gPcL-92rt-G22z3d",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "name": "ceph_lv0",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "path": "/dev/ceph_vg0/ceph_lv0",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "tags": {
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.block_device": "/dev/ceph_vg0/ceph_lv0",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.block_uuid": "hPFg2o-8f7Z-SgAH-W30q-gPcL-92rt-G22z3d",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.cephx_lockbox_secret": "",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.cluster_fsid": "33fac0b9-80c7-560f-918a-c92d3021ca1e",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.cluster_name": "ceph",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.crush_device_class": "",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.encrypted": "0",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.osd_fsid": "91738c8a-fd02-4668-b2ac-8ebbd36126da",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.osd_id": "2",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.osdspec_affinity": "default_drive_group",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.type": "block",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.vdo": "0"
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             },
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "type": "block",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "vg_name": "ceph_vg0"
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:         }
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:     ],
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:     "5": [
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:         {
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "devices": [
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "/dev/loop4"
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             ],
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "lv_name": "ceph_lv1",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "lv_path": "/dev/ceph_vg1/ceph_lv1",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "lv_size": "7511998464",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "lv_tags": "ceph.block_device=/dev/ceph_vg1/ceph_lv1,ceph.block_uuid=dW1DpF-krs4-WN69-efR8-xEtz-9YoL-U2ay9d,ceph.cephx_lockbox_secret=,ceph.cluster_fsid=33fac0b9-80c7-560f-918a-c92d3021ca1e,ceph.cluster_name=ceph,ceph.crush_device_class=,ceph.encrypted=0,ceph.osd_fsid=dc0298a4-c2cb-4512-baf8-45dcc8aa1439,ceph.osd_id=5,ceph.osdspec_affinity=default_drive_group,ceph.type=block,ceph.vdo=0",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "lv_uuid": "dW1DpF-krs4-WN69-efR8-xEtz-9YoL-U2ay9d",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "name": "ceph_lv1",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "path": "/dev/ceph_vg1/ceph_lv1",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "tags": {
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.block_device": "/dev/ceph_vg1/ceph_lv1",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.block_uuid": "dW1DpF-krs4-WN69-efR8-xEtz-9YoL-U2ay9d",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.cephx_lockbox_secret": "",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.cluster_fsid": "33fac0b9-80c7-560f-918a-c92d3021ca1e",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.cluster_name": "ceph",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.crush_device_class": "",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.encrypted": "0",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.osd_fsid": "dc0298a4-c2cb-4512-baf8-45dcc8aa1439",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.osd_id": "5",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.osdspec_affinity": "default_drive_group",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.type": "block",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:                 "ceph.vdo": "0"
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             },
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "type": "block",
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:             "vg_name": "ceph_vg1"
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:         }
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]:     ]
Feb 01 07:40:39 np0005604215.localdomain jolly_banach[30828]: }
Feb 01 07:40:39 np0005604215.localdomain systemd[1]: libpod-f99117263fe13839efa3ef74150794d4d78abae92cf8bc9d820d384b46e7d069.scope: Deactivated successfully.
Feb 01 07:40:39 np0005604215.localdomain podman[30813]: 2026-02-01 07:40:39.874528768 +0000 UTC m=+0.552170858 container died f99117263fe13839efa3ef74150794d4d78abae92cf8bc9d820d384b46e7d069 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_banach, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 07:40:39 np0005604215.localdomain podman[30837]: 2026-02-01 07:40:39.957964445 +0000 UTC m=+0.071494294 container remove f99117263fe13839efa3ef74150794d4d78abae92cf8bc9d820d384b46e7d069 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=jolly_banach, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, name=rhceph, CEPH_POINT_RELEASE=, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 07:40:39 np0005604215.localdomain systemd[1]: libpod-conmon-f99117263fe13839efa3ef74150794d4d78abae92cf8bc9d820d384b46e7d069.scope: Deactivated successfully.
Feb 01 07:40:40 np0005604215.localdomain sudo[30718]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:40 np0005604215.localdomain sudo[30849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:40:40 np0005604215.localdomain sudo[30849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:40 np0005604215.localdomain sudo[30849]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:40 np0005604215.localdomain sudo[30864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 07:40:40 np0005604215.localdomain sudo[30864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-6a094506de0ce8b7e95177f71b78b01da7419612a29aac5bd02f320116ad65da-merged.mount: Deactivated successfully.
Feb 01 07:40:40 np0005604215.localdomain podman[30921]: 
Feb 01 07:40:40 np0005604215.localdomain podman[30921]: 2026-02-01 07:40:40.751282897 +0000 UTC m=+0.068571482 container create bd14782795be0cea33ff7054ae78181cb5c0915fbd4277e095800df3588f6ca8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_solomon, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., ceph=True, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 07:40:40 np0005604215.localdomain systemd[1]: Started libpod-conmon-bd14782795be0cea33ff7054ae78181cb5c0915fbd4277e095800df3588f6ca8.scope.
Feb 01 07:40:40 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:40 np0005604215.localdomain podman[30921]: 2026-02-01 07:40:40.820839348 +0000 UTC m=+0.138127943 container init bd14782795be0cea33ff7054ae78181cb5c0915fbd4277e095800df3588f6ca8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_solomon, vcs-type=git, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, io.buildah.version=1.41.4, name=rhceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, CEPH_POINT_RELEASE=, distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7)
Feb 01 07:40:40 np0005604215.localdomain podman[30921]: 2026-02-01 07:40:40.721982843 +0000 UTC m=+0.039271418 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:40 np0005604215.localdomain podman[30921]: 2026-02-01 07:40:40.831694418 +0000 UTC m=+0.148982993 container start bd14782795be0cea33ff7054ae78181cb5c0915fbd4277e095800df3588f6ca8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_solomon, architecture=x86_64, description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.41.4, release=1764794109, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=)
Feb 01 07:40:40 np0005604215.localdomain podman[30921]: 2026-02-01 07:40:40.831988386 +0000 UTC m=+0.149276971 container attach bd14782795be0cea33ff7054ae78181cb5c0915fbd4277e095800df3588f6ca8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_solomon, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, name=rhceph, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, release=1764794109, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 07:40:40 np0005604215.localdomain eloquent_solomon[30936]: 167 167
Feb 01 07:40:40 np0005604215.localdomain systemd[1]: libpod-bd14782795be0cea33ff7054ae78181cb5c0915fbd4277e095800df3588f6ca8.scope: Deactivated successfully.
Feb 01 07:40:40 np0005604215.localdomain podman[30921]: 2026-02-01 07:40:40.836620674 +0000 UTC m=+0.153909299 container died bd14782795be0cea33ff7054ae78181cb5c0915fbd4277e095800df3588f6ca8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_solomon, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., ceph=True, vcs-type=git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.openshift.tags=rhceph ceph)
Feb 01 07:40:40 np0005604215.localdomain podman[30941]: 2026-02-01 07:40:40.91626766 +0000 UTC m=+0.064548145 container remove bd14782795be0cea33ff7054ae78181cb5c0915fbd4277e095800df3588f6ca8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=eloquent_solomon, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, GIT_BRANCH=main, name=rhceph, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 07:40:40 np0005604215.localdomain systemd[1]: libpod-conmon-bd14782795be0cea33ff7054ae78181cb5c0915fbd4277e095800df3588f6ca8.scope: Deactivated successfully.
Feb 01 07:40:41 np0005604215.localdomain podman[30968]: 
Feb 01 07:40:41 np0005604215.localdomain podman[30968]: 2026-02-01 07:40:41.240567675 +0000 UTC m=+0.068945519 container create 6695d7a303520835fbab217669e6ab54d607e306a53b9afb32fcb39c7d0c6773 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.buildah.version=1.41.4, version=7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, release=1764794109, RELEASE=main, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 07:40:41 np0005604215.localdomain systemd[1]: Started libpod-conmon-6695d7a303520835fbab217669e6ab54d607e306a53b9afb32fcb39c7d0c6773.scope.
Feb 01 07:40:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-adf03fbc7013ca98f20cde79d1ff9535d9ecfd43d1c920cebf522278aa277f2b-merged.mount: Deactivated successfully.
Feb 01 07:40:41 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/559cd5098828c2299c8f2f5f52b7f8312caf53b413ee9deea39b91f1754cab98/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:41 np0005604215.localdomain podman[30968]: 2026-02-01 07:40:41.212870966 +0000 UTC m=+0.041248800 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/559cd5098828c2299c8f2f5f52b7f8312caf53b413ee9deea39b91f1754cab98/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/559cd5098828c2299c8f2f5f52b7f8312caf53b413ee9deea39b91f1754cab98/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/559cd5098828c2299c8f2f5f52b7f8312caf53b413ee9deea39b91f1754cab98/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/559cd5098828c2299c8f2f5f52b7f8312caf53b413ee9deea39b91f1754cab98/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:41 np0005604215.localdomain podman[30968]: 2026-02-01 07:40:41.359656421 +0000 UTC m=+0.188034265 container init 6695d7a303520835fbab217669e6ab54d607e306a53b9afb32fcb39c7d0c6773 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate-test, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=Red Hat Ceph Storage 7, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, ceph=True)
Feb 01 07:40:41 np0005604215.localdomain podman[30968]: 2026-02-01 07:40:41.372048184 +0000 UTC m=+0.200426028 container start 6695d7a303520835fbab217669e6ab54d607e306a53b9afb32fcb39c7d0c6773 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate-test, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True)
Feb 01 07:40:41 np0005604215.localdomain podman[30968]: 2026-02-01 07:40:41.372872703 +0000 UTC m=+0.201250537 container attach 6695d7a303520835fbab217669e6ab54d607e306a53b9afb32fcb39c7d0c6773 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate-test, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, name=rhceph, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 07:40:41 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate-test[30983]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Feb 01 07:40:41 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate-test[30983]:                             [--no-systemd] [--no-tmpfs]
Feb 01 07:40:41 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate-test[30983]: ceph-volume activate: error: unrecognized arguments: --bad-option
Feb 01 07:40:41 np0005604215.localdomain systemd[1]: libpod-6695d7a303520835fbab217669e6ab54d607e306a53b9afb32fcb39c7d0c6773.scope: Deactivated successfully.
Feb 01 07:40:41 np0005604215.localdomain podman[30968]: 2026-02-01 07:40:41.593561311 +0000 UTC m=+0.421939165 container died 6695d7a303520835fbab217669e6ab54d607e306a53b9afb32fcb39c7d0c6773 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate-test, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 07:40:41 np0005604215.localdomain systemd[1]: tmp-crun.JkXOtz.mount: Deactivated successfully.
Feb 01 07:40:41 np0005604215.localdomain systemd-journald[619]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Feb 01 07:40:41 np0005604215.localdomain systemd-journald[619]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 01 07:40:41 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 07:40:41 np0005604215.localdomain podman[30988]: 2026-02-01 07:40:41.699741372 +0000 UTC m=+0.094508753 container remove 6695d7a303520835fbab217669e6ab54d607e306a53b9afb32fcb39c7d0c6773 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate-test, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, name=rhceph, version=7, distribution-scope=public, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7)
Feb 01 07:40:41 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 07:40:41 np0005604215.localdomain systemd[1]: libpod-conmon-6695d7a303520835fbab217669e6ab54d607e306a53b9afb32fcb39c7d0c6773.scope: Deactivated successfully.
Feb 01 07:40:41 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:40:42 np0005604215.localdomain systemd-rc-local-generator[31043]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:40:42 np0005604215.localdomain systemd-sysv-generator[31049]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:40:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:40:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-559cd5098828c2299c8f2f5f52b7f8312caf53b413ee9deea39b91f1754cab98-merged.mount: Deactivated successfully.
Feb 01 07:40:42 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:40:42 np0005604215.localdomain systemd-rc-local-generator[31086]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:40:42 np0005604215.localdomain systemd-sysv-generator[31092]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:40:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:40:42 np0005604215.localdomain systemd[1]: Starting Ceph osd.2 for 33fac0b9-80c7-560f-918a-c92d3021ca1e...
Feb 01 07:40:42 np0005604215.localdomain podman[31150]: 
Feb 01 07:40:42 np0005604215.localdomain podman[31150]: 2026-02-01 07:40:42.872360191 +0000 UTC m=+0.072995225 container create 0c42d580c452e4c25df87c3809ff3eed63e65bc71359a31d04a1e996cd523bce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_BRANCH=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, RELEASE=main, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, release=1764794109, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 07:40:42 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a694a21110d41d62fd7c847fe6016f6c83340ffb7b2c11388ebbed59edacbc3b/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:42 np0005604215.localdomain podman[31150]: 2026-02-01 07:40:42.842110907 +0000 UTC m=+0.042745941 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a694a21110d41d62fd7c847fe6016f6c83340ffb7b2c11388ebbed59edacbc3b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a694a21110d41d62fd7c847fe6016f6c83340ffb7b2c11388ebbed59edacbc3b/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a694a21110d41d62fd7c847fe6016f6c83340ffb7b2c11388ebbed59edacbc3b/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a694a21110d41d62fd7c847fe6016f6c83340ffb7b2c11388ebbed59edacbc3b/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:42 np0005604215.localdomain podman[31150]: 2026-02-01 07:40:42.99020101 +0000 UTC m=+0.190836034 container init 0c42d580c452e4c25df87c3809ff3eed63e65bc71359a31d04a1e996cd523bce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate, ceph=True, io.openshift.expose-services=, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.41.4)
Feb 01 07:40:42 np0005604215.localdomain podman[31150]: 2026-02-01 07:40:42.998945046 +0000 UTC m=+0.199580110 container start 0c42d580c452e4c25df87c3809ff3eed63e65bc71359a31d04a1e996cd523bce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, version=7, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, release=1764794109, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 07:40:42 np0005604215.localdomain podman[31150]: 2026-02-01 07:40:42.999197772 +0000 UTC m=+0.199832856 container attach 0c42d580c452e4c25df87c3809ff3eed63e65bc71359a31d04a1e996cd523bce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, RELEASE=main, name=rhceph, release=1764794109)
Feb 01 07:40:43 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate[31164]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 01 07:40:43 np0005604215.localdomain bash[31150]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 01 07:40:43 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate[31164]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Feb 01 07:40:43 np0005604215.localdomain bash[31150]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-2 --no-mon-config --dev /dev/mapper/ceph_vg0-ceph_lv0
Feb 01 07:40:43 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate[31164]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Feb 01 07:40:43 np0005604215.localdomain bash[31150]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg0-ceph_lv0
Feb 01 07:40:43 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate[31164]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 01 07:40:43 np0005604215.localdomain bash[31150]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-0
Feb 01 07:40:43 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate[31164]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:43 np0005604215.localdomain bash[31150]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg0-ceph_lv0 /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:43 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate[31164]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 01 07:40:43 np0005604215.localdomain bash[31150]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-2
Feb 01 07:40:43 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate[31164]: --> ceph-volume raw activate successful for osd ID: 2
Feb 01 07:40:43 np0005604215.localdomain bash[31150]: --> ceph-volume raw activate successful for osd ID: 2
Feb 01 07:40:43 np0005604215.localdomain systemd[1]: libpod-0c42d580c452e4c25df87c3809ff3eed63e65bc71359a31d04a1e996cd523bce.scope: Deactivated successfully.
Feb 01 07:40:43 np0005604215.localdomain podman[31150]: 2026-02-01 07:40:43.699629756 +0000 UTC m=+0.900264810 container died 0c42d580c452e4c25df87c3809ff3eed63e65bc71359a31d04a1e996cd523bce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, architecture=x86_64, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7)
Feb 01 07:40:43 np0005604215.localdomain systemd[1]: tmp-crun.smkzwY.mount: Deactivated successfully.
Feb 01 07:40:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-a694a21110d41d62fd7c847fe6016f6c83340ffb7b2c11388ebbed59edacbc3b-merged.mount: Deactivated successfully.
Feb 01 07:40:43 np0005604215.localdomain podman[31278]: 2026-02-01 07:40:43.773779355 +0000 UTC m=+0.067009168 container remove 0c42d580c452e4c25df87c3809ff3eed63e65bc71359a31d04a1e996cd523bce (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2-activate, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, release=1764794109, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main)
Feb 01 07:40:44 np0005604215.localdomain podman[31339]: 
Feb 01 07:40:44 np0005604215.localdomain podman[31339]: 2026-02-01 07:40:44.088843863 +0000 UTC m=+0.074962577 container create 37ee57bb9daae11ee2cb1908eabf1acb4c153c7a836b58f64ca530a5cf6b5994 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, version=7, ceph=True, io.openshift.tags=rhceph ceph, architecture=x86_64, name=rhceph, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4)
Feb 01 07:40:44 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4cc5983fa8a46038d31ddaeafcd1fd31857f0ea27c6925ecd079eb9ed30b3af/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:44 np0005604215.localdomain podman[31339]: 2026-02-01 07:40:44.059859247 +0000 UTC m=+0.045977971 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:44 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4cc5983fa8a46038d31ddaeafcd1fd31857f0ea27c6925ecd079eb9ed30b3af/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:44 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4cc5983fa8a46038d31ddaeafcd1fd31857f0ea27c6925ecd079eb9ed30b3af/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:44 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4cc5983fa8a46038d31ddaeafcd1fd31857f0ea27c6925ecd079eb9ed30b3af/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:44 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f4cc5983fa8a46038d31ddaeafcd1fd31857f0ea27c6925ecd079eb9ed30b3af/merged/var/lib/ceph/osd/ceph-2 supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:44 np0005604215.localdomain podman[31339]: 2026-02-01 07:40:44.202406072 +0000 UTC m=+0.188524746 container init 37ee57bb9daae11ee2cb1908eabf1acb4c153c7a836b58f64ca530a5cf6b5994 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2, com.redhat.component=rhceph-container, version=7, ceph=True, distribution-scope=public, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=)
Feb 01 07:40:44 np0005604215.localdomain podman[31339]: 2026-02-01 07:40:44.209109864 +0000 UTC m=+0.195228548 container start 37ee57bb9daae11ee2cb1908eabf1acb4c153c7a836b58f64ca530a5cf6b5994 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2, build-date=2025-12-08T17:28:53Z, name=rhceph, distribution-scope=public, RELEASE=main, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.component=rhceph-container, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 07:40:44 np0005604215.localdomain bash[31339]: 37ee57bb9daae11ee2cb1908eabf1acb4c153c7a836b58f64ca530a5cf6b5994
Feb 01 07:40:44 np0005604215.localdomain systemd[1]: Started Ceph osd.2 for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 01 07:40:44 np0005604215.localdomain sudo[30864]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: set uid:gid to 167:167 (ceph:ceph)
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: pidfile_write: ignore empty --pid-file
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d3180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d3180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d3180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d3180 /var/lib/ceph/osd/ceph-2/block) close
Feb 01 07:40:44 np0005604215.localdomain sudo[31370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:40:44 np0005604215.localdomain sudo[31370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:44 np0005604215.localdomain sudo[31370]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:44 np0005604215.localdomain sudo[31385]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 07:40:44 np0005604215.localdomain sudo[31385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) close
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: starting osd.2 osd_data /var/lib/ceph/osd/ceph-2 /var/lib/ceph/osd/ceph-2/journal
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: load: jerasure load: lrc 
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 01 07:40:44 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) close
Feb 01 07:40:44 np0005604215.localdomain podman[31451]: 
Feb 01 07:40:44 np0005604215.localdomain podman[31451]: 2026-02-01 07:40:44.984892133 +0000 UTC m=+0.074278123 container create 508d3ad3699183007ffd3c9f0db14b70315b73fc4eaeda0feb1b143690c2da6c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_wright, architecture=x86_64, build-date=2025-12-08T17:28:53Z, release=1764794109, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_CLEAN=True, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 07:40:45 np0005604215.localdomain systemd[1]: Started libpod-conmon-508d3ad3699183007ffd3c9f0db14b70315b73fc4eaeda0feb1b143690c2da6c.scope.
Feb 01 07:40:45 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:45 np0005604215.localdomain podman[31451]: 2026-02-01 07:40:45.050346246 +0000 UTC m=+0.139732246 container init 508d3ad3699183007ffd3c9f0db14b70315b73fc4eaeda0feb1b143690c2da6c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_wright, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, io.openshift.tags=rhceph ceph, RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container)
Feb 01 07:40:45 np0005604215.localdomain podman[31451]: 2026-02-01 07:40:44.95141417 +0000 UTC m=+0.040800190 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:45 np0005604215.localdomain podman[31451]: 2026-02-01 07:40:45.059242197 +0000 UTC m=+0.148628187 container start 508d3ad3699183007ffd3c9f0db14b70315b73fc4eaeda0feb1b143690c2da6c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_wright, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, RELEASE=main, build-date=2025-12-08T17:28:53Z, distribution-scope=public, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, name=rhceph, architecture=x86_64, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 07:40:45 np0005604215.localdomain podman[31451]: 2026-02-01 07:40:45.059573844 +0000 UTC m=+0.148959874 container attach 508d3ad3699183007ffd3c9f0db14b70315b73fc4eaeda0feb1b143690c2da6c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_wright, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, version=7, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, name=rhceph)
Feb 01 07:40:45 np0005604215.localdomain systemd[1]: libpod-508d3ad3699183007ffd3c9f0db14b70315b73fc4eaeda0feb1b143690c2da6c.scope: Deactivated successfully.
Feb 01 07:40:45 np0005604215.localdomain upbeat_wright[31467]: 167 167
Feb 01 07:40:45 np0005604215.localdomain podman[31451]: 2026-02-01 07:40:45.06364428 +0000 UTC m=+0.153030270 container died 508d3ad3699183007ffd3c9f0db14b70315b73fc4eaeda0feb1b143690c2da6c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_wright, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, version=7, io.openshift.expose-services=, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) close
Feb 01 07:40:45 np0005604215.localdomain podman[31472]: 2026-02-01 07:40:45.153878711 +0000 UTC m=+0.076175973 container remove 508d3ad3699183007ffd3c9f0db14b70315b73fc4eaeda0feb1b143690c2da6c (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=upbeat_wright, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, GIT_BRANCH=main, ceph=True, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc.)
Feb 01 07:40:45 np0005604215.localdomain systemd[1]: libpod-conmon-508d3ad3699183007ffd3c9f0db14b70315b73fc4eaeda0feb1b143690c2da6c.scope: Deactivated successfully.
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: osd.2:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d2e00 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d3180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d3180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d3180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluefs mount
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluefs mount shared_bdev_used = 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: RocksDB version: 7.9.2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Git sha 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Compile date 2025-09-23 00:00:00
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: DB SUMMARY
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: DB Session ID:  RCWH0G1AUIL75TA504WC
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: CURRENT file:  CURRENT
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: IDENTITY file:  IDENTITY
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                         Options.error_if_exists: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.create_if_missing: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                         Options.paranoid_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                                     Options.env: 0x55aabc466bd0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                                Options.info_log: 0x55aabd16c780
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_file_opening_threads: 16
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                              Options.statistics: (nil)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.use_fsync: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.max_log_file_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                         Options.allow_fallocate: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.use_direct_reads: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.create_missing_column_families: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                              Options.db_log_dir: 
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                                 Options.wal_dir: db.wal
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.advise_random_on_open: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.write_buffer_manager: 0x55aabc1bc140
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                            Options.rate_limiter: (nil)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.unordered_write: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.row_cache: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                              Options.wal_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.allow_ingest_behind: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.two_write_queues: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.manual_wal_flush: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.wal_compression: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.atomic_flush: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.log_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.allow_data_in_errors: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.db_host_id: __hostname__
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.max_background_jobs: 4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.max_background_compactions: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.max_subcompactions: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.max_open_files: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.bytes_per_sync: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.max_background_flushes: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Compression algorithms supported:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kZSTD supported: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kXpressCompression supported: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kBZip2Compression supported: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kLZ4Compression supported: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kZlibCompression supported: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kLZ4HCCompression supported: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kSnappyCompression supported: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd16c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: 
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd16c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd16c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd16c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd16c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd16c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd16c940)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd16cb60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd16cb60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd16cb60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 84d3485b-52e5-4ebd-8f93-e8edd0678d8b
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645406026, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645406363, "job": 1, "event": "recovery_finished"}
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old nid_max 1025
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta old blobid_max 10240
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _open_super_meta min_alloc_size 0x1000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: freelist init
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: freelist _read_cfg
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluefs umount
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d3180 /var/lib/ceph/osd/ceph-2/block) close
Feb 01 07:40:45 np0005604215.localdomain podman[31565]: 
Feb 01 07:40:45 np0005604215.localdomain podman[31565]: 2026-02-01 07:40:45.479911364 +0000 UTC m=+0.071748819 container create d2eea971760230ceda92e13a2782d5c9c2f96aa5d54ff1b485c939be8909cfde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate-test, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhceph, release=1764794109, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc.)
Feb 01 07:40:45 np0005604215.localdomain systemd[1]: Started libpod-conmon-d2eea971760230ceda92e13a2782d5c9c2f96aa5d54ff1b485c939be8909cfde.scope.
Feb 01 07:40:45 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:45 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d28d353abdd1b6ecd25446a446502c66ed96fbf47eec85309756a0f5b1fd5e2a/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:45 np0005604215.localdomain podman[31565]: 2026-02-01 07:40:45.45153377 +0000 UTC m=+0.043371235 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:45 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d28d353abdd1b6ecd25446a446502c66ed96fbf47eec85309756a0f5b1fd5e2a/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:45 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d28d353abdd1b6ecd25446a446502c66ed96fbf47eec85309756a0f5b1fd5e2a/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:45 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d28d353abdd1b6ecd25446a446502c66ed96fbf47eec85309756a0f5b1fd5e2a/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:45 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d28d353abdd1b6ecd25446a446502c66ed96fbf47eec85309756a0f5b1fd5e2a/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:45 np0005604215.localdomain podman[31565]: 2026-02-01 07:40:45.639307408 +0000 UTC m=+0.231144833 container init d2eea971760230ceda92e13a2782d5c9c2f96aa5d54ff1b485c939be8909cfde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate-test, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, name=rhceph, version=7, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container)
Feb 01 07:40:45 np0005604215.localdomain podman[31565]: 2026-02-01 07:40:45.650725011 +0000 UTC m=+0.242562466 container start d2eea971760230ceda92e13a2782d5c9c2f96aa5d54ff1b485c939be8909cfde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate-test, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, release=1764794109, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, RELEASE=main, version=7, vcs-type=git, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 07:40:45 np0005604215.localdomain podman[31565]: 2026-02-01 07:40:45.650992447 +0000 UTC m=+0.242829872 container attach d2eea971760230ceda92e13a2782d5c9c2f96aa5d54ff1b485c939be8909cfde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate-test, release=1764794109, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7, io.buildah.version=1.41.4, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, RELEASE=main, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d3180 /var/lib/ceph/osd/ceph-2/block) open path /var/lib/ceph/osd/ceph-2/block
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d3180 /var/lib/ceph/osd/ceph-2/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-2/block failed: (22) Invalid argument
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bdev(0x55aabc1d3180 /var/lib/ceph/osd/ceph-2/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-2/block size 7.0 GiB
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluefs mount
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluefs mount shared_bdev_used = 4718592
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: RocksDB version: 7.9.2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Git sha 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Compile date 2025-09-23 00:00:00
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: DB SUMMARY
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: DB Session ID:  RCWH0G1AUIL75TA504WD
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: CURRENT file:  CURRENT
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: IDENTITY file:  IDENTITY
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                         Options.error_if_exists: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.create_if_missing: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                         Options.paranoid_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                                     Options.env: 0x55aabc216460
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                                Options.info_log: 0x55aabd1fe380
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_file_opening_threads: 16
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                              Options.statistics: (nil)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.use_fsync: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.max_log_file_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                         Options.allow_fallocate: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.use_direct_reads: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.create_missing_column_families: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                              Options.db_log_dir: 
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                                 Options.wal_dir: db.wal
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.advise_random_on_open: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.write_buffer_manager: 0x55aabc1bd5e0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                            Options.rate_limiter: (nil)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.unordered_write: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.row_cache: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                              Options.wal_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.allow_ingest_behind: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.two_write_queues: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.manual_wal_flush: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.wal_compression: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.atomic_flush: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.log_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.allow_data_in_errors: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.db_host_id: __hostname__
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.max_background_jobs: 4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.max_background_compactions: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.max_subcompactions: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.max_open_files: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.bytes_per_sync: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.max_background_flushes: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Compression algorithms supported:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kZSTD supported: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kXpressCompression supported: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kBZip2Compression supported: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kLZ4Compression supported: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kZlibCompression supported: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kLZ4HCCompression supported: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         kSnappyCompression supported: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd1fe5e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: 
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd1fe5e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd1fe5e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd1fe5e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd1fe5e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd1fe5e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd1fe5e0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1aa2d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd1ff900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1ab610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd1ff900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1ab610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55aabd1ff900)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55aabc1ab610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 84d3485b-52e5-4ebd-8f93-e8edd0678d8b
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645691508, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645697618, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931645, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84d3485b-52e5-4ebd-8f93-e8edd0678d8b", "db_session_id": "RCWH0G1AUIL75TA504WD", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645701791, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931645, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84d3485b-52e5-4ebd-8f93-e8edd0678d8b", "db_session_id": "RCWH0G1AUIL75TA504WD", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645706026, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931645, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "84d3485b-52e5-4ebd-8f93-e8edd0678d8b", "db_session_id": "RCWH0G1AUIL75TA504WD", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931645710560, "job": 1, "event": "recovery_finished"}
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55aabd1b0700
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: DB pointer 0x55aabd0bda00
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super from 4, latest 4
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _upgrade_super done
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 460.80 MB usage: 1.39 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(2,0.72 KB,0.000152323%) FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: _get_class not permitted to load lua
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: _get_class not permitted to load sdk
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: _get_class not permitted to load test_remote_reads
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: osd.2 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: osd.2 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: osd.2 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: osd.2 0 load_pgs
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: osd.2 0 load_pgs opened 0 pgs
Feb 01 07:40:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-fac9935f95d33bcb68200c6471366f7f43990c0a87521344e46f17901a168b1d-merged.mount: Deactivated successfully.
Feb 01 07:40:45 np0005604215.localdomain ceph-osd[31357]: osd.2 0 log_to_monitors true
Feb 01 07:40:45 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2[31353]: 2026-02-01T07:40:45.752+0000 7fe9a603da80 -1 osd.2 0 log_to_monitors true
Feb 01 07:40:45 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate-test[31716]: usage: ceph-volume activate [-h] [--osd-id OSD_ID] [--osd-uuid OSD_UUID]
Feb 01 07:40:45 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate-test[31716]:                             [--no-systemd] [--no-tmpfs]
Feb 01 07:40:45 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate-test[31716]: ceph-volume activate: error: unrecognized arguments: --bad-option
Feb 01 07:40:45 np0005604215.localdomain systemd[1]: libpod-d2eea971760230ceda92e13a2782d5c9c2f96aa5d54ff1b485c939be8909cfde.scope: Deactivated successfully.
Feb 01 07:40:45 np0005604215.localdomain podman[31565]: 2026-02-01 07:40:45.88974824 +0000 UTC m=+0.481585705 container died d2eea971760230ceda92e13a2782d5c9c2f96aa5d54ff1b485c939be8909cfde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate-test, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, architecture=x86_64, RELEASE=main, distribution-scope=public, name=rhceph, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 07:40:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-d28d353abdd1b6ecd25446a446502c66ed96fbf47eec85309756a0f5b1fd5e2a-merged.mount: Deactivated successfully.
Feb 01 07:40:45 np0005604215.localdomain podman[31936]: 2026-02-01 07:40:45.959558657 +0000 UTC m=+0.061010671 container remove d2eea971760230ceda92e13a2782d5c9c2f96aa5d54ff1b485c939be8909cfde (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate-test, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, version=7, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, CEPH_POINT_RELEASE=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7)
Feb 01 07:40:45 np0005604215.localdomain systemd[1]: libpod-conmon-d2eea971760230ceda92e13a2782d5c9c2f96aa5d54ff1b485c939be8909cfde.scope: Deactivated successfully.
Feb 01 07:40:46 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:40:46 np0005604215.localdomain systemd-rc-local-generator[31992]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:40:46 np0005604215.localdomain systemd-sysv-generator[31997]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:40:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:40:46 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:40:46 np0005604215.localdomain systemd-sysv-generator[32036]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:40:46 np0005604215.localdomain systemd-rc-local-generator[32033]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:40:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:40:46 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Feb 01 07:40:46 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Feb 01 07:40:46 np0005604215.localdomain systemd[1]: Starting Ceph osd.5 for 33fac0b9-80c7-560f-918a-c92d3021ca1e...
Feb 01 07:40:46 np0005604215.localdomain ceph-osd[31357]: osd.2 0 done with init, starting boot process
Feb 01 07:40:46 np0005604215.localdomain ceph-osd[31357]: osd.2 0 start_boot
Feb 01 07:40:46 np0005604215.localdomain ceph-osd[31357]: osd.2 0 maybe_override_options_for_qos osd_max_backfills set to 1
Feb 01 07:40:46 np0005604215.localdomain ceph-osd[31357]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Feb 01 07:40:46 np0005604215.localdomain ceph-osd[31357]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Feb 01 07:40:46 np0005604215.localdomain ceph-osd[31357]: osd.2 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Feb 01 07:40:46 np0005604215.localdomain ceph-osd[31357]: osd.2 0  bench count 12288000 bsize 4 KiB
Feb 01 07:40:47 np0005604215.localdomain podman[32098]: 
Feb 01 07:40:47 np0005604215.localdomain podman[32098]: 2026-02-01 07:40:47.133301839 +0000 UTC m=+0.078241817 container create 36861a26c3f7b486f33da731f5a5ad899c89c0e4966f5188079bc31335796fe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, com.redhat.component=rhceph-container, RELEASE=main, ceph=True, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_CLEAN=True, io.openshift.expose-services=, distribution-scope=public)
Feb 01 07:40:47 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:47 np0005604215.localdomain podman[32098]: 2026-02-01 07:40:47.098935618 +0000 UTC m=+0.043875616 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:47 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e1801a4fc2bec716e4d074dd0377cf0c241b68e1f1a98e97d826cb022aeee0/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:47 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e1801a4fc2bec716e4d074dd0377cf0c241b68e1f1a98e97d826cb022aeee0/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:47 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e1801a4fc2bec716e4d074dd0377cf0c241b68e1f1a98e97d826cb022aeee0/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:47 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e1801a4fc2bec716e4d074dd0377cf0c241b68e1f1a98e97d826cb022aeee0/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:47 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47e1801a4fc2bec716e4d074dd0377cf0c241b68e1f1a98e97d826cb022aeee0/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:47 np0005604215.localdomain podman[32098]: 2026-02-01 07:40:47.271717256 +0000 UTC m=+0.216657234 container init 36861a26c3f7b486f33da731f5a5ad899c89c0e4966f5188079bc31335796fe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate, CEPH_POINT_RELEASE=, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, release=1764794109, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, vendor=Red Hat, Inc.)
Feb 01 07:40:47 np0005604215.localdomain podman[32098]: 2026-02-01 07:40:47.295556544 +0000 UTC m=+0.240496532 container start 36861a26c3f7b486f33da731f5a5ad899c89c0e4966f5188079bc31335796fe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.buildah.version=1.41.4)
Feb 01 07:40:47 np0005604215.localdomain podman[32098]: 2026-02-01 07:40:47.295937162 +0000 UTC m=+0.240877180 container attach 36861a26c3f7b486f33da731f5a5ad899c89c0e4966f5188079bc31335796fe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.buildah.version=1.41.4, vcs-type=git, CEPH_POINT_RELEASE=, name=rhceph)
Feb 01 07:40:47 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate[32112]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 01 07:40:47 np0005604215.localdomain bash[32098]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 01 07:40:47 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate[32112]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Feb 01 07:40:47 np0005604215.localdomain bash[32098]: Running command: /usr/bin/ceph-bluestore-tool prime-osd-dir --path /var/lib/ceph/osd/ceph-5 --no-mon-config --dev /dev/mapper/ceph_vg1-ceph_lv1
Feb 01 07:40:47 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate[32112]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Feb 01 07:40:47 np0005604215.localdomain bash[32098]: Running command: /usr/bin/chown -h ceph:ceph /dev/mapper/ceph_vg1-ceph_lv1
Feb 01 07:40:47 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate[32112]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 01 07:40:47 np0005604215.localdomain bash[32098]: Running command: /usr/bin/chown -R ceph:ceph /dev/dm-1
Feb 01 07:40:47 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate[32112]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:47 np0005604215.localdomain bash[32098]: Running command: /usr/bin/ln -s /dev/mapper/ceph_vg1-ceph_lv1 /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:47 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate[32112]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 01 07:40:47 np0005604215.localdomain bash[32098]: Running command: /usr/bin/chown -R ceph:ceph /var/lib/ceph/osd/ceph-5
Feb 01 07:40:47 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate[32112]: --> ceph-volume raw activate successful for osd ID: 5
Feb 01 07:40:47 np0005604215.localdomain bash[32098]: --> ceph-volume raw activate successful for osd ID: 5
Feb 01 07:40:47 np0005604215.localdomain systemd[1]: libpod-36861a26c3f7b486f33da731f5a5ad899c89c0e4966f5188079bc31335796fe2.scope: Deactivated successfully.
Feb 01 07:40:47 np0005604215.localdomain podman[32098]: 2026-02-01 07:40:47.98193296 +0000 UTC m=+0.926872938 container died 36861a26c3f7b486f33da731f5a5ad899c89c0e4966f5188079bc31335796fe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, vcs-type=git, ceph=True, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., architecture=x86_64, GIT_BRANCH=main, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 07:40:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-47e1801a4fc2bec716e4d074dd0377cf0c241b68e1f1a98e97d826cb022aeee0-merged.mount: Deactivated successfully.
Feb 01 07:40:48 np0005604215.localdomain podman[32239]: 2026-02-01 07:40:48.068420031 +0000 UTC m=+0.080179058 container remove 36861a26c3f7b486f33da731f5a5ad899c89c0e4966f5188079bc31335796fe2 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5-activate, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, architecture=x86_64, name=rhceph, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 07:40:48 np0005604215.localdomain podman[32300]: 
Feb 01 07:40:48 np0005604215.localdomain podman[32300]: 2026-02-01 07:40:48.413378466 +0000 UTC m=+0.079040363 container create 0892bf7f52004db1c7d54184bf179169b4e0add1d0e942ba56af5041d91a8912 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, architecture=x86_64, version=7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Feb 01 07:40:48 np0005604215.localdomain podman[32300]: 2026-02-01 07:40:48.385595555 +0000 UTC m=+0.051257482 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/683526f88687e73b27941c5962cadab4561ff25659e6c5d067f7f9f43e332ea5/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/683526f88687e73b27941c5962cadab4561ff25659e6c5d067f7f9f43e332ea5/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/683526f88687e73b27941c5962cadab4561ff25659e6c5d067f7f9f43e332ea5/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/683526f88687e73b27941c5962cadab4561ff25659e6c5d067f7f9f43e332ea5/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/683526f88687e73b27941c5962cadab4561ff25659e6c5d067f7f9f43e332ea5/merged/var/lib/ceph/osd/ceph-5 supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:48 np0005604215.localdomain podman[32300]: 2026-02-01 07:40:48.563782599 +0000 UTC m=+0.229444496 container init 0892bf7f52004db1c7d54184bf179169b4e0add1d0e942ba56af5041d91a8912 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 07:40:48 np0005604215.localdomain systemd[1]: tmp-crun.57R3hN.mount: Deactivated successfully.
Feb 01 07:40:48 np0005604215.localdomain podman[32300]: 2026-02-01 07:40:48.575864156 +0000 UTC m=+0.241526073 container start 0892bf7f52004db1c7d54184bf179169b4e0add1d0e942ba56af5041d91a8912 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, version=7, release=1764794109, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, ceph=True, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 07:40:48 np0005604215.localdomain bash[32300]: 0892bf7f52004db1c7d54184bf179169b4e0add1d0e942ba56af5041d91a8912
Feb 01 07:40:48 np0005604215.localdomain systemd[1]: Started Ceph osd.5 for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: set uid:gid to 167:167 (ceph:ceph)
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-osd, pid 2
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: pidfile_write: ignore empty --pid-file
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee1180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee1180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee1180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee1180 /var/lib/ceph/osd/ceph-5/block) close
Feb 01 07:40:48 np0005604215.localdomain sudo[31385]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:48 np0005604215.localdomain sudo[32331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:40:48 np0005604215.localdomain sudo[32331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:48 np0005604215.localdomain sudo[32331]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:48 np0005604215.localdomain sudo[32346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e -- raw list --format json
Feb 01 07:40:48 np0005604215.localdomain sudo[32346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:48 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) close
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: starting osd.5 osd_data /var/lib/ceph/osd/ceph-5 /var/lib/ceph/osd/ceph-5/journal
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: load: jerasure load: lrc 
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) close
Feb 01 07:40:49 np0005604215.localdomain podman[32408]: 
Feb 01 07:40:49 np0005604215.localdomain podman[32408]: 2026-02-01 07:40:49.402518298 +0000 UTC m=+0.072171668 container create bc86a720f94d15711eaf46c54d89af48202919088b7b2e02e38bf55deab9fb62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_chaplygin, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-type=git, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, architecture=x86_64, RELEASE=main, distribution-scope=public, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, version=7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) close
Feb 01 07:40:49 np0005604215.localdomain systemd[1]: Started libpod-conmon-bc86a720f94d15711eaf46c54d89af48202919088b7b2e02e38bf55deab9fb62.scope.
Feb 01 07:40:49 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:49 np0005604215.localdomain podman[32408]: 2026-02-01 07:40:49.371767393 +0000 UTC m=+0.041420793 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:49 np0005604215.localdomain podman[32408]: 2026-02-01 07:40:49.516427934 +0000 UTC m=+0.186081304 container init bc86a720f94d15711eaf46c54d89af48202919088b7b2e02e38bf55deab9fb62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_chaplygin, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, architecture=x86_64, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 07:40:49 np0005604215.localdomain systemd[1]: tmp-crun.uSw8jT.mount: Deactivated successfully.
Feb 01 07:40:49 np0005604215.localdomain podman[32408]: 2026-02-01 07:40:49.53363934 +0000 UTC m=+0.203292730 container start bc86a720f94d15711eaf46c54d89af48202919088b7b2e02e38bf55deab9fb62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_chaplygin, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109)
Feb 01 07:40:49 np0005604215.localdomain podman[32408]: 2026-02-01 07:40:49.534032318 +0000 UTC m=+0.203685688 container attach bc86a720f94d15711eaf46c54d89af48202919088b7b2e02e38bf55deab9fb62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_chaplygin, io.buildah.version=1.41.4, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, release=1764794109, name=rhceph, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, RELEASE=main, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 07:40:49 np0005604215.localdomain epic_chaplygin[32428]: 167 167
Feb 01 07:40:49 np0005604215.localdomain systemd[1]: libpod-bc86a720f94d15711eaf46c54d89af48202919088b7b2e02e38bf55deab9fb62.scope: Deactivated successfully.
Feb 01 07:40:49 np0005604215.localdomain podman[32408]: 2026-02-01 07:40:49.538907962 +0000 UTC m=+0.208561332 container died bc86a720f94d15711eaf46c54d89af48202919088b7b2e02e38bf55deab9fb62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_chaplygin, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, distribution-scope=public, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, name=rhceph, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 07:40:49 np0005604215.localdomain podman[32433]: 2026-02-01 07:40:49.642880176 +0000 UTC m=+0.096048017 container remove bc86a720f94d15711eaf46c54d89af48202919088b7b2e02e38bf55deab9fb62 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=epic_chaplygin, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, vcs-type=git, io.buildah.version=1.41.4, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, build-date=2025-12-08T17:28:53Z, release=1764794109, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main)
Feb 01 07:40:49 np0005604215.localdomain systemd[1]: libpod-conmon-bc86a720f94d15711eaf46c54d89af48202919088b7b2e02e38bf55deab9fb62.scope: Deactivated successfully.
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: mClockScheduler: set_osd_capacity_params_from_config: osd_bandwidth_cost_per_io: 499321.90 bytes/io, osd_bandwidth_capacity_per_shard 157286400.00 bytes/second
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: osd.5:0.OSDShard using op scheduler mclock_scheduler, cutoff=196
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee0e00 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _set_cache_sizes cache_size 1073741824 meta 0.45 kv 0.45 data 0.06
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee1180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee1180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee1180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluefs mount
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluefs mount shared_bdev_used = 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: RocksDB version: 7.9.2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Git sha 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Compile date 2025-09-23 00:00:00
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: DB SUMMARY
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: DB Session ID:  6H3HWCHWKTA3X6X83VNN
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: CURRENT file:  CURRENT
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: IDENTITY file:  IDENTITY
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                         Options.error_if_exists: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.create_if_missing: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                         Options.paranoid_checks: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                                     Options.env: 0x55797f174c40
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                                Options.info_log: 0x55797fe7c900
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_file_opening_threads: 16
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                              Options.statistics: (nil)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.use_fsync: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.max_log_file_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                         Options.allow_fallocate: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.use_direct_reads: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.create_missing_column_families: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                              Options.db_log_dir: 
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                                 Options.wal_dir: db.wal
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.advise_random_on_open: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.write_buffer_manager: 0x55797eeca140
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                            Options.rate_limiter: (nil)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.unordered_write: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.row_cache: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                              Options.wal_filter: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.allow_ingest_behind: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.two_write_queues: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.manual_wal_flush: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.wal_compression: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.atomic_flush: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.log_readahead_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.allow_data_in_errors: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.db_host_id: __hostname__
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.max_background_jobs: 4
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.max_background_compactions: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.max_subcompactions: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.max_open_files: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.bytes_per_sync: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.max_background_flushes: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Compression algorithms supported:
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kZSTD supported: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kXpressCompression supported: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kBZip2Compression supported: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kLZ4Compression supported: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kZlibCompression supported: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kLZ4HCCompression supported: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kSnappyCompression supported: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_readonly.cc:25] Opening the db in read only mode
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7cac0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb8850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: 
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7cac0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb8850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7cac0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb8850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7cac0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb8850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7cac0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb8850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7cac0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb8850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7cac0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb8850
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[31357]: osd.2 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 24.069 iops: 6161.642 elapsed_sec: 0.487
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [WRN] : OSD bench result of 6161.642331 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.2. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[31357]: osd.2 0 waiting for initial osdmap
Feb 01 07:40:49 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2[31353]: 2026-02-01T07:40:49.742+0000 7fe9a1fbc640 -1 osd.2 0 waiting for initial osdmap
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7cce0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb82d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7cce0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb82d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7cce0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb82d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 3a89ed51-aa97-4c8c-8741-d02f18e7055a
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931649733052, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931649733299, "job": 1, "event": "recovery_finished"}
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old nid_max 1025
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta old blobid_max 10240
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta ondisk_format 4 compat_ondisk_format 3
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _open_super_meta min_alloc_size 0x1000
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: freelist init
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: freelist _read_cfg
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _init_alloc loaded 7.0 GiB in 2 extents, allocator type hybrid, capacity 0x1bfc00000, block size 0x1000, free 0x1bfbfd000, fragmentation 5.5e-07
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:496] Shutdown: canceling all background work
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:704] Shutdown complete
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bluefs umount
Feb 01 07:40:49 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee1180 /var/lib/ceph/osd/ceph-5/block) close
Feb 01 07:40:49 np0005604215.localdomain podman[32647]: 
Feb 01 07:40:49 np0005604215.localdomain podman[32647]: 2026-02-01 07:40:49.819791393 +0000 UTC m=+0.042557287 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[31357]: osd.2 11 crush map has features 288514050185494528, adjusting msgr requires for clients
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[31357]: osd.2 11 crush map has features 288514050185494528 was 288232575208792577, adjusting msgr requires for mons
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[31357]: osd.2 11 crush map has features 3314932999778484224, adjusting msgr requires for osds
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[31357]: osd.2 11 check_osdmap_features require_osd_release unknown -> reef
Feb 01 07:40:50 np0005604215.localdomain podman[32647]: 2026-02-01 07:40:50.15454899 +0000 UTC m=+0.377314914 container create f59b1a2b2f90ec887f5cf9241f24b3e26570eb6a068387300a8a262f58be7674 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heyrovsky, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, vcs-type=git, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 07:40:50 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-2[31353]: 2026-02-01T07:40:50.160+0000 7fe99d5e6640 -1 osd.2 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[31357]: osd.2 11 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[31357]: osd.2 11 set_numa_affinity not setting numa affinity
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[31357]: osd.2 11 _collect_metadata loop3:  no unique device id for loop3: fallback method has no model nor serial
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee1180 /var/lib/ceph/osd/ceph-5/block) open path /var/lib/ceph/osd/ceph-5/block
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee1180 /var/lib/ceph/osd/ceph-5/block) ioctl(F_SET_FILE_RW_HINT) on /var/lib/ceph/osd/ceph-5/block failed: (22) Invalid argument
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: bdev(0x55797eee1180 /var/lib/ceph/osd/ceph-5/block) open size 7511998464 (0x1bfc00000, 7.0 GiB) block_size 4096 (4 KiB) rotational device, discard supported
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: bluefs add_block_device bdev 1 path /var/lib/ceph/osd/ceph-5/block size 7.0 GiB
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: bluefs mount
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: bluefs _init_alloc shared, id 1, capacity 0x1bfc00000, block size 0x10000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: bluefs mount shared_bdev_used = 4718592
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _prepare_db_environment set db_paths to db,7136398540 db.slow,7136398540
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: RocksDB version: 7.9.2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Git sha 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Compile date 2025-09-23 00:00:00
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: DB SUMMARY
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: DB Session ID:  6H3HWCHWKTA3X6X83VNM
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: CURRENT file:  CURRENT
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: IDENTITY file:  IDENTITY
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: MANIFEST file:  MANIFEST-000032 size: 1007 Bytes
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: SST files in db dir, Total Num: 1, files: 000030.sst 
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: SST files in db.slow dir, Total Num: 0, files: 
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Write Ahead Log file in db.wal: 000031.log size: 5093 ; 
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                         Options.error_if_exists: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.create_if_missing: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                         Options.paranoid_checks: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                                     Options.env: 0x55797f006310
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                                      Options.fs: LegacyFileSystem
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                                Options.info_log: 0x55797ef79320
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_file_opening_threads: 16
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                              Options.statistics: (nil)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.use_fsync: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.max_log_file_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                         Options.allow_fallocate: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.use_direct_reads: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.create_missing_column_families: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                              Options.db_log_dir: 
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                                 Options.wal_dir: db.wal
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.advise_random_on_open: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.write_buffer_manager: 0x55797eecb540
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                            Options.rate_limiter: (nil)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.unordered_write: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.row_cache: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                              Options.wal_filter: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.allow_ingest_behind: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.two_write_queues: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.manual_wal_flush: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.wal_compression: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.atomic_flush: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.log_readahead_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.allow_data_in_errors: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.db_host_id: __hostname__
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.max_background_jobs: 4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.max_background_compactions: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.max_subcompactions: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.writable_file_max_buffer_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.max_total_wal_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.max_open_files: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.bytes_per_sync: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.compaction_readahead_size: 2097152
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.max_background_flushes: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Compression algorithms supported:
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kZSTD supported: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kXpressCompression supported: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kBZip2Compression supported: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kLZ4Compression supported: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kZlibCompression supported: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kLZ4HCCompression supported: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         kSnappyCompression supported: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: db/MANIFEST-000032
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 0, name: default)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: .T:int64_array.b:bitwise_xor
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7dd60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb82d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: 
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 1, name: m-0)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-0]:
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7dd60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb82d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 2, name: m-1)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-1]:
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7dd60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb82d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 3, name: m-2)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [m-2]:
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7dd60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb82d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain systemd[1]: Started libpod-conmon-f59b1a2b2f90ec887f5cf9241f24b3e26570eb6a068387300a8a262f58be7674.scope.
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 4, name: p-0)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-0]:
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7dd60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb82d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 5, name: p-1)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-1]:
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7dd60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb82d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 6, name: p-2)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [p-2]:
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7dd60)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb82d0
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 483183820
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 7, name: O-0)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-0]:
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7dfa0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb9610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 8, name: O-1)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-1]:
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7dfa0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb9610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 9, name: O-2)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [O-2]:
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:           Options.merge_operator: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x55797fe7dfa0)
                                                            cache_index_and_filter_blocks: 1
                                                            cache_index_and_filter_blocks_with_high_priority: 0
                                                            pin_l0_filter_and_index_blocks_in_cache: 0
                                                            pin_top_level_index_and_filter: 1
                                                            index_type: 0
                                                            data_block_index_type: 0
                                                            index_shortening: 1
                                                            data_block_hash_table_util_ratio: 0.750000
                                                            checksum: 4
                                                            no_block_cache: 0
                                                            block_cache: 0x55797eeb9610
                                                            block_cache_name: BinnedLRUCache
                                                            block_cache_options:
                                                              capacity : 536870912
                                                              num_shard_bits : 4
                                                              strict_capacity_limit : 0
                                                              high_pri_pool_ratio: 0.000
                                                            block_cache_compressed: (nil)
                                                            persistent_cache: (nil)
                                                            block_size: 4096
                                                            block_size_deviation: 10
                                                            block_restart_interval: 16
                                                            index_block_restart_interval: 1
                                                            metadata_block_size: 4096
                                                            partition_filters: 0
                                                            use_delta_encoding: 1
                                                            filter_policy: bloomfilter
                                                            whole_key_filtering: 1
                                                            verify_compression: 0
                                                            read_amp_bytes_per_bit: 0
                                                            format_version: 5
                                                            enable_index_compression: 1
                                                            block_align: 0
                                                            max_auto_readahead_size: 262144
                                                            prepopulate_block_cache: 0
                                                            initial_auto_readahead_size: 8192
                                                            num_file_reads_for_auto_readahead: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.write_buffer_size: 16777216
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.max_write_buffer_number: 64
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.compression: LZ4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.num_levels: 7
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:        Options.min_write_buffer_number_to_merge: 6
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.level0_file_num_compaction_trigger: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.max_bytes_for_level_base: 1073741824
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.level_compaction_dynamic_level_bytes: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.max_bytes_for_level_multiplier: 8.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 07:40:50 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.table_properties_collectors: CompactOnDeletionCollector (Sliding window size = 32768 Deletion trigger = 16384 Deletion ratio = 0);
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.bloom_locality: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                               Options.ttl: 2592000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                       Options.enable_blob_files: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                           Options.min_blob_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 10, name: L)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:578] Failed to register data paths of column family (id: 11, name: P)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/column_family.cc:635]         (skipping printing options)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:db/MANIFEST-000032 succeeded,manifest_file_number is 32, next_file_number is 34, last_sequence is 12, log_number is 5,prev_log_number is 0,max_column_family is 11,min_log_number_to_keep is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [m-0] (ID 1), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [m-1] (ID 2), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [m-2] (ID 3), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [p-0] (ID 4), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [p-1] (ID 5), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [p-2] (ID 6), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [O-0] (ID 7), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [O-1] (ID 8), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [O-2] (ID 9), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [L] (ID 10), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5581] Column family [P] (ID 11), log number is 5
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: 3a89ed51-aa97-4c8c-8741-d02f18e7055a
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931650213332, "job": 1, "event": "recovery_started", "wal_files": [31]}
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #31 mode 2
Feb 01 07:40:50 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b62613d8e1ec585adf6ae8ec7e3389610f689ed50fffcc6732ab2e4b96caeee8/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931650224819, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 35, "file_size": 1261, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13, "largest_seqno": 21, "table_properties": {"data_size": 128, "index_size": 27, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 87, "raw_average_key_size": 17, "raw_value_size": 82, "raw_average_value_size": 16, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 2, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": ".T:int64_array.b:bitwise_xor", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931650, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3a89ed51-aa97-4c8c-8741-d02f18e7055a", "db_session_id": "6H3HWCHWKTA3X6X83VNM", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 01 07:40:50 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b62613d8e1ec585adf6ae8ec7e3389610f689ed50fffcc6732ab2e4b96caeee8/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931650232203, "cf_name": "p-0", "job": 1, "event": "table_file_creation", "file_number": 36, "file_size": 1609, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14, "largest_seqno": 15, "table_properties": {"data_size": 468, "index_size": 39, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 72, "raw_average_key_size": 36, "raw_value_size": 567, "raw_average_value_size": 283, "num_data_blocks": 1, "num_entries": 2, "num_filter_entries": 2, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "p-0", "column_family_id": 4, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931650, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3a89ed51-aa97-4c8c-8741-d02f18e7055a", "db_session_id": "6H3HWCHWKTA3X6X83VNM", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931650236656, "cf_name": "O-2", "job": 1, "event": "table_file_creation", "file_number": 37, "file_size": 1290, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16, "largest_seqno": 16, "table_properties": {"data_size": 121, "index_size": 64, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 55, "raw_average_key_size": 55, "raw_value_size": 50, "raw_average_value_size": 50, "num_data_blocks": 1, "num_entries": 1, "num_filter_entries": 1, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "O-2", "column_family_id": 9, "comparator": "leveldb.BytewiseComparator", "merge_operator": "nullptr", "prefix_extractor_name": "nullptr", "property_collectors": "[CompactOnDeletionCollector]", "compression": "LZ4", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769931650, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "3a89ed51-aa97-4c8c-8741-d02f18e7055a", "db_session_id": "6H3HWCHWKTA3X6X83VNM", "orig_file_number": 37, "seqno_to_time_mapping": "N/A"}}
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_open.cc:1432] Failed to truncate log #31: IO error: No such file or directory: While open a file for appending: db.wal/000031.log: No such file or directory
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769931650239931, "job": 1, "event": "recovery_finished"}
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/version_set.cc:5047] Creating manifest 40
Feb 01 07:40:50 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b62613d8e1ec585adf6ae8ec7e3389610f689ed50fffcc6732ab2e4b96caeee8/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:50 np0005604215.localdomain podman[32647]: 2026-02-01 07:40:50.24465484 +0000 UTC m=+0.467420764 container init f59b1a2b2f90ec887f5cf9241f24b3e26570eb6a068387300a8a262f58be7674 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heyrovsky, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-type=git, version=7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., RELEASE=main, architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 07:40:50 np0005604215.localdomain podman[32647]: 2026-02-01 07:40:50.255423338 +0000 UTC m=+0.478189222 container start f59b1a2b2f90ec887f5cf9241f24b3e26570eb6a068387300a8a262f58be7674 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heyrovsky, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, vcs-type=git, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_BRANCH=main, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 07:40:50 np0005604215.localdomain podman[32647]: 2026-02-01 07:40:50.255635804 +0000 UTC m=+0.478401748 container attach f59b1a2b2f90ec887f5cf9241f24b3e26570eb6a068387300a8a262f58be7674 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heyrovsky, ceph=True, distribution-scope=public, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, build-date=2025-12-08T17:28:53Z, release=1764794109, vendor=Red Hat, Inc., version=7)
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x55797ef80380
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: DB pointer 0x55797fdd9a00
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _open_db opened rocksdb path db options compression=kLZ4Compression,max_write_buffer_number=64,min_write_buffer_number_to_merge=6,compaction_style=kCompactionStyleLevel,write_buffer_size=16777216,max_background_jobs=4,level0_file_num_compaction_trigger=8,max_bytes_for_level_base=1073741824,max_bytes_for_level_multiplier=8,compaction_readahead_size=2MB,max_total_wal_size=1073741824,writable_file_max_buffer_size=0
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super from 4, latest 4
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _upgrade_super done
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                          Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.02 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 512.00 MB usage: 0.25 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.14 KB,2.68221e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 0.1 total, 0.1 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 460.80 MB usage: 0.67 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 8 last_secs: 2.8e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): FilterBlock(3,0.33 KB,6.95388e-05%) IndexBlock(3,0.34 KB,7.28501e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/cephfs/cls_cephfs.cc:201: loading cephfs
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: <cls> /builddir/build/BUILD/ceph-18.2.1/src/cls/hello/cls_hello.cc:316: loading cls_hello
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: _get_class not permitted to load lua
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: _get_class not permitted to load sdk
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: _get_class not permitted to load test_remote_reads
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for clients
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: osd.5 0 crush map has features 288232575208783872 was 8705, adjusting msgr requires for mons
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: osd.5 0 crush map has features 288232575208783872, adjusting msgr requires for osds
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: osd.5 0 check_osdmap_features enabling on-disk ERASURE CODES compat feature
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: osd.5 0 load_pgs
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: osd.5 0 load_pgs opened 0 pgs
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[32318]: osd.5 0 log_to_monitors true
Feb 01 07:40:50 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5[32314]: 2026-02-01T07:40:50.276+0000 7f0ff47d0a80 -1 osd.5 0 log_to_monitors true
Feb 01 07:40:50 np0005604215.localdomain systemd[1]: tmp-crun.amLOiP.mount: Deactivated successfully.
Feb 01 07:40:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-66c3020dd7ec370c750e92e3346cf8a35f6796a895fc8a0f83255e23cf8dbfd3-merged.mount: Deactivated successfully.
Feb 01 07:40:50 np0005604215.localdomain ceph-osd[31357]: osd.2 11 tick checking mon for new map
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]: {
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:     "91738c8a-fd02-4668-b2ac-8ebbd36126da": {
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:         "ceph_fsid": "33fac0b9-80c7-560f-918a-c92d3021ca1e",
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:         "device": "/dev/mapper/ceph_vg0-ceph_lv0",
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:         "osd_id": 2,
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:         "osd_uuid": "91738c8a-fd02-4668-b2ac-8ebbd36126da",
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:         "type": "bluestore"
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:     },
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:     "dc0298a4-c2cb-4512-baf8-45dcc8aa1439": {
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:         "ceph_fsid": "33fac0b9-80c7-560f-918a-c92d3021ca1e",
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:         "device": "/dev/mapper/ceph_vg1-ceph_lv1",
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:         "osd_id": 5,
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:         "osd_uuid": "dc0298a4-c2cb-4512-baf8-45dcc8aa1439",
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:         "type": "bluestore"
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]:     }
Feb 01 07:40:50 np0005604215.localdomain relaxed_heyrovsky[32695]: }
Feb 01 07:40:50 np0005604215.localdomain systemd[1]: libpod-f59b1a2b2f90ec887f5cf9241f24b3e26570eb6a068387300a8a262f58be7674.scope: Deactivated successfully.
Feb 01 07:40:50 np0005604215.localdomain podman[32647]: 2026-02-01 07:40:50.910367124 +0000 UTC m=+1.133133088 container died f59b1a2b2f90ec887f5cf9241f24b3e26570eb6a068387300a8a262f58be7674 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heyrovsky, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, RELEASE=main, vcs-type=git, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, architecture=x86_64, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 07:40:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b62613d8e1ec585adf6ae8ec7e3389610f689ed50fffcc6732ab2e4b96caeee8-merged.mount: Deactivated successfully.
Feb 01 07:40:51 np0005604215.localdomain podman[32915]: 2026-02-01 07:40:51.013694345 +0000 UTC m=+0.090298854 container remove f59b1a2b2f90ec887f5cf9241f24b3e26570eb6a068387300a8a262f58be7674 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_heyrovsky, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, release=1764794109, name=rhceph, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 07:40:51 np0005604215.localdomain systemd[1]: libpod-conmon-f59b1a2b2f90ec887f5cf9241f24b3e26570eb6a068387300a8a262f58be7674.scope: Deactivated successfully.
Feb 01 07:40:51 np0005604215.localdomain sudo[32346]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:51 np0005604215.localdomain ceph-osd[31357]: osd.2 13 state: booting -> active
Feb 01 07:40:51 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : purged_snaps scrub starts
Feb 01 07:40:51 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : purged_snaps scrub ok
Feb 01 07:40:51 np0005604215.localdomain sudo[32930]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:40:51 np0005604215.localdomain sudo[32930]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:51 np0005604215.localdomain sudo[32930]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:51 np0005604215.localdomain sudo[32945]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:40:51 np0005604215.localdomain sudo[32945]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:51 np0005604215.localdomain sudo[32945]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:51 np0005604215.localdomain sudo[32960]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 07:40:51 np0005604215.localdomain sudo[32960]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:52 np0005604215.localdomain ceph-osd[32318]: osd.5 0 done with init, starting boot process
Feb 01 07:40:52 np0005604215.localdomain ceph-osd[32318]: osd.5 0 start_boot
Feb 01 07:40:52 np0005604215.localdomain ceph-osd[32318]: osd.5 0 maybe_override_options_for_qos osd_max_backfills set to 1
Feb 01 07:40:52 np0005604215.localdomain ceph-osd[32318]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active set to 0
Feb 01 07:40:52 np0005604215.localdomain ceph-osd[32318]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_hdd set to 3
Feb 01 07:40:52 np0005604215.localdomain ceph-osd[32318]: osd.5 0 maybe_override_options_for_qos osd_recovery_max_active_ssd set to 10
Feb 01 07:40:52 np0005604215.localdomain ceph-osd[32318]: osd.5 0  bench count 12288000 bsize 4 KiB
Feb 01 07:40:52 np0005604215.localdomain systemd[1]: tmp-crun.CTMiWs.mount: Deactivated successfully.
Feb 01 07:40:52 np0005604215.localdomain podman[33047]: 2026-02-01 07:40:52.53561403 +0000 UTC m=+0.080743719 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.openshift.expose-services=, GIT_CLEAN=True, ceph=True, release=1764794109, name=rhceph, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 07:40:52 np0005604215.localdomain podman[33047]: 2026-02-01 07:40:52.676524532 +0000 UTC m=+0.221654151 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, release=1764794109, name=rhceph, description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main)
Feb 01 07:40:52 np0005604215.localdomain sudo[32960]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:53 np0005604215.localdomain sudo[33108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:40:53 np0005604215.localdomain sudo[33108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:53 np0005604215.localdomain sudo[33108]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:53 np0005604215.localdomain sudo[33123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:40:53 np0005604215.localdomain sudo[33123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:53 np0005604215.localdomain ceph-osd[31357]: osd.2 15 crush map has features 288514051259236352, adjusting msgr requires for clients
Feb 01 07:40:53 np0005604215.localdomain ceph-osd[31357]: osd.2 15 crush map has features 288514051259236352 was 288514050185503233, adjusting msgr requires for mons
Feb 01 07:40:53 np0005604215.localdomain ceph-osd[31357]: osd.2 15 crush map has features 3314933000852226048, adjusting msgr requires for osds
Feb 01 07:40:53 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 15 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=15) [2] r=0 lpr=15 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:40:53 np0005604215.localdomain sudo[33123]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:54 np0005604215.localdomain sudo[33167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:40:54 np0005604215.localdomain sudo[33167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:54 np0005604215.localdomain sudo[33167]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:54 np0005604215.localdomain sudo[33182]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e -- inventory --format=json-pretty --filter-for-batch
Feb 01 07:40:54 np0005604215.localdomain sudo[33182]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 16 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=16) [2,4,3] r=0 lpr=16 pi=[15,16)/0 crt=0'0 mlcod 0'0 unknown mbc={}] start_peering_interval up [2] -> [2,4,3], acting [2] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 16 pg[1.0( empty local-lis/les=0/0 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=16) [2,4,3] r=0 lpr=16 pi=[15,16)/0 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[32318]: osd.5 0 maybe_override_max_osd_capacity_for_qos osd bench result - bandwidth (MiB/sec): 29.832 iops: 7636.984 elapsed_sec: 0.393
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [WRN] : OSD bench result of 7636.984107 IOPS is not within the threshold limit range of 50.000000 IOPS and 500.000000 IOPS for osd.5. IOPS capacity is unchanged at 315.000000 IOPS. The recommendation is to establish the osd's IOPS capacity using other benchmark tools (e.g. Fio) and then override osd_mclock_max_capacity_iops_[hdd|ssd].
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[32318]: osd.5 0 waiting for initial osdmap
Feb 01 07:40:54 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5[32314]: 2026-02-01T07:40:54.694+0000 7f0ff0f64640 -1 osd.5 0 waiting for initial osdmap
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[32318]: osd.5 16 crush map has features 288514051259236352, adjusting msgr requires for clients
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[32318]: osd.5 16 crush map has features 288514051259236352 was 288232575208792577, adjusting msgr requires for mons
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[32318]: osd.5 16 crush map has features 3314933000852226048, adjusting msgr requires for osds
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[32318]: osd.5 16 check_osdmap_features require_osd_release unknown -> reef
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[32318]: osd.5 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 01 07:40:54 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-osd-5[32314]: 2026-02-01T07:40:54.715+0000 7f0febd79640 -1 osd.5 16 set_numa_affinity unable to identify public interface '' numa node: (2) No such file or directory
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[32318]: osd.5 16 set_numa_affinity not setting numa affinity
Feb 01 07:40:54 np0005604215.localdomain ceph-osd[32318]: osd.5 16 _collect_metadata loop4:  no unique device id for loop4: fallback method has no model nor serial
Feb 01 07:40:54 np0005604215.localdomain podman[33235]: 
Feb 01 07:40:54 np0005604215.localdomain podman[33235]: 2026-02-01 07:40:54.774896592 +0000 UTC m=+0.068435468 container create 5363a64e012e0ca14c34f02c953828773e1341d05894395587ab28312dcd1cab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_bassi, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, vendor=Red Hat, Inc.)
Feb 01 07:40:54 np0005604215.localdomain systemd[1]: Started libpod-conmon-5363a64e012e0ca14c34f02c953828773e1341d05894395587ab28312dcd1cab.scope.
Feb 01 07:40:54 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:54 np0005604215.localdomain podman[33235]: 2026-02-01 07:40:54.840047189 +0000 UTC m=+0.133586065 container init 5363a64e012e0ca14c34f02c953828773e1341d05894395587ab28312dcd1cab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_bassi, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., version=7, release=1764794109, ceph=True, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 07:40:54 np0005604215.localdomain systemd[1]: tmp-crun.FZCXIA.mount: Deactivated successfully.
Feb 01 07:40:54 np0005604215.localdomain podman[33235]: 2026-02-01 07:40:54.756418039 +0000 UTC m=+0.049956935 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:54 np0005604215.localdomain podman[33235]: 2026-02-01 07:40:54.85933978 +0000 UTC m=+0.152878626 container start 5363a64e012e0ca14c34f02c953828773e1341d05894395587ab28312dcd1cab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_bassi, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, RELEASE=main, com.redhat.component=rhceph-container, ceph=True, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z)
Feb 01 07:40:54 np0005604215.localdomain podman[33235]: 2026-02-01 07:40:54.859699278 +0000 UTC m=+0.153238204 container attach 5363a64e012e0ca14c34f02c953828773e1341d05894395587ab28312dcd1cab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_bassi, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, RELEASE=main, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., architecture=x86_64, release=1764794109, CEPH_POINT_RELEASE=, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 07:40:54 np0005604215.localdomain modest_bassi[33252]: 167 167
Feb 01 07:40:54 np0005604215.localdomain systemd[1]: libpod-5363a64e012e0ca14c34f02c953828773e1341d05894395587ab28312dcd1cab.scope: Deactivated successfully.
Feb 01 07:40:54 np0005604215.localdomain podman[33235]: 2026-02-01 07:40:54.863183502 +0000 UTC m=+0.156722378 container died 5363a64e012e0ca14c34f02c953828773e1341d05894395587ab28312dcd1cab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_bassi, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, release=1764794109, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph)
Feb 01 07:40:54 np0005604215.localdomain podman[33257]: 2026-02-01 07:40:54.961829742 +0000 UTC m=+0.084907299 container remove 5363a64e012e0ca14c34f02c953828773e1341d05894395587ab28312dcd1cab (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=modest_bassi, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-type=git, release=1764794109, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 07:40:54 np0005604215.localdomain systemd[1]: libpod-conmon-5363a64e012e0ca14c34f02c953828773e1341d05894395587ab28312dcd1cab.scope: Deactivated successfully.
Feb 01 07:40:55 np0005604215.localdomain podman[33278]: 
Feb 01 07:40:55 np0005604215.localdomain podman[33278]: 2026-02-01 07:40:55.159695576 +0000 UTC m=+0.071228508 container create 47631966979b6d34c4a3bf2abc3abca3a62b8a007e119dfc302fbe42fe8c20a9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_hermann, name=rhceph, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, release=1764794109, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Feb 01 07:40:55 np0005604215.localdomain systemd[1]: Started libpod-conmon-47631966979b6d34c4a3bf2abc3abca3a62b8a007e119dfc302fbe42fe8c20a9.scope.
Feb 01 07:40:55 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:40:55 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50eaca4ffa7289cf3a114bd6e609e146c8d484c0d2ed8a9916aa89c6f2f56041/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:55 np0005604215.localdomain podman[33278]: 2026-02-01 07:40:55.13077445 +0000 UTC m=+0.042307432 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 07:40:55 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50eaca4ffa7289cf3a114bd6e609e146c8d484c0d2ed8a9916aa89c6f2f56041/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:55 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50eaca4ffa7289cf3a114bd6e609e146c8d484c0d2ed8a9916aa89c6f2f56041/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 07:40:55 np0005604215.localdomain ceph-osd[32318]: osd.5 17 state: booting -> active
Feb 01 07:40:55 np0005604215.localdomain podman[33278]: 2026-02-01 07:40:55.26137636 +0000 UTC m=+0.172909292 container init 47631966979b6d34c4a3bf2abc3abca3a62b8a007e119dfc302fbe42fe8c20a9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_hermann, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, name=rhceph, distribution-scope=public)
Feb 01 07:40:55 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 17 pg[1.0( empty local-lis/les=16/17 n=0 ec=15/15 lis/c=0/0 les/c/f=0/0/0 sis=16) [2,4,3] r=0 lpr=16 pi=[15,16)/0 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:40:55 np0005604215.localdomain podman[33278]: 2026-02-01 07:40:55.272677561 +0000 UTC m=+0.184210493 container start 47631966979b6d34c4a3bf2abc3abca3a62b8a007e119dfc302fbe42fe8c20a9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_hermann, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, com.redhat.component=rhceph-container, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, release=1764794109, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, name=rhceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 07:40:55 np0005604215.localdomain podman[33278]: 2026-02-01 07:40:55.272896407 +0000 UTC m=+0.184429379 container attach 47631966979b6d34c4a3bf2abc3abca3a62b8a007e119dfc302fbe42fe8c20a9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_hermann, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, name=rhceph, version=7, GIT_CLEAN=True, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=1764794109, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 07:40:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c26321bde21cea8e9d3570bba4458286791b41fcce28dfb1e3f503e3d43c4052-merged.mount: Deactivated successfully.
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]: [
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:     {
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:         "available": false,
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:         "ceph_device": false,
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:         "lsm_data": {},
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:         "lvs": [],
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:         "path": "/dev/sr0",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:         "rejected_reasons": [
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "Insufficient space (<5GB)",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "Has a FileSystem"
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:         ],
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:         "sys_api": {
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "actuators": null,
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "device_nodes": "sr0",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "human_readable_size": "482.00 KB",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "id_bus": "ata",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "model": "QEMU DVD-ROM",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "nr_requests": "2",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "partitions": {},
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "path": "/dev/sr0",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "removable": "1",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "rev": "2.5+",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "ro": "0",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "rotational": "1",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "sas_address": "",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "sas_device_handle": "",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "scheduler_mode": "mq-deadline",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "sectors": 0,
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "sectorsize": "2048",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "size": 493568.0,
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "support_discard": "0",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "type": "disk",
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:             "vendor": "QEMU"
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:         }
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]:     }
Feb 01 07:40:56 np0005604215.localdomain ecstatic_hermann[33293]: ]
Feb 01 07:40:56 np0005604215.localdomain systemd[1]: libpod-47631966979b6d34c4a3bf2abc3abca3a62b8a007e119dfc302fbe42fe8c20a9.scope: Deactivated successfully.
Feb 01 07:40:56 np0005604215.localdomain podman[34766]: 2026-02-01 07:40:56.259422162 +0000 UTC m=+0.055341819 container died 47631966979b6d34c4a3bf2abc3abca3a62b8a007e119dfc302fbe42fe8c20a9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_hermann, distribution-scope=public, com.redhat.component=rhceph-container, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, GIT_CLEAN=True, version=7)
Feb 01 07:40:56 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-50eaca4ffa7289cf3a114bd6e609e146c8d484c0d2ed8a9916aa89c6f2f56041-merged.mount: Deactivated successfully.
Feb 01 07:40:56 np0005604215.localdomain podman[34766]: 2026-02-01 07:40:56.300023677 +0000 UTC m=+0.095943284 container remove 47631966979b6d34c4a3bf2abc3abca3a62b8a007e119dfc302fbe42fe8c20a9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ecstatic_hermann, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, name=rhceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, release=1764794109, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 07:40:56 np0005604215.localdomain systemd[1]: libpod-conmon-47631966979b6d34c4a3bf2abc3abca3a62b8a007e119dfc302fbe42fe8c20a9.scope: Deactivated successfully.
Feb 01 07:40:56 np0005604215.localdomain sudo[33182]: pam_unix(sudo:session): session closed for user root
Feb 01 07:40:58 np0005604215.localdomain sudo[34780]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:40:58 np0005604215.localdomain sudo[34780]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:40:58 np0005604215.localdomain sudo[34780]: pam_unix(sudo:session): session closed for user root
Feb 01 07:41:01 np0005604215.localdomain anacron[18977]: Job `cron.daily' started
Feb 01 07:41:01 np0005604215.localdomain anacron[18977]: Job `cron.daily' terminated
Feb 01 07:41:04 np0005604215.localdomain sudo[34797]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:41:04 np0005604215.localdomain sudo[34797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:41:04 np0005604215.localdomain sudo[34797]: pam_unix(sudo:session): session closed for user root
Feb 01 07:41:04 np0005604215.localdomain sudo[34812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 07:41:04 np0005604215.localdomain sudo[34812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:41:05 np0005604215.localdomain podman[34899]: 2026-02-01 07:41:05.232097409 +0000 UTC m=+0.096213690 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, RELEASE=main, build-date=2025-12-08T17:28:53Z, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 07:41:05 np0005604215.localdomain podman[34899]: 2026-02-01 07:41:05.361818761 +0000 UTC m=+0.225935062 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, distribution-scope=public, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, version=7, io.openshift.tags=rhceph ceph)
Feb 01 07:41:05 np0005604215.localdomain sudo[34812]: pam_unix(sudo:session): session closed for user root
Feb 01 07:41:06 np0005604215.localdomain sudo[34966]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:41:06 np0005604215.localdomain sudo[34966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:41:06 np0005604215.localdomain sudo[34966]: pam_unix(sudo:session): session closed for user root
Feb 01 07:41:40 np0005604215.localdomain systemd[26134]: Starting Mark boot as successful...
Feb 01 07:41:40 np0005604215.localdomain systemd[26134]: Finished Mark boot as successful.
Feb 01 07:42:06 np0005604215.localdomain sudo[34982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:42:06 np0005604215.localdomain sudo[34982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:42:06 np0005604215.localdomain sudo[34982]: pam_unix(sudo:session): session closed for user root
Feb 01 07:42:06 np0005604215.localdomain sudo[34997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 07:42:06 np0005604215.localdomain sudo[34997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:42:07 np0005604215.localdomain podman[35077]: 2026-02-01 07:42:07.034366294 +0000 UTC m=+0.079311349 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_CLEAN=True, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public)
Feb 01 07:42:07 np0005604215.localdomain podman[35077]: 2026-02-01 07:42:07.1348808 +0000 UTC m=+0.179825815 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph)
Feb 01 07:42:07 np0005604215.localdomain sudo[34997]: pam_unix(sudo:session): session closed for user root
Feb 01 07:42:07 np0005604215.localdomain sudo[35144]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:42:07 np0005604215.localdomain sudo[35144]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:42:07 np0005604215.localdomain sudo[35144]: pam_unix(sudo:session): session closed for user root
Feb 01 07:42:07 np0005604215.localdomain sudo[35159]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:42:07 np0005604215.localdomain sudo[35159]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:42:08 np0005604215.localdomain sudo[35159]: pam_unix(sudo:session): session closed for user root
Feb 01 07:42:08 np0005604215.localdomain sudo[35206]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:42:08 np0005604215.localdomain sudo[35206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:42:08 np0005604215.localdomain sudo[35206]: pam_unix(sudo:session): session closed for user root
Feb 01 07:42:18 np0005604215.localdomain sshd[24640]: Received disconnect from 192.168.122.100 port 41002:11: disconnected by user
Feb 01 07:42:18 np0005604215.localdomain sshd[24640]: Disconnected from user zuul 192.168.122.100 port 41002
Feb 01 07:42:18 np0005604215.localdomain sshd[24637]: pam_unix(sshd:session): session closed for user zuul
Feb 01 07:42:18 np0005604215.localdomain systemd[1]: session-13.scope: Deactivated successfully.
Feb 01 07:42:18 np0005604215.localdomain systemd[1]: session-13.scope: Consumed 20.971s CPU time.
Feb 01 07:42:18 np0005604215.localdomain systemd-logind[761]: Session 13 logged out. Waiting for processes to exit.
Feb 01 07:42:18 np0005604215.localdomain systemd-logind[761]: Removed session 13.
Feb 01 07:43:08 np0005604215.localdomain sudo[35221]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:43:08 np0005604215.localdomain sudo[35221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:43:08 np0005604215.localdomain sudo[35221]: pam_unix(sudo:session): session closed for user root
Feb 01 07:43:09 np0005604215.localdomain sudo[35236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:43:09 np0005604215.localdomain sudo[35236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:43:09 np0005604215.localdomain sudo[35236]: pam_unix(sudo:session): session closed for user root
Feb 01 07:43:10 np0005604215.localdomain sudo[35283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:43:10 np0005604215.localdomain sudo[35283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:43:10 np0005604215.localdomain sudo[35283]: pam_unix(sudo:session): session closed for user root
Feb 01 07:44:10 np0005604215.localdomain sudo[35299]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:44:10 np0005604215.localdomain sudo[35299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:44:10 np0005604215.localdomain sudo[35299]: pam_unix(sudo:session): session closed for user root
Feb 01 07:44:10 np0005604215.localdomain sudo[35314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:44:10 np0005604215.localdomain sudo[35314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:44:11 np0005604215.localdomain sudo[35314]: pam_unix(sudo:session): session closed for user root
Feb 01 07:44:11 np0005604215.localdomain sudo[35360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:44:11 np0005604215.localdomain sudo[35360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:44:11 np0005604215.localdomain sudo[35360]: pam_unix(sudo:session): session closed for user root
Feb 01 07:44:40 np0005604215.localdomain systemd[26134]: Created slice User Background Tasks Slice.
Feb 01 07:44:40 np0005604215.localdomain systemd[26134]: Starting Cleanup of User's Temporary Files and Directories...
Feb 01 07:44:40 np0005604215.localdomain systemd[26134]: Finished Cleanup of User's Temporary Files and Directories.
Feb 01 07:45:11 np0005604215.localdomain sudo[35376]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:45:11 np0005604215.localdomain sudo[35376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:45:11 np0005604215.localdomain sudo[35376]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:11 np0005604215.localdomain sudo[35391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:45:11 np0005604215.localdomain sudo[35391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:45:12 np0005604215.localdomain sudo[35391]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:13 np0005604215.localdomain sudo[35438]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:45:13 np0005604215.localdomain sudo[35438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:45:13 np0005604215.localdomain sudo[35438]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:43 np0005604215.localdomain sshd[35453]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:45:44 np0005604215.localdomain sshd[35453]: Accepted publickey for zuul from 192.168.122.100 port 49824 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 07:45:44 np0005604215.localdomain systemd-logind[761]: New session 27 of user zuul.
Feb 01 07:45:44 np0005604215.localdomain systemd[1]: Started Session 27 of User zuul.
Feb 01 07:45:44 np0005604215.localdomain sshd[35453]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 07:45:44 np0005604215.localdomain sudo[35499]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyvywjrogwlupebswbebendaziniftlc ; /usr/bin/python3
Feb 01 07:45:44 np0005604215.localdomain sudo[35499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:45:44 np0005604215.localdomain python3[35501]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 01 07:45:44 np0005604215.localdomain sudo[35499]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:45 np0005604215.localdomain sudo[35544]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyttlrqrctttatjznvrfyivpkwchhxhv ; /usr/bin/python3
Feb 01 07:45:45 np0005604215.localdomain sudo[35544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:45:45 np0005604215.localdomain python3[35546]: ansible-setup Invoked with gather_subset=['!facter', '!ohai'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 07:45:45 np0005604215.localdomain sudo[35544]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:45 np0005604215.localdomain sudo[35564]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aaygrjkggmfxlpyqzcdxdipbhtoecvfh ; /usr/bin/python3
Feb 01 07:45:45 np0005604215.localdomain sudo[35564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:45:45 np0005604215.localdomain python3[35566]: ansible-user Invoked with name=tripleo-admin generate_ssh_key=False state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604215.localdomain update_password=always uid=None group=None groups=None comment=None home=None shell=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 01 07:45:45 np0005604215.localdomain useradd[35568]: new group: name=tripleo-admin, GID=1003
Feb 01 07:45:45 np0005604215.localdomain useradd[35568]: new user: name=tripleo-admin, UID=1003, GID=1003, home=/home/tripleo-admin, shell=/bin/bash, from=none
Feb 01 07:45:45 np0005604215.localdomain sudo[35564]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:46 np0005604215.localdomain sudo[35620]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfiqumzcruhffinlzpfujyvotmcpsjvj ; /usr/bin/python3
Feb 01 07:45:46 np0005604215.localdomain sudo[35620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:45:46 np0005604215.localdomain python3[35622]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/tripleo-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:45:46 np0005604215.localdomain sudo[35620]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:46 np0005604215.localdomain sudo[35663]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxklumlfsngufzzixxahudgmhhwxymqa ; /usr/bin/python3
Feb 01 07:45:46 np0005604215.localdomain sudo[35663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:45:46 np0005604215.localdomain python3[35665]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/tripleo-admin mode=288 owner=root group=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769931946.126945-65903-19266130206660/source _original_basename=tmpr_kl_g4x follow=False checksum=b3e7ecdcc699d217c6b083a91b07208207813d93 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:45:46 np0005604215.localdomain sudo[35663]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:47 np0005604215.localdomain sudo[35693]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrpoadyjaxtjhendiqkkvoprlxxxfndu ; /usr/bin/python3
Feb 01 07:45:47 np0005604215.localdomain sudo[35693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:45:47 np0005604215.localdomain python3[35695]: ansible-file Invoked with path=/home/tripleo-admin state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:45:47 np0005604215.localdomain sudo[35693]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:47 np0005604215.localdomain sudo[35709]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rumzjlswuvencrknxfzhsjzolqooktze ; /usr/bin/python3
Feb 01 07:45:47 np0005604215.localdomain sudo[35709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:45:47 np0005604215.localdomain python3[35711]: ansible-file Invoked with path=/home/tripleo-admin/.ssh state=directory owner=tripleo-admin group=tripleo-admin mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:45:47 np0005604215.localdomain sudo[35709]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:48 np0005604215.localdomain sudo[35725]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yzsqutwuyrwobwctgfqmsdnpiovgpjqw ; /usr/bin/python3
Feb 01 07:45:48 np0005604215.localdomain sudo[35725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:45:48 np0005604215.localdomain python3[35727]: ansible-file Invoked with path=/home/tripleo-admin/.ssh/authorized_keys state=touch owner=tripleo-admin group=tripleo-admin mode=384 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:45:48 np0005604215.localdomain sudo[35725]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:49 np0005604215.localdomain sudo[35741]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uahipwlcjcmugdjkplptpwovwseryadc ; /usr/bin/python3
Feb 01 07:45:49 np0005604215.localdomain sudo[35741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 07:45:49 np0005604215.localdomain python3[35743]: ansible-lineinfile Invoked with path=/home/tripleo-admin/.ssh/authorized_keys line=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3PGk8eQ/HLnsyRzL8h5NfdCJluWZpaEZ6rXdDgbE0lw7uxHSFELY1SydQI6S9XXHDytudbXCsaTQesdKbdbGdHJj+Vg1gGMFqFoy4uSiBfcCXPrrCkLSus2YLNROASby08xEMRmyMtENrRZxLnhhab6au+uTTgjrzCQYER0PPqsmRyQSw+7T1mpjYdlu7KIQYLe0QTYZWg9qnRz3OQs3ed297w+gXNzQDadWOmWRrqVrG76umhtGZrmJCY+I0xUANvOtiQSFT89RlUBKK2jyA9a/TXr/TBu9+r7PJ/Y4ayoabn3z0m1V8WEY0u5V2/k3yqFndPYU//bBN0nlq90J+EMZPG7yU8fXbmL3KQQG9wWh4grfR0sRjBLd3o2eYVr2minX8gho1p+AosyJZ8aSpq86KLny3WC9JVc4/RqUWVvQ34IbOKg2Ef1+HJDFpRGvPN6pvTfUfBHSYnk3sX22e11wLjEi2Z+2kffa1GY++d6pvqQLop2x0re8+mhNTRaE= zuul-build-sshkey
                                                          regexp=Generated by TripleO state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:45:49 np0005604215.localdomain sudo[35741]: pam_unix(sudo:session): session closed for user root
Feb 01 07:45:50 np0005604215.localdomain python3[35757]: ansible-ping Invoked with data=pong
Feb 01 07:46:00 np0005604215.localdomain sshd[35759]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:46:01 np0005604215.localdomain sshd[35759]: Accepted publickey for tripleo-admin from 192.168.122.100 port 33956 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 07:46:01 np0005604215.localdomain systemd-logind[761]: New session 28 of user tripleo-admin.
Feb 01 07:46:01 np0005604215.localdomain systemd[1]: Created slice User Slice of UID 1003.
Feb 01 07:46:01 np0005604215.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Feb 01 07:46:01 np0005604215.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Feb 01 07:46:01 np0005604215.localdomain systemd[1]: Starting User Manager for UID 1003...
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Queued start job for default target Main User Target.
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Created slice User Application Slice.
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Started Daily Cleanup of User's Temporary Directories.
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Reached target Paths.
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Reached target Timers.
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Starting D-Bus User Message Bus Socket...
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Starting Create User's Volatile Files and Directories...
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Listening on D-Bus User Message Bus Socket.
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Finished Create User's Volatile Files and Directories.
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Reached target Sockets.
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Reached target Basic System.
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Reached target Main User Target.
Feb 01 07:46:01 np0005604215.localdomain systemd[35763]: Startup finished in 121ms.
Feb 01 07:46:01 np0005604215.localdomain systemd[1]: Started User Manager for UID 1003.
Feb 01 07:46:01 np0005604215.localdomain systemd[1]: Started Session 28 of User tripleo-admin.
Feb 01 07:46:01 np0005604215.localdomain sshd[35759]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 01 07:46:01 np0005604215.localdomain sudo[35823]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-siupgpwuuzaljwrnnuppcvarvkugfued ; /usr/bin/python3
Feb 01 07:46:01 np0005604215.localdomain sudo[35823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:01 np0005604215.localdomain python3[35825]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all', 'min'] gather_timeout=45 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 07:46:01 np0005604215.localdomain sudo[35823]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:06 np0005604215.localdomain sudo[35843]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfsyvykwyihucssuulhqkobwzfhyxjlu ; /usr/bin/python3
Feb 01 07:46:06 np0005604215.localdomain sudo[35843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:06 np0005604215.localdomain python3[35845]: ansible-selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config
Feb 01 07:46:06 np0005604215.localdomain sudo[35843]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:07 np0005604215.localdomain sudo[35859]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwigzrnbjzqfonicrqtsqgwqbjzuckjy ; /usr/bin/python3
Feb 01 07:46:07 np0005604215.localdomain sudo[35859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:07 np0005604215.localdomain python3[35861]: ansible-tempfile Invoked with state=file suffix=tmphosts prefix=ansible. path=None
Feb 01 07:46:07 np0005604215.localdomain sudo[35859]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:07 np0005604215.localdomain sudo[35907]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-khbihjykekjguzkojqhbsuyhesbyeipg ; /usr/bin/python3
Feb 01 07:46:07 np0005604215.localdomain sudo[35907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:08 np0005604215.localdomain python3[35909]: ansible-ansible.legacy.copy Invoked with remote_src=True src=/etc/hosts dest=/tmp/ansible.8k3rx8dptmphosts mode=preserve backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:46:08 np0005604215.localdomain sudo[35907]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:08 np0005604215.localdomain sudo[35937]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqzuqgbtslyylfwaweovrirppajtpsxh ; /usr/bin/python3
Feb 01 07:46:08 np0005604215.localdomain sudo[35937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:08 np0005604215.localdomain python3[35939]: ansible-blockinfile Invoked with state=absent path=/tmp/ansible.8k3rx8dptmphosts block= marker=# {mark} marker_begin=HEAT_HOSTS_START - Do not edit manually within this section! marker_end=HEAT_HOSTS_END create=False backup=False unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:46:08 np0005604215.localdomain sudo[35937]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:09 np0005604215.localdomain sudo[35953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvcedahbcgkgrfwsxhbluebrmatcfgzn ; /usr/bin/python3
Feb 01 07:46:09 np0005604215.localdomain sudo[35953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:09 np0005604215.localdomain python3[35955]: ansible-blockinfile Invoked with create=True path=/tmp/ansible.8k3rx8dptmphosts insertbefore=BOF block=172.17.0.106 np0005604212.localdomain np0005604212
                                                         172.18.0.106 np0005604212.storage.localdomain np0005604212.storage
                                                         172.20.0.106 np0005604212.storagemgmt.localdomain np0005604212.storagemgmt
                                                         172.17.0.106 np0005604212.internalapi.localdomain np0005604212.internalapi
                                                         172.19.0.106 np0005604212.tenant.localdomain np0005604212.tenant
                                                         192.168.122.106 np0005604212.ctlplane.localdomain np0005604212.ctlplane
                                                         172.17.0.107 np0005604213.localdomain np0005604213
                                                         172.18.0.107 np0005604213.storage.localdomain np0005604213.storage
                                                         172.20.0.107 np0005604213.storagemgmt.localdomain np0005604213.storagemgmt
                                                         172.17.0.107 np0005604213.internalapi.localdomain np0005604213.internalapi
                                                         172.19.0.107 np0005604213.tenant.localdomain np0005604213.tenant
                                                         192.168.122.107 np0005604213.ctlplane.localdomain np0005604213.ctlplane
                                                         172.17.0.108 np0005604215.localdomain np0005604215
                                                         172.18.0.108 np0005604215.storage.localdomain np0005604215.storage
                                                         172.20.0.108 np0005604215.storagemgmt.localdomain np0005604215.storagemgmt
                                                         172.17.0.108 np0005604215.internalapi.localdomain np0005604215.internalapi
                                                         172.19.0.108 np0005604215.tenant.localdomain np0005604215.tenant
                                                         192.168.122.108 np0005604215.ctlplane.localdomain np0005604215.ctlplane
                                                         172.17.0.103 np0005604209.localdomain np0005604209
                                                         172.18.0.103 np0005604209.storage.localdomain np0005604209.storage
                                                         172.20.0.103 np0005604209.storagemgmt.localdomain np0005604209.storagemgmt
                                                         172.17.0.103 np0005604209.internalapi.localdomain np0005604209.internalapi
                                                         172.19.0.103 np0005604209.tenant.localdomain np0005604209.tenant
                                                         192.168.122.103 np0005604209.ctlplane.localdomain np0005604209.ctlplane
                                                         172.17.0.104 np0005604210.localdomain np0005604210
                                                         172.18.0.104 np0005604210.storage.localdomain np0005604210.storage
                                                         172.20.0.104 np0005604210.storagemgmt.localdomain np0005604210.storagemgmt
                                                         172.17.0.104 np0005604210.internalapi.localdomain np0005604210.internalapi
                                                         172.19.0.104 np0005604210.tenant.localdomain np0005604210.tenant
                                                         192.168.122.104 np0005604210.ctlplane.localdomain np0005604210.ctlplane
                                                         172.17.0.105 np0005604211.localdomain np0005604211
                                                         172.18.0.105 np0005604211.storage.localdomain np0005604211.storage
                                                         172.20.0.105 np0005604211.storagemgmt.localdomain np0005604211.storagemgmt
                                                         172.17.0.105 np0005604211.internalapi.localdomain np0005604211.internalapi
                                                         172.19.0.105 np0005604211.tenant.localdomain np0005604211.tenant
                                                         192.168.122.105 np0005604211.ctlplane.localdomain np0005604211.ctlplane
                                                         
                                                         192.168.122.100 undercloud.ctlplane.localdomain undercloud.ctlplane
                                                         192.168.122.99  overcloud.ctlplane.localdomain
                                                         172.18.0.154  overcloud.storage.localdomain
                                                         172.20.0.122  overcloud.storagemgmt.localdomain
                                                         172.17.0.228  overcloud.internalapi.localdomain
                                                         172.21.0.164  overcloud.localdomain
                                                          marker=# {mark} marker_begin=START_HOST_ENTRIES_FOR_STACK: overcloud marker_end=END_HOST_ENTRIES_FOR_STACK: overcloud state=present backup=False unsafe_writes=False insertafter=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:46:09 np0005604215.localdomain sudo[35953]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:09 np0005604215.localdomain sudo[35969]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pkgevcolhmqhyjdlgmhycrbuihfxexef ; /usr/bin/python3
Feb 01 07:46:09 np0005604215.localdomain sudo[35969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:10 np0005604215.localdomain python3[35971]: ansible-ansible.legacy.command Invoked with _raw_params=cp "/tmp/ansible.8k3rx8dptmphosts" "/etc/hosts" _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:46:10 np0005604215.localdomain sudo[35969]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:10 np0005604215.localdomain sudo[35986]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngorxiwhyqyusgbgsvzprrbznpdyerqe ; /usr/bin/python3
Feb 01 07:46:10 np0005604215.localdomain sudo[35986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:10 np0005604215.localdomain python3[35988]: ansible-file Invoked with path=/tmp/ansible.8k3rx8dptmphosts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:46:10 np0005604215.localdomain sudo[35986]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:11 np0005604215.localdomain sudo[36002]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lathsjwcresjxzaujmvunyvsbslehwvy ; /usr/bin/python3
Feb 01 07:46:11 np0005604215.localdomain sudo[36002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:11 np0005604215.localdomain python3[36004]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides rhosp-release _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:46:11 np0005604215.localdomain sudo[36002]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:12 np0005604215.localdomain sudo[36019]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtalihmlcuipjscmwsepywqhzdswvlbm ; /usr/bin/python3
Feb 01 07:46:12 np0005604215.localdomain sudo[36019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:12 np0005604215.localdomain python3[36021]: ansible-ansible.legacy.dnf Invoked with name=['rhosp-release'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:46:13 np0005604215.localdomain sudo[36023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:46:13 np0005604215.localdomain sudo[36023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:46:13 np0005604215.localdomain sudo[36023]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:13 np0005604215.localdomain sudo[36038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:46:13 np0005604215.localdomain sudo[36038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:46:13 np0005604215.localdomain sudo[36038]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:14 np0005604215.localdomain sudo[36084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:46:14 np0005604215.localdomain sudo[36084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:46:14 np0005604215.localdomain sudo[36084]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:15 np0005604215.localdomain sudo[36019]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:16 np0005604215.localdomain sudo[36114]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebxjrhdvmxkzbpbmfvjlrpxrmogecjfu ; /usr/bin/python3
Feb 01 07:46:16 np0005604215.localdomain sudo[36114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:17 np0005604215.localdomain python3[36116]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:46:17 np0005604215.localdomain sudo[36114]: pam_unix(sudo:session): session closed for user root
Feb 01 07:46:17 np0005604215.localdomain sudo[36131]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aufltyeushftdmusptsmkhrlrezelhlu ; /usr/bin/python3
Feb 01 07:46:17 np0005604215.localdomain sudo[36131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:46:17 np0005604215.localdomain python3[36133]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'jq', 'nftables', 'openvswitch', 'openstack-heat-agents', 'openstack-selinux', 'os-net-config', 'python3-libselinux', 'python3-pyyaml', 'puppet-tripleo', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:46:31 np0005604215.localdomain groupadd[36305]: group added to /etc/group: name=puppet, GID=52
Feb 01 07:46:31 np0005604215.localdomain groupadd[36305]: group added to /etc/gshadow: name=puppet
Feb 01 07:46:31 np0005604215.localdomain groupadd[36305]: new group: name=puppet, GID=52
Feb 01 07:46:31 np0005604215.localdomain useradd[36312]: new user: name=puppet, UID=52, GID=52, home=/var/lib/puppet, shell=/sbin/nologin, from=none
Feb 01 07:47:14 np0005604215.localdomain sudo[37195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:47:14 np0005604215.localdomain sudo[37195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:47:14 np0005604215.localdomain sudo[37195]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:14 np0005604215.localdomain sudo[37210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:47:14 np0005604215.localdomain sudo[37210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:47:15 np0005604215.localdomain sudo[37210]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:16 np0005604215.localdomain sudo[37284]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:47:16 np0005604215.localdomain sudo[37284]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:47:16 np0005604215.localdomain sudo[37284]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:24 np0005604215.localdomain kernel: SELinux:  Converting 2698 SID table entries...
Feb 01 07:47:24 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 07:47:24 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 07:47:24 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 07:47:24 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 07:47:24 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 07:47:24 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 07:47:24 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 07:47:24 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Feb 01 07:47:24 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 07:47:24 np0005604215.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 01 07:47:24 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:47:24 np0005604215.localdomain systemd-rc-local-generator[37426]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:47:24 np0005604215.localdomain systemd-sysv-generator[37430]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:47:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:47:24 np0005604215.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 07:47:25 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 07:47:25 np0005604215.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 01 07:47:25 np0005604215.localdomain systemd[1]: run-rc6c62f276abf46498b39f4472c490dd4.service: Deactivated successfully.
Feb 01 07:47:26 np0005604215.localdomain sudo[36131]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:33 np0005604215.localdomain sudo[37867]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fvxomjixtntpxxejosqjfcaeciozkehj ; /usr/bin/python3
Feb 01 07:47:33 np0005604215.localdomain sudo[37867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:33 np0005604215.localdomain python3[37869]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 jq nftables openvswitch openstack-heat-agents openstack-selinux os-net-config python3-libselinux python3-pyyaml puppet-tripleo rsync tmpwatch sysstat iproute-tc _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:47:34 np0005604215.localdomain sudo[37867]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:34 np0005604215.localdomain sudo[38006]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koclrexpyfvsudbwnimzomjebnehvksx ; /usr/bin/python3
Feb 01 07:47:34 np0005604215.localdomain sudo[38006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:35 np0005604215.localdomain python3[38008]: ansible-ansible.legacy.systemd Invoked with name=openvswitch enabled=True state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:47:35 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:47:35 np0005604215.localdomain systemd-rc-local-generator[38038]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:47:35 np0005604215.localdomain systemd-sysv-generator[38041]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:47:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:47:35 np0005604215.localdomain sudo[38006]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:36 np0005604215.localdomain sudo[38060]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjvrliqqaldsobybcakrhbagojsvhxam ; /usr/bin/python3
Feb 01 07:47:36 np0005604215.localdomain sudo[38060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:36 np0005604215.localdomain python3[38062]: ansible-file Invoked with path=/var/lib/heat-config/tripleo-config-download state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:47:37 np0005604215.localdomain sudo[38060]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:37 np0005604215.localdomain sudo[38076]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-itqgnwriwufpstwxspawfwxovpxpomok ; /usr/bin/python3
Feb 01 07:47:37 np0005604215.localdomain sudo[38076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:37 np0005604215.localdomain python3[38078]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides openstack-network-scripts _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:47:37 np0005604215.localdomain sudo[38076]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:37 np0005604215.localdomain sudo[38093]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxhwjutsergidkmiqparemsamjbqhoxk ; /usr/bin/python3
Feb 01 07:47:37 np0005604215.localdomain sudo[38093]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:38 np0005604215.localdomain python3[38095]: ansible-systemd Invoked with name=NetworkManager enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 01 07:47:38 np0005604215.localdomain sudo[38093]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:38 np0005604215.localdomain sudo[38111]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vnetwcfljlakgbpewneopzvfkoemqoaq ; /usr/bin/python3
Feb 01 07:47:38 np0005604215.localdomain sudo[38111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:38 np0005604215.localdomain python3[38113]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=dns value=none backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:47:38 np0005604215.localdomain sudo[38111]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:38 np0005604215.localdomain sudo[38129]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlsefivkhsjeluugbavvqyeoomzijzej ; /usr/bin/python3
Feb 01 07:47:38 np0005604215.localdomain sudo[38129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:39 np0005604215.localdomain python3[38131]: ansible-ini_file Invoked with path=/etc/NetworkManager/NetworkManager.conf state=present no_extra_spaces=True section=main option=rc-manager value=unmanaged backup=True exclusive=True allow_no_value=False create=True unsafe_writes=False values=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:47:39 np0005604215.localdomain sudo[38129]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:39 np0005604215.localdomain sudo[38147]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvltxxhwpbyabnelqgfapmhgkxwwiifd ; /usr/bin/python3
Feb 01 07:47:39 np0005604215.localdomain sudo[38147]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:39 np0005604215.localdomain python3[38149]: ansible-ansible.legacy.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 07:47:39 np0005604215.localdomain systemd[1]: Reloading Network Manager...
Feb 01 07:47:39 np0005604215.localdomain NetworkManager[5972]: <info>  [1769932059.7595] audit: op="reload" arg="0" pid=38152 uid=0 result="success"
Feb 01 07:47:39 np0005604215.localdomain NetworkManager[5972]: <info>  [1769932059.7607] config: signal: SIGHUP,config-files,values,values-user,no-auto-default,dns-mode,rc-manager (/etc/NetworkManager/NetworkManager.conf (lib: 00-server.conf) (run: 15-carrier-timeout.conf))
Feb 01 07:47:39 np0005604215.localdomain NetworkManager[5972]: <info>  [1769932059.7607] dns-mgr: init: dns=none,systemd-resolved rc-manager=unmanaged
Feb 01 07:47:39 np0005604215.localdomain systemd[1]: Reloaded Network Manager.
Feb 01 07:47:39 np0005604215.localdomain sudo[38147]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:40 np0005604215.localdomain sudo[38166]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pljjolcwazszagffnmtwpmixtewqxcdk ; /usr/bin/python3
Feb 01 07:47:40 np0005604215.localdomain sudo[38166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:41 np0005604215.localdomain python3[38168]: ansible-ansible.legacy.command Invoked with _raw_params=ln -f -s /usr/share/openstack-puppet/modules/* /etc/puppet/modules/ _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:47:41 np0005604215.localdomain sudo[38166]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:41 np0005604215.localdomain sudo[38183]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acfqbrevbgosxzpfaeveukmgvssyprfr ; /usr/bin/python3
Feb 01 07:47:41 np0005604215.localdomain sudo[38183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:41 np0005604215.localdomain python3[38185]: ansible-stat Invoked with path=/usr/bin/ansible-playbook follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:47:41 np0005604215.localdomain sudo[38183]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:41 np0005604215.localdomain sudo[38201]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtecmhaeqoppiqtmcecbcgcwkezmfywo ; /usr/bin/python3
Feb 01 07:47:41 np0005604215.localdomain sudo[38201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:42 np0005604215.localdomain python3[38203]: ansible-stat Invoked with path=/usr/bin/ansible-playbook-3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:47:42 np0005604215.localdomain sudo[38201]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:42 np0005604215.localdomain sudo[38217]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecvnfrgqfuwayhmkzgrdjemiugebnjuy ; /usr/bin/python3
Feb 01 07:47:42 np0005604215.localdomain sudo[38217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:42 np0005604215.localdomain python3[38219]: ansible-file Invoked with state=link src=/usr/bin/ansible-playbook path=/usr/bin/ansible-playbook-3 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:47:42 np0005604215.localdomain sudo[38217]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:42 np0005604215.localdomain sudo[38233]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzgwwtswgbfiyekjkfcxorcugudmaoix ; /usr/bin/python3
Feb 01 07:47:42 np0005604215.localdomain sudo[38233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:43 np0005604215.localdomain python3[38235]: ansible-tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 01 07:47:43 np0005604215.localdomain sudo[38233]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:43 np0005604215.localdomain sudo[38249]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-baismsurowqjhtybyokhmmktauibcvba ; /usr/bin/python3
Feb 01 07:47:43 np0005604215.localdomain sudo[38249]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:43 np0005604215.localdomain python3[38251]: ansible-stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:47:43 np0005604215.localdomain sudo[38249]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:44 np0005604215.localdomain sudo[38265]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qeoinrmdmtgchzgtblcfeacxrzskoeiy ; /usr/bin/python3
Feb 01 07:47:44 np0005604215.localdomain sudo[38265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:44 np0005604215.localdomain python3[38267]: ansible-blockinfile Invoked with path=/tmp/ansible.g_mn0wml block=[192.168.122.106]*,[np0005604212.ctlplane.localdomain]*,[172.17.0.106]*,[np0005604212.internalapi.localdomain]*,[172.18.0.106]*,[np0005604212.storage.localdomain]*,[172.20.0.106]*,[np0005604212.storagemgmt.localdomain]*,[172.19.0.106]*,[np0005604212.tenant.localdomain]*,[np0005604212.localdomain]*,[np0005604212]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCx/MKX//74FswFkw1c1lfM5mahSRoD4B8bhCZSm2/IQ//syuq+Qpi1sEoMv/N1mOrU8atXNtYkVNozl/ypDe2YJkUS8OTt37bT9A7XnBlfFSc5OwXS7VGHpVWbiMbImJibSV7HjoQP0yA8SvCJCcrI3Eh14+cna8tT1rJ9lOFRHvxLfG52XnzFiNUVDU+TG3uRtWEjY5epI8j/U73tEqdP4OAk7ZQ9riN1nllCCIs9FOErOEw14VW+151TbOCzcm9kvzeQMit9jPXTGqmTPKoidZFLhJwEAXq4M9+DFfKQWkVSqfcU3cvPz6S03lUcpPWiJxgGZiIPXxCdRjvI3bKCm898lFYwZq8EfdAwUFMyhmz4GHSyhMwqZWE46cikXf/skoSrEF8ji3NjmyQL7T304iKenZca6rHDI56veO0+PTzZj/pBiaWBWXlqF0WQLAn804z3yapsLNuR8R4EaREmk1Tc2ESg1//73pCUypwEMQWESHsAJ/LCHhyqNHY6Bjc=
                                                         [192.168.122.107]*,[np0005604213.ctlplane.localdomain]*,[172.17.0.107]*,[np0005604213.internalapi.localdomain]*,[172.18.0.107]*,[np0005604213.storage.localdomain]*,[172.20.0.107]*,[np0005604213.storagemgmt.localdomain]*,[172.19.0.107]*,[np0005604213.tenant.localdomain]*,[np0005604213.localdomain]*,[np0005604213]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDhh44DuXnO4hBZJvT1vLnO8ZhT8GKLkBI0M+Q/lXSbHymnCyNerLMqVRhTb5ZUw07lkP6FtBJS95SUtdJuAbUi4jphShtJfBdicoa+uGqI1icHUQCbtCAACtas0lGeGi5q/q1LfzeuKh+LTRj60W+r2OZoChKxeSWYBQ8gIScKe1HgVCJVEESXwNv4CBs6ffOWVYHE+3JDUA3AN3nX931xw4oLMBkwi0q4sNh9Sb0oS79OX+dKdlGfnPLLWKF9QrLrHYdHVkKtPre9d1BdNkl38gRE45uwrAAxXBfeZjbzzfbUlWb54SZwL8P2ej29L5VAbE/97j1HD6+kUZ5wFb6v9oJyFwq8udFDqO1SUMkW4t1VmwD5G4rIU2+u0yHd4H7//fgbf8WAhPv1Qx5tXEqB6LIHqYCz7RekNQO5Xv8ge/gVMzzlxB0DJP6a4DJ8E0/Djnyzw81L2fmyeriPLqt/n/wHscNr1RRI4T1X2iINRwk5QfrxwTEHhJ00FY1kB90=
                                                         [192.168.122.108]*,[np0005604215.ctlplane.localdomain]*,[172.17.0.108]*,[np0005604215.internalapi.localdomain]*,[172.18.0.108]*,[np0005604215.storage.localdomain]*,[172.20.0.108]*,[np0005604215.storagemgmt.localdomain]*,[172.19.0.108]*,[np0005604215.tenant.localdomain]*,[np0005604215.localdomain]*,[np0005604215]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/jKlZ/vxfazmNjpekfENGpQi8TTD6ErYy0BH9P8CRIiiKVdA/53XGSAQlY17b4tT5hzyHsUuXDmbv5R98FSy/Fi8F4KrjgogVPhd/zYoMrffr9ydwv+ih2mIyCPjZC+N92i92gM2OBHBXj5vqyh5yl1t4H1LhFab7P/m42K75mcTytGvGTLKXZbcs/1Ot/APGrs5wqg/c9XFQtgBEn6ttSKQ9caqbgUw88VGRkzaHvzheQvtIjZL0AwigTS24tqFx+bF+liSnSaYk1R8TKe1yMNODv5OCUmFYvPqls4Y3AQkpuroQQXHcQCe0QPuz9nGgPebNOxyTHsK66oDWIUskoYIbrZZhjDxlpdzJ+POEU/jXtGox0/0wlpRK7jNN6r4Fzx6uIzxB5SWn/UJ4BYS853pUsC32TeD0pZXfUAzOGUOzQfvYkUCElyRi8zDN4ubwEWnxvCEPaAFihafbviqQwLNFFmth36owDHV2zU/Q/BtW8vrwfx0cPr2A4WvQvp8=
                                                         [192.168.122.103]*,[np0005604209.ctlplane.localdomain]*,[172.17.0.103]*,[np0005604209.internalapi.localdomain]*,[172.18.0.103]*,[np0005604209.storage.localdomain]*,[172.20.0.103]*,[np0005604209.storagemgmt.localdomain]*,[172.19.0.103]*,[np0005604209.tenant.localdomain]*,[np0005604209.localdomain]*,[np0005604209]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAdXF2/8XBq3bWgr/9swIkzjlkm7PzpC1vdYXglaExGeIUwK5n05/HLobUMrYjOh6yE81+tctBT51wuPLw9qOGf4X3lRx3x0AHUqWSs00OL5nZsMRAd6PknZVyeCWf9jv13mVWIExCYbP8e4VK4M3w1m2xSLFd1aHtGkEUYJKCmacxrxFu2opq+kNCclpMC0BlFeSeX/NZeGwcfVCEyP46JVB9pNDo6D4s98FzzQNtG4DTv8NqE0S8Fj44dajq/80IKXeVEbhVmBikwFGMMEHhsRass2m0Q0rBw1Cv2jqW9hrTO1AWHY2aNDDqr6cKttP27XKfc/unDFFDb0mcc/HRa8JAUYEvuO0FIV6n28+Q5hWoYHAZfMU15U/bQPN1UxbF/MmSIZWvwY+vzCJ+icSJ9qfhDfbd1DttRuV0F3Jdi0jq01TyyPdOz8qT7kKSftD3Awn6BNLlseR8MaOTS+YF4fOnSP/xzj0B+nx/nr5Mrq8+QzKb2YyqdMfWWMGdCw8=
                                                         [192.168.122.104]*,[np0005604210.ctlplane.localdomain]*,[172.17.0.104]*,[np0005604210.internalapi.localdomain]*,[172.18.0.104]*,[np0005604210.storage.localdomain]*,[172.20.0.104]*,[np0005604210.storagemgmt.localdomain]*,[172.19.0.104]*,[np0005604210.tenant.localdomain]*,[np0005604210.localdomain]*,[np0005604210]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDeVlqpmEgZX6yoZkE7SzVbEM6MqJe/9qDZPPgFZPb/N85k+uB3cINsoq0pMJYeKjcKY8H56WyuNkVVwVHaouZnJCN4p1rCJmATIDieU8QMDwGucQpbrNRrQWheWQDkmHNIPOxnUDCRgEzDfYiaE4prLHMPKtf8XJAKUKVd6lpZrVSCovGz0UC3U1Le/0N1PJOi4kYEuipVrcfoYHC63A32I+w+7tybU8Rpknhc/UHhdn39PBGuAhbkSf2JEJbLLzLaPkZXT6HOPiBUT9jWKnymCGEcfPjIWOkeelx3fkPoXZCtnYHlSoQSkCVsUmXgHNj7X3+6sJi9+iV/+8jRWQyk6aCC+HjXDhSwxbBUaM9AOimJ9EK7vo8/IK9pQ3gNsEct6rHuvGytACNMWpaT5sRRaVEnS8uz/PL8urB6+59GYGunjAaw8lCQcxw+VNVJaLtj+BpVJZA2EA6XE4fwq7v0s9u0ApIMSyV3DcYzIcDFlT11I5g3RM8vZNipXfnub3U=
                                                         [192.168.122.105]*,[np0005604211.ctlplane.localdomain]*,[172.17.0.105]*,[np0005604211.internalapi.localdomain]*,[172.18.0.105]*,[np0005604211.storage.localdomain]*,[172.20.0.105]*,[np0005604211.storagemgmt.localdomain]*,[172.19.0.105]*,[np0005604211.tenant.localdomain]*,[np0005604211.localdomain]*,[np0005604211]* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCQ5JUOdiESLpaYomijw3u9LxHN4VxpmenW9EczyVvVdofuEESAIR1Q8BIVkW7gxgVyrzHxOpbaoAS+aZaKazruu7/chC8MkDw1lvfeyQwMZax6UziUan2wIFVTaCc7kITOHrdWkJm+OIvCs/ImtkSgsTmvTiQedvs86ME3gHNyA+7taoDXnH6UCB6d5ex6PzwXsKI03iUVWFfsGP3ZU7r52IBwgrLG+VplbaPBRNNP/RvKULVsokG3UCMd3pjHv3VYBdXPYTFOPf666ZEuxEz+Frz43oXzEhr4W61RN70cAFJDDFoOmBDxXzZqrmF7r1vSV3ojl+aHaVLCGL4Wnjrp9wl5Zq8XCGN/7ttzaZKrjj/flccfBEiYL9odgqp92EjmxsRqG4bFq/nEzS/DTJ88QQVpGQNC2T6bElJVdBIrpZAyv7n5HlwNQwfsltQtzbqe1E32azZb1wq13ajV9Ii7QrVd81nGYFM79NqiVVbXs5NypsJOMQ6ZoqyHK5+yyHk=
                                                          create=True state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:47:44 np0005604215.localdomain sudo[38265]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:44 np0005604215.localdomain sudo[38281]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqroyviodtfrxuqkcykczzubapblaxtk ; /usr/bin/python3
Feb 01 07:47:44 np0005604215.localdomain sudo[38281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:44 np0005604215.localdomain python3[38283]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.g_mn0wml' > /etc/ssh/ssh_known_hosts _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:47:44 np0005604215.localdomain sudo[38281]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:45 np0005604215.localdomain sudo[38299]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-akaljiscesysapnbbiubdzcuytznmruv ; /usr/bin/python3
Feb 01 07:47:45 np0005604215.localdomain sudo[38299]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:45 np0005604215.localdomain python3[38301]: ansible-file Invoked with path=/tmp/ansible.g_mn0wml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:47:45 np0005604215.localdomain sudo[38299]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:45 np0005604215.localdomain sudo[38315]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nshwmpipfzsfoyjedxvumcksxnpflwqm ; /usr/bin/python3
Feb 01 07:47:45 np0005604215.localdomain sudo[38315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:46 np0005604215.localdomain python3[38317]: ansible-file Invoked with path=/var/log/journal state=directory mode=0750 owner=root group=root setype=var_log_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:47:46 np0005604215.localdomain sudo[38315]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:46 np0005604215.localdomain sudo[38331]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjzeeodfcxtzgmypuvbwochdzppwytpb ; /usr/bin/python3
Feb 01 07:47:46 np0005604215.localdomain sudo[38331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:46 np0005604215.localdomain python3[38333]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active cloud-init.service || systemctl is-enabled cloud-init.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:47:46 np0005604215.localdomain sudo[38331]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:46 np0005604215.localdomain sudo[38349]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byqlulzpybpxtqspngjsvtyrvdpjrlqb ; /usr/bin/python3
Feb 01 07:47:46 np0005604215.localdomain sudo[38349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:46 np0005604215.localdomain python3[38351]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline | grep -q cloud-init=disabled _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:47:46 np0005604215.localdomain sudo[38349]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:47 np0005604215.localdomain sudo[38368]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oputprbnbtbdpgwvypafvpdewfhxhfac ; /usr/bin/python3
Feb 01 07:47:47 np0005604215.localdomain sudo[38368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:47 np0005604215.localdomain python3[38370]: ansible-community.general.cloud_init_data_facts Invoked with filter=status
Feb 01 07:47:47 np0005604215.localdomain sudo[38368]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:47 np0005604215.localdomain sudo[38384]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfrwbgeltpkkjbbeavnkoqymtseuqyzr ; /usr/bin/python3
Feb 01 07:47:47 np0005604215.localdomain sudo[38384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:47 np0005604215.localdomain sudo[38384]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:47 np0005604215.localdomain sudo[38432]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekjxecippsqkssmmwsllnztigedcoojw ; /usr/bin/python3
Feb 01 07:47:47 np0005604215.localdomain sudo[38432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:48 np0005604215.localdomain sudo[38432]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:48 np0005604215.localdomain sudo[38475]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxlngvdrmqjcpehjgvwzmwqwyycessxb ; /usr/bin/python3
Feb 01 07:47:48 np0005604215.localdomain sudo[38475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:48 np0005604215.localdomain sudo[38475]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:49 np0005604215.localdomain sudo[38505]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apecovdlazguglrjinxwpstxpiuxphst ; /usr/bin/python3
Feb 01 07:47:49 np0005604215.localdomain sudo[38505]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:49 np0005604215.localdomain python3[38507]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:47:49 np0005604215.localdomain sudo[38505]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:50 np0005604215.localdomain sudo[38522]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdlbtjvcxzkznzujfhhqjrfbluxwggwa ; /usr/bin/python3
Feb 01 07:47:50 np0005604215.localdomain sudo[38522]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:50 np0005604215.localdomain python3[38524]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:47:53 np0005604215.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Feb 01 07:47:53 np0005604215.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Feb 01 07:47:53 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 07:47:53 np0005604215.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 01 07:47:53 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:47:53 np0005604215.localdomain systemd-sysv-generator[38595]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:47:53 np0005604215.localdomain systemd-rc-local-generator[38588]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:47:53 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:47:53 np0005604215.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 07:47:53 np0005604215.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 01 07:47:54 np0005604215.localdomain systemd[1]: tuned.service: Deactivated successfully.
Feb 01 07:47:54 np0005604215.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 01 07:47:54 np0005604215.localdomain systemd[1]: tuned.service: Consumed 1.455s CPU time.
Feb 01 07:47:54 np0005604215.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 01 07:47:54 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 07:47:54 np0005604215.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 01 07:47:54 np0005604215.localdomain systemd[1]: run-r63042f02508a4418809bd1306bcf4753.service: Deactivated successfully.
Feb 01 07:47:55 np0005604215.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Feb 01 07:47:55 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 07:47:55 np0005604215.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 01 07:47:55 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 07:47:55 np0005604215.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 01 07:47:55 np0005604215.localdomain systemd[1]: run-r1a023bd0904c40ffaf91e2c965e2a52d.service: Deactivated successfully.
Feb 01 07:47:56 np0005604215.localdomain sudo[38522]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:56 np0005604215.localdomain sudo[38959]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egmjrsdycvbdnfkntzcldgbeachdrgsn ; /usr/bin/python3
Feb 01 07:47:56 np0005604215.localdomain sudo[38959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:56 np0005604215.localdomain python3[38961]: ansible-systemd Invoked with name=tuned state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:47:56 np0005604215.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 01 07:47:56 np0005604215.localdomain systemd[1]: tuned.service: Deactivated successfully.
Feb 01 07:47:56 np0005604215.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 01 07:47:56 np0005604215.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 01 07:47:58 np0005604215.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Feb 01 07:47:58 np0005604215.localdomain sudo[38959]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:58 np0005604215.localdomain sudo[39154]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umdeonleeyuxnisrhpydybecfphsexus ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Feb 01 07:47:58 np0005604215.localdomain sudo[39154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:58 np0005604215.localdomain python3[39156]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:47:58 np0005604215.localdomain sudo[39154]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:59 np0005604215.localdomain sudo[39171]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uskxzjkwrkcfruqpdidaqlidztzrcnaw ; /usr/bin/python3
Feb 01 07:47:59 np0005604215.localdomain sudo[39171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:59 np0005604215.localdomain python3[39173]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Feb 01 07:47:59 np0005604215.localdomain sudo[39171]: pam_unix(sudo:session): session closed for user root
Feb 01 07:47:59 np0005604215.localdomain sudo[39187]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rycexhciuwuwyyvrffrnjvfypznklxah ; /usr/bin/python3
Feb 01 07:47:59 np0005604215.localdomain sudo[39187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:47:59 np0005604215.localdomain python3[39189]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:47:59 np0005604215.localdomain sudo[39187]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:00 np0005604215.localdomain sudo[39203]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnmvhgimejgxxnnwjzggshzalelvmfgl ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Feb 01 07:48:00 np0005604215.localdomain sudo[39203]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:00 np0005604215.localdomain python3[39205]: ansible-ansible.legacy.command Invoked with _raw_params=tuned-adm profile throughput-performance _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:48:01 np0005604215.localdomain sudo[39203]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:01 np0005604215.localdomain sudo[39223]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axshqmfstuunfdqlkalfzpngdwjfvpbt ; /usr/bin/python3
Feb 01 07:48:01 np0005604215.localdomain sudo[39223]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:01 np0005604215.localdomain python3[39225]: ansible-ansible.legacy.command Invoked with _raw_params=cat /proc/cmdline _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:48:02 np0005604215.localdomain sudo[39223]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:02 np0005604215.localdomain sudo[39240]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thlfdgubmndjlxbkoyohmqvvfrkvltxx ; /usr/bin/python3
Feb 01 07:48:02 np0005604215.localdomain sudo[39240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:02 np0005604215.localdomain python3[39242]: ansible-stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:48:02 np0005604215.localdomain sudo[39240]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:04 np0005604215.localdomain sudo[39256]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgekgotydmobczytggakthkdyemaqzua ; /usr/bin/python3
Feb 01 07:48:04 np0005604215.localdomain sudo[39256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:04 np0005604215.localdomain python3[39258]: ansible-replace Invoked with regexp=TRIPLEO_HEAT_TEMPLATE_KERNEL_ARGS dest=/etc/default/grub replace= path=/etc/default/grub backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:04 np0005604215.localdomain sudo[39256]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:10 np0005604215.localdomain sudo[39272]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wacpulazjprmcjiqrxnssytzlukyuepb ; /usr/bin/python3
Feb 01 07:48:10 np0005604215.localdomain sudo[39272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:10 np0005604215.localdomain python3[39274]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:10 np0005604215.localdomain sudo[39272]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:11 np0005604215.localdomain sudo[39320]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlshtlmppbqwpgidlhqmalzaokeayfiq ; /usr/bin/python3
Feb 01 07:48:11 np0005604215.localdomain sudo[39320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:11 np0005604215.localdomain python3[39322]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:11 np0005604215.localdomain sudo[39320]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:11 np0005604215.localdomain sudo[39365]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qiehvxzdglevwdooollcilmrajtddaah ; /usr/bin/python3
Feb 01 07:48:11 np0005604215.localdomain sudo[39365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:11 np0005604215.localdomain python3[39367]: ansible-ansible.legacy.copy Invoked with mode=384 dest=/etc/puppet/hiera.yaml src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932090.879111-70662-167416958266394/source _original_basename=tmpu1d0fu25 follow=False checksum=aaf3699defba931d532f4955ae152f505046749a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:11 np0005604215.localdomain sudo[39365]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:11 np0005604215.localdomain sudo[39395]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inmjzkofddznanvstbsxwcfnithngjnk ; /usr/bin/python3
Feb 01 07:48:11 np0005604215.localdomain sudo[39395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:12 np0005604215.localdomain python3[39397]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:12 np0005604215.localdomain sudo[39395]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:12 np0005604215.localdomain sudo[39443]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwdecazpakdvfsknhtffyqdyfjvoadez ; /usr/bin/python3
Feb 01 07:48:12 np0005604215.localdomain sudo[39443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:12 np0005604215.localdomain python3[39445]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:12 np0005604215.localdomain sudo[39443]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:12 np0005604215.localdomain sudo[39486]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtiwjolxovqzlcvopzhmurrejapxwefr ; /usr/bin/python3
Feb 01 07:48:12 np0005604215.localdomain sudo[39486]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:13 np0005604215.localdomain python3[39488]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932092.4219818-70756-76883123223862/source dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json follow=False checksum=d0cfa4bd89bcc42c9513572d4ad38f679529236d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:13 np0005604215.localdomain sudo[39486]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:13 np0005604215.localdomain sudo[39548]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfipxvpunzwcdzfxcnjemxcqmyefcqzn ; /usr/bin/python3
Feb 01 07:48:13 np0005604215.localdomain sudo[39548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:13 np0005604215.localdomain python3[39550]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:13 np0005604215.localdomain sudo[39548]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:13 np0005604215.localdomain sudo[39591]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zutrzonmnnbsffeemkvuvgqftcfmvexm ; /usr/bin/python3
Feb 01 07:48:13 np0005604215.localdomain sudo[39591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:14 np0005604215.localdomain python3[39593]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932093.3264253-70815-4068009467879/source dest=/etc/puppet/hieradata/bootstrap_node.json mode=None follow=False _original_basename=bootstrap_node.j2 checksum=1f1a1c0de88e28e1c405f8e299af3f6bf8624260 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:14 np0005604215.localdomain sudo[39591]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:14 np0005604215.localdomain sudo[39653]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iacgakrfbhhpecbewjwputaclcvkpqgz ; /usr/bin/python3
Feb 01 07:48:14 np0005604215.localdomain sudo[39653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:14 np0005604215.localdomain python3[39655]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:14 np0005604215.localdomain sudo[39653]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:14 np0005604215.localdomain sudo[39696]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prpjuwlclsnkvkmnjvjxwezzmqwtzbri ; /usr/bin/python3
Feb 01 07:48:14 np0005604215.localdomain sudo[39696]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:14 np0005604215.localdomain python3[39698]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932094.2129455-70815-100366947357955/source dest=/etc/puppet/hieradata/vip_data.json mode=None follow=False _original_basename=vip_data.j2 checksum=a14c85776d2e39c2e9398053dff459a83e663446 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:14 np0005604215.localdomain sudo[39696]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:15 np0005604215.localdomain sudo[39758]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swifzkwifsypqysezhlzqnbqbiitcsqb ; /usr/bin/python3
Feb 01 07:48:15 np0005604215.localdomain sudo[39758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:15 np0005604215.localdomain python3[39760]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:15 np0005604215.localdomain sudo[39758]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:15 np0005604215.localdomain sudo[39801]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edsrlptlskfzsmjdaqibrwqopifwksgf ; /usr/bin/python3
Feb 01 07:48:15 np0005604215.localdomain sudo[39801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:15 np0005604215.localdomain python3[39803]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932095.1389735-70815-89402326267497/source dest=/etc/puppet/hieradata/net_ip_map.json mode=None follow=False _original_basename=net_ip_map.j2 checksum=1bd75eeb71ad8a06f7ad5bd2e02e7279e09e867f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:16 np0005604215.localdomain sudo[39801]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:16 np0005604215.localdomain sudo[39839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:48:16 np0005604215.localdomain sudo[39839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:48:16 np0005604215.localdomain sudo[39839]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:16 np0005604215.localdomain sudo[39888]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgdyhzwgrotfsryhbzbwdrxwtdzunhmw ; /usr/bin/python3
Feb 01 07:48:16 np0005604215.localdomain sudo[39888]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:16 np0005604215.localdomain sudo[39871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:48:16 np0005604215.localdomain sudo[39871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:48:16 np0005604215.localdomain python3[39894]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:16 np0005604215.localdomain sudo[39888]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:16 np0005604215.localdomain sudo[39951]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gesogxalyitlduqjztnqmdijgzourwsx ; /usr/bin/python3
Feb 01 07:48:16 np0005604215.localdomain sudo[39951]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:16 np0005604215.localdomain python3[39953]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932096.1468594-70815-269833929399320/source dest=/etc/puppet/hieradata/cloud_domain.json mode=None follow=False _original_basename=cloud_domain.j2 checksum=5dd835a63e6a03d74797c2e2eadf4bea1cecd9d9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:16 np0005604215.localdomain sudo[39951]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:16 np0005604215.localdomain sudo[39871]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:17 np0005604215.localdomain sudo[40030]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljjvvzmfafuljahvvktqpsxtxtnxadex ; /usr/bin/python3
Feb 01 07:48:17 np0005604215.localdomain sudo[40030]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:17 np0005604215.localdomain python3[40032]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:17 np0005604215.localdomain sudo[40030]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:17 np0005604215.localdomain sudo[40073]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exocmxfsyadbodiygrrpjdwmfjzxzqlq ; /usr/bin/python3
Feb 01 07:48:17 np0005604215.localdomain sudo[40073]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:17 np0005604215.localdomain sudo[40076]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:48:17 np0005604215.localdomain sudo[40076]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:48:17 np0005604215.localdomain sudo[40076]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:17 np0005604215.localdomain python3[40075]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932096.9315221-70815-7637395490528/source dest=/etc/puppet/hieradata/fqdn.json mode=None follow=False _original_basename=fqdn.j2 checksum=4b0728b1a4158e6417d66a1cc37f4e4d26059385 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:17 np0005604215.localdomain sudo[40073]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:17 np0005604215.localdomain sudo[40150]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvhturqmhnzmgcrmbgcoklhvdfszteht ; /usr/bin/python3
Feb 01 07:48:17 np0005604215.localdomain sudo[40150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:18 np0005604215.localdomain python3[40152]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:18 np0005604215.localdomain sudo[40150]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:18 np0005604215.localdomain sudo[40193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcpmlozhzurteszgxsobdjmbnejxfexk ; /usr/bin/python3
Feb 01 07:48:18 np0005604215.localdomain sudo[40193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:18 np0005604215.localdomain python3[40195]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932097.7887776-70815-280030628976059/source dest=/etc/puppet/hieradata/service_names.json mode=None follow=False _original_basename=service_names.j2 checksum=ff586b96402d8ae133745cf06f17e772b2f22d52 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:18 np0005604215.localdomain sudo[40193]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:18 np0005604215.localdomain sudo[40255]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uidgyghejxmaqnesscsuoyhjxjygtmvv ; /usr/bin/python3
Feb 01 07:48:18 np0005604215.localdomain sudo[40255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:18 np0005604215.localdomain python3[40257]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:18 np0005604215.localdomain sudo[40255]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:19 np0005604215.localdomain sudo[40298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irkevvzmbbjmbgsedfrksjmitvnjbizc ; /usr/bin/python3
Feb 01 07:48:19 np0005604215.localdomain sudo[40298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:19 np0005604215.localdomain python3[40300]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932098.6073034-70815-63759684348808/source dest=/etc/puppet/hieradata/service_configs.json mode=None follow=False _original_basename=service_configs.j2 checksum=b186a8b61d7f8cda474e1db6d9f709185a517ec4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:19 np0005604215.localdomain sudo[40298]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:19 np0005604215.localdomain sudo[40360]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjwkhutjttmlvtnvsmstvrpvivsshmfk ; /usr/bin/python3
Feb 01 07:48:19 np0005604215.localdomain sudo[40360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:19 np0005604215.localdomain python3[40362]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:19 np0005604215.localdomain sudo[40360]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:20 np0005604215.localdomain sudo[40403]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvbdkdirxepicmvhlxveecpzwfffuxym ; /usr/bin/python3
Feb 01 07:48:20 np0005604215.localdomain sudo[40403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:20 np0005604215.localdomain python3[40405]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932099.4436457-70815-195874346355848/source dest=/etc/puppet/hieradata/extraconfig.json mode=None follow=False _original_basename=extraconfig.j2 checksum=5f36b2ea290645ee34d943220a14b54ee5ea5be5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:20 np0005604215.localdomain sudo[40403]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:20 np0005604215.localdomain sudo[40465]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpljiwbppsekhadnzunjtgxwxbhxjzzw ; /usr/bin/python3
Feb 01 07:48:20 np0005604215.localdomain sudo[40465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:20 np0005604215.localdomain python3[40467]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:20 np0005604215.localdomain sudo[40465]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:20 np0005604215.localdomain sudo[40508]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqsduurmombroakomtnrylkiuoszezwd ; /usr/bin/python3
Feb 01 07:48:20 np0005604215.localdomain sudo[40508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:20 np0005604215.localdomain python3[40510]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932100.3453958-70815-50343333910865/source dest=/etc/puppet/hieradata/role_extraconfig.json mode=None follow=False _original_basename=role_extraconfig.j2 checksum=34875968bf996542162e620523f9dcfb3deac331 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:21 np0005604215.localdomain sudo[40508]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:21 np0005604215.localdomain sudo[40570]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwybrvjhuoszvuxvvmzpetxmhhufhgic ; /usr/bin/python3
Feb 01 07:48:21 np0005604215.localdomain sudo[40570]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:21 np0005604215.localdomain python3[40572]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:21 np0005604215.localdomain sudo[40570]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:21 np0005604215.localdomain sudo[40613]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kohzieclxzovhoqobkewciyrwjjuvdji ; /usr/bin/python3
Feb 01 07:48:21 np0005604215.localdomain sudo[40613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:21 np0005604215.localdomain python3[40615]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932101.1587975-70815-155741913345682/source dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json mode=None follow=False _original_basename=ovn_chassis_mac_map.j2 checksum=3d5ed7edeabd971026d9e415515c8db40416d5cd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:21 np0005604215.localdomain sudo[40613]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:22 np0005604215.localdomain sudo[40643]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwiphkvfijrqmowifrjbspzcozmdlihf ; /usr/bin/python3
Feb 01 07:48:22 np0005604215.localdomain sudo[40643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:22 np0005604215.localdomain python3[40645]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:48:22 np0005604215.localdomain sudo[40643]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:23 np0005604215.localdomain sudo[40691]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upuvjcpiriymkacejpplwcpjukrgnmzb ; /usr/bin/python3
Feb 01 07:48:23 np0005604215.localdomain sudo[40691]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:23 np0005604215.localdomain python3[40693]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:48:23 np0005604215.localdomain sudo[40691]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:23 np0005604215.localdomain sudo[40734]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msbnditxwqpaiabeljddrimggpzkkswf ; /usr/bin/python3
Feb 01 07:48:23 np0005604215.localdomain sudo[40734]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:23 np0005604215.localdomain python3[40736]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/ansible_managed.json owner=root group=root mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932103.079721-71631-132883807427961/source _original_basename=tmpaj41dn9z follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:48:23 np0005604215.localdomain sudo[40734]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:28 np0005604215.localdomain sudo[40764]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fntzdpftjyunhachcrmwzcrhevmnngyf ; /usr/bin/python3
Feb 01 07:48:28 np0005604215.localdomain sudo[40764]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:28 np0005604215.localdomain python3[40766]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_default_ipv4'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 07:48:28 np0005604215.localdomain sudo[40764]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:28 np0005604215.localdomain sudo[40825]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdpaqbpjxzqvuzggjxzfznssmwuoxtyr ; /usr/bin/python3
Feb 01 07:48:28 np0005604215.localdomain sudo[40825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:28 np0005604215.localdomain python3[40827]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 38.102.83.1 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:48:33 np0005604215.localdomain sudo[40825]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:33 np0005604215.localdomain sudo[40842]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-matjqoylupwcicttlwdlrsqkwyysjqgy ; /usr/bin/python3
Feb 01 07:48:33 np0005604215.localdomain sudo[40842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:33 np0005604215.localdomain python3[40844]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.10 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:48:37 np0005604215.localdomain sudo[40842]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:38 np0005604215.localdomain sudo[40859]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlwfsjqbexpflwxrirolfjgsgnahyemt ; /usr/bin/python3
Feb 01 07:48:38 np0005604215.localdomain sudo[40859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:38 np0005604215.localdomain python3[40861]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 192.168.122.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:48:38 np0005604215.localdomain sudo[40859]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:39 np0005604215.localdomain sudo[40882]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clkueagsyjzmswfdanmxicqgodgmuodq ; /usr/bin/python3
Feb 01 07:48:39 np0005604215.localdomain sudo[40882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:39 np0005604215.localdomain python3[40884]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 192.168.122.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:48:40 np0005604215.localdomain systemd[35763]: Starting Mark boot as successful...
Feb 01 07:48:40 np0005604215.localdomain systemd[35763]: Finished Mark boot as successful.
Feb 01 07:48:43 np0005604215.localdomain sudo[40882]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:43 np0005604215.localdomain sudo[40900]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcqalogynnpxuvdsnryzyyytuimgrfkn ; /usr/bin/python3
Feb 01 07:48:43 np0005604215.localdomain sudo[40900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:43 np0005604215.localdomain python3[40902]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.18.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:48:43 np0005604215.localdomain sudo[40900]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:44 np0005604215.localdomain sudo[40923]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sifxsucovahovmgnjqsqbymsaxuhbori ; /usr/bin/python3
Feb 01 07:48:44 np0005604215.localdomain sudo[40923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:44 np0005604215.localdomain python3[40925]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:48:48 np0005604215.localdomain sudo[40923]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:48 np0005604215.localdomain sudo[40940]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qoonqncwowgxicncmzhwasriewmcdeyp ; /usr/bin/python3
Feb 01 07:48:48 np0005604215.localdomain sudo[40940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:48 np0005604215.localdomain python3[40942]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.18.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:48:52 np0005604215.localdomain sudo[40940]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:53 np0005604215.localdomain sudo[40957]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhrgyeyyrnraswiozwnbbvuljjirevku ; /usr/bin/python3
Feb 01 07:48:53 np0005604215.localdomain sudo[40957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:53 np0005604215.localdomain python3[40959]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.20.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:48:53 np0005604215.localdomain sudo[40957]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:53 np0005604215.localdomain sudo[40980]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqlnchmyllccbgmbqnxxrcsgxdihrcpy ; /usr/bin/python3
Feb 01 07:48:53 np0005604215.localdomain sudo[40980]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:53 np0005604215.localdomain python3[40982]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:48:57 np0005604215.localdomain sudo[40980]: pam_unix(sudo:session): session closed for user root
Feb 01 07:48:58 np0005604215.localdomain sudo[40997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhnjeyhzkystyhwqhjflytilnbzgnpkj ; /usr/bin/python3
Feb 01 07:48:58 np0005604215.localdomain sudo[40997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:48:58 np0005604215.localdomain python3[40999]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.20.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:02 np0005604215.localdomain sudo[40997]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:02 np0005604215.localdomain sudo[41014]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzvesvwgzoherofnvewmboqnvqfdjcaj ; /usr/bin/python3
Feb 01 07:49:02 np0005604215.localdomain sudo[41014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:02 np0005604215.localdomain python3[41016]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.17.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:02 np0005604215.localdomain sudo[41014]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:03 np0005604215.localdomain sudo[41037]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wliuihvlykhuzrthrnqfnzoojanphmbb ; /usr/bin/python3
Feb 01 07:49:03 np0005604215.localdomain sudo[41037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:03 np0005604215.localdomain python3[41039]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:07 np0005604215.localdomain sudo[41037]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:07 np0005604215.localdomain sudo[41054]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwouqoueprfwrgpxbxtvbpzmuewduxcm ; /usr/bin/python3
Feb 01 07:49:07 np0005604215.localdomain sudo[41054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:07 np0005604215.localdomain python3[41056]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.17.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:11 np0005604215.localdomain sudo[41054]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:12 np0005604215.localdomain sudo[41071]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixyqfftqqdzavlvdfhiwcvksydykfylw ; /usr/bin/python3
Feb 01 07:49:12 np0005604215.localdomain sudo[41071]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:12 np0005604215.localdomain python3[41073]: ansible-ansible.legacy.command Invoked with _raw_params=INT=$(ip ro get 172.19.0.106 | head -1 | sed -nr "s/.* dev (\w+) .*/\1/p")
                                                         MTU=$(cat /sys/class/net/${INT}/mtu 2>/dev/null || echo "0")
                                                         echo "$INT $MTU"
                                                          _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:12 np0005604215.localdomain sudo[41071]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:12 np0005604215.localdomain sudo[41094]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-viufphabpomyvntpaeomszismpxypfhw ; /usr/bin/python3
Feb 01 07:49:12 np0005604215.localdomain sudo[41094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:12 np0005604215.localdomain python3[41096]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:16 np0005604215.localdomain sudo[41094]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:16 np0005604215.localdomain sudo[41111]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ckbvvnbnekbhqdeygpvgjdfttspqhhak ; /usr/bin/python3
Feb 01 07:49:16 np0005604215.localdomain sudo[41111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:17 np0005604215.localdomain python3[41113]: ansible-ansible.legacy.command Invoked with _raw_params=ping -w 10 -s 1472 -c 5 172.19.0.106 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:17 np0005604215.localdomain sudo[41115]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:49:17 np0005604215.localdomain sudo[41115]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:49:17 np0005604215.localdomain sudo[41115]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:17 np0005604215.localdomain sudo[41130]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:49:17 np0005604215.localdomain sudo[41130]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:49:18 np0005604215.localdomain sudo[41130]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:20 np0005604215.localdomain sudo[41178]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:49:20 np0005604215.localdomain sudo[41178]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:49:20 np0005604215.localdomain sudo[41178]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:21 np0005604215.localdomain sudo[41111]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:22 np0005604215.localdomain sudo[41206]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkjpcqrifnnedswfbhkzubetvscxthcv ; /usr/bin/python3
Feb 01 07:49:22 np0005604215.localdomain sudo[41206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:22 np0005604215.localdomain python3[41208]: ansible-file Invoked with path=/etc/puppet/hieradata state=directory mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:22 np0005604215.localdomain sudo[41206]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:22 np0005604215.localdomain sudo[41254]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mmsvmhibtphwohlihjvaadkkdbzaymdu ; /usr/bin/python3
Feb 01 07:49:22 np0005604215.localdomain sudo[41254]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:22 np0005604215.localdomain python3[41256]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hiera.yaml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:22 np0005604215.localdomain sudo[41254]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:22 np0005604215.localdomain sudo[41272]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iukteeagslkrgbwmrrjecnjuxwtgekhw ; /usr/bin/python3
Feb 01 07:49:22 np0005604215.localdomain sudo[41272]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:22 np0005604215.localdomain python3[41274]: ansible-ansible.legacy.file Invoked with mode=384 dest=/etc/puppet/hiera.yaml _original_basename=tmpjdco962d recurse=False state=file path=/etc/puppet/hiera.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:22 np0005604215.localdomain sudo[41272]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:23 np0005604215.localdomain sudo[41302]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxkajycrugljolamnxuffjdzwfxyyspj ; /usr/bin/python3
Feb 01 07:49:23 np0005604215.localdomain sudo[41302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:23 np0005604215.localdomain python3[41304]: ansible-file Invoked with src=/etc/puppet/hiera.yaml dest=/etc/hiera.yaml state=link force=True path=/etc/hiera.yaml recurse=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:23 np0005604215.localdomain sudo[41302]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:24 np0005604215.localdomain sudo[41350]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vjkjfyyjkqyhxgcjnongzyptftwpggcf ; /usr/bin/python3
Feb 01 07:49:24 np0005604215.localdomain sudo[41350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:24 np0005604215.localdomain python3[41352]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/all_nodes.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:24 np0005604215.localdomain sudo[41350]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:24 np0005604215.localdomain sudo[41368]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkrpkbhuqpcyytkfltqunldhanhmprsm ; /usr/bin/python3
Feb 01 07:49:24 np0005604215.localdomain sudo[41368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:24 np0005604215.localdomain python3[41370]: ansible-ansible.legacy.file Invoked with dest=/etc/puppet/hieradata/all_nodes.json _original_basename=overcloud.json recurse=False state=file path=/etc/puppet/hieradata/all_nodes.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:24 np0005604215.localdomain sudo[41368]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:24 np0005604215.localdomain sudo[41430]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmzlxzxauzbrrvggsnhepcivlhhbkeer ; /usr/bin/python3
Feb 01 07:49:24 np0005604215.localdomain sudo[41430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:25 np0005604215.localdomain python3[41432]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/bootstrap_node.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:25 np0005604215.localdomain sudo[41430]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:25 np0005604215.localdomain sudo[41448]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfwzmbnamachrdezwrntcueaohovvkzg ; /usr/bin/python3
Feb 01 07:49:25 np0005604215.localdomain sudo[41448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:25 np0005604215.localdomain python3[41450]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/bootstrap_node.json _original_basename=bootstrap_node.j2 recurse=False state=file path=/etc/puppet/hieradata/bootstrap_node.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:25 np0005604215.localdomain sudo[41448]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:25 np0005604215.localdomain sudo[41510]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yizjjrytyrijlpyrxjcfzxmzdltrhkvm ; /usr/bin/python3
Feb 01 07:49:25 np0005604215.localdomain sudo[41510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:25 np0005604215.localdomain python3[41512]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/vip_data.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:25 np0005604215.localdomain sudo[41510]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:25 np0005604215.localdomain sudo[41528]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uokgdhaknlnjyfsdppwxfnnewbwwdmvy ; /usr/bin/python3
Feb 01 07:49:25 np0005604215.localdomain sudo[41528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:26 np0005604215.localdomain python3[41530]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/vip_data.json _original_basename=vip_data.j2 recurse=False state=file path=/etc/puppet/hieradata/vip_data.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:26 np0005604215.localdomain sudo[41528]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:26 np0005604215.localdomain sudo[41590]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztxuiequjqyodvixwybcczmhhpnmxfcb ; /usr/bin/python3
Feb 01 07:49:26 np0005604215.localdomain sudo[41590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:26 np0005604215.localdomain python3[41592]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/net_ip_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:26 np0005604215.localdomain sudo[41590]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:26 np0005604215.localdomain sudo[41608]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxtmnvfhnonyuxwcqiqhsquciwyhasfz ; /usr/bin/python3
Feb 01 07:49:26 np0005604215.localdomain sudo[41608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:26 np0005604215.localdomain python3[41610]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/net_ip_map.json _original_basename=net_ip_map.j2 recurse=False state=file path=/etc/puppet/hieradata/net_ip_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:26 np0005604215.localdomain sudo[41608]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:27 np0005604215.localdomain sudo[41670]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obgdujuiazexfintqdpjcgrelqaqeesd ; /usr/bin/python3
Feb 01 07:49:27 np0005604215.localdomain sudo[41670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:27 np0005604215.localdomain python3[41672]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/cloud_domain.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:27 np0005604215.localdomain sudo[41670]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:27 np0005604215.localdomain sudo[41688]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbokjsolzlcobghvnwaqpdlscachcyem ; /usr/bin/python3
Feb 01 07:49:27 np0005604215.localdomain sudo[41688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:27 np0005604215.localdomain python3[41690]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/cloud_domain.json _original_basename=cloud_domain.j2 recurse=False state=file path=/etc/puppet/hieradata/cloud_domain.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:27 np0005604215.localdomain sudo[41688]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:27 np0005604215.localdomain sudo[41750]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvzlieaaybqcownmnlduzhokgumgfnrk ; /usr/bin/python3
Feb 01 07:49:27 np0005604215.localdomain sudo[41750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:28 np0005604215.localdomain python3[41752]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/fqdn.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:28 np0005604215.localdomain sudo[41750]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:28 np0005604215.localdomain sudo[41768]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfvypdlcbuhuihxbkxbtfuonhwbobunv ; /usr/bin/python3
Feb 01 07:49:28 np0005604215.localdomain sudo[41768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:28 np0005604215.localdomain python3[41770]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/fqdn.json _original_basename=fqdn.j2 recurse=False state=file path=/etc/puppet/hieradata/fqdn.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:28 np0005604215.localdomain sudo[41768]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:28 np0005604215.localdomain sudo[41830]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdigzyfoywjmzjvvkhzodpwdhfmvhuyt ; /usr/bin/python3
Feb 01 07:49:28 np0005604215.localdomain sudo[41830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:28 np0005604215.localdomain python3[41832]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_names.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:28 np0005604215.localdomain sudo[41830]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:28 np0005604215.localdomain sudo[41848]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ampxphevqcahpfkrpxgycyetakdlvjvr ; /usr/bin/python3
Feb 01 07:49:28 np0005604215.localdomain sudo[41848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:28 np0005604215.localdomain python3[41850]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_names.json _original_basename=service_names.j2 recurse=False state=file path=/etc/puppet/hieradata/service_names.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:28 np0005604215.localdomain sudo[41848]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:29 np0005604215.localdomain sudo[41910]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkbmsybjverwaxgvqdivkmbqumdetscp ; /usr/bin/python3
Feb 01 07:49:29 np0005604215.localdomain sudo[41910]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:29 np0005604215.localdomain python3[41912]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/service_configs.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:29 np0005604215.localdomain sudo[41910]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:29 np0005604215.localdomain sudo[41928]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjxfoawsdbhsondgonllfrkodopreusb ; /usr/bin/python3
Feb 01 07:49:29 np0005604215.localdomain sudo[41928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:29 np0005604215.localdomain python3[41930]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/service_configs.json _original_basename=service_configs.j2 recurse=False state=file path=/etc/puppet/hieradata/service_configs.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:29 np0005604215.localdomain sudo[41928]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:29 np0005604215.localdomain sudo[41990]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxsgfbsjuiklknxuelyjkrvxlvfmkvvm ; /usr/bin/python3
Feb 01 07:49:29 np0005604215.localdomain sudo[41990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:30 np0005604215.localdomain python3[41992]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:30 np0005604215.localdomain sudo[41990]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:30 np0005604215.localdomain sudo[42008]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxlevpysloncuthutvgncnhpjffcuirn ; /usr/bin/python3
Feb 01 07:49:30 np0005604215.localdomain sudo[42008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:30 np0005604215.localdomain python3[42010]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/extraconfig.json _original_basename=extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:30 np0005604215.localdomain sudo[42008]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:30 np0005604215.localdomain sudo[42070]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msaoinaaikwiptyfzqtpubtkgjpenftr ; /usr/bin/python3
Feb 01 07:49:30 np0005604215.localdomain sudo[42070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:30 np0005604215.localdomain python3[42072]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/role_extraconfig.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:30 np0005604215.localdomain sudo[42070]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:30 np0005604215.localdomain sudo[42088]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlsbubseuiqdghiwvkclwgkqrsswbusd ; /usr/bin/python3
Feb 01 07:49:30 np0005604215.localdomain sudo[42088]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:31 np0005604215.localdomain python3[42090]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/role_extraconfig.json _original_basename=role_extraconfig.j2 recurse=False state=file path=/etc/puppet/hieradata/role_extraconfig.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:31 np0005604215.localdomain sudo[42088]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:31 np0005604215.localdomain sudo[42150]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dbfdmnlkklxpbxygbijwfammtjhqthnr ; /usr/bin/python3
Feb 01 07:49:31 np0005604215.localdomain sudo[42150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:31 np0005604215.localdomain python3[42152]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ovn_chassis_mac_map.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:31 np0005604215.localdomain sudo[42150]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:31 np0005604215.localdomain sudo[42168]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-npqhomjnndxytarowqwtepoffpydqass ; /usr/bin/python3
Feb 01 07:49:31 np0005604215.localdomain sudo[42168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:31 np0005604215.localdomain python3[42170]: ansible-ansible.legacy.file Invoked with mode=None dest=/etc/puppet/hieradata/ovn_chassis_mac_map.json _original_basename=ovn_chassis_mac_map.j2 recurse=False state=file path=/etc/puppet/hieradata/ovn_chassis_mac_map.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:31 np0005604215.localdomain sudo[42168]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:32 np0005604215.localdomain sudo[42198]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwwtudxhuqervcdzrxasdqnhgpivvxqs ; /usr/bin/python3
Feb 01 07:49:32 np0005604215.localdomain sudo[42198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:32 np0005604215.localdomain python3[42200]: ansible-stat Invoked with path={'src': '/etc/puppet/hieradata/ansible_managed.json'} follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:49:32 np0005604215.localdomain sudo[42198]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:33 np0005604215.localdomain sudo[42246]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkfmdaaiqzxtcznwdddryibsvqokztbi ; /usr/bin/python3
Feb 01 07:49:33 np0005604215.localdomain sudo[42246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:33 np0005604215.localdomain python3[42248]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/ansible_managed.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:33 np0005604215.localdomain sudo[42246]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:33 np0005604215.localdomain sudo[42264]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xopbcrmtpocldfnesvggiygmjkrdzkkj ; /usr/bin/python3
Feb 01 07:49:33 np0005604215.localdomain sudo[42264]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:33 np0005604215.localdomain python3[42266]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=0644 dest=/etc/puppet/hieradata/ansible_managed.json _original_basename=tmpnk58qvu3 recurse=False state=file path=/etc/puppet/hieradata/ansible_managed.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:33 np0005604215.localdomain sudo[42264]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:36 np0005604215.localdomain sudo[42294]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-srplmzoobhayijjfiefesktzyrwusugi ; /usr/bin/python3
Feb 01 07:49:36 np0005604215.localdomain sudo[42294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:36 np0005604215.localdomain python3[42296]: ansible-dnf Invoked with name=['firewalld'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:49:38 np0005604215.localdomain sudo[42294]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:40 np0005604215.localdomain sudo[42311]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfzqzqgxbhdkbuwmkzftxtaqfpezsdsc ; /usr/bin/python3
Feb 01 07:49:40 np0005604215.localdomain sudo[42311]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:41 np0005604215.localdomain python3[42313]: ansible-ansible.builtin.systemd Invoked with name=iptables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:49:41 np0005604215.localdomain sudo[42311]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:41 np0005604215.localdomain sudo[42329]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sedrxdmiunhaivhtmvvdbfseoixyfufy ; /usr/bin/python3
Feb 01 07:49:41 np0005604215.localdomain sudo[42329]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:41 np0005604215.localdomain python3[42331]: ansible-ansible.builtin.systemd Invoked with name=ip6tables.service state=stopped enabled=False daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:49:41 np0005604215.localdomain sudo[42329]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:41 np0005604215.localdomain sudo[42347]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbbpscpfabggedjbnekhgafozskwdbjg ; /usr/bin/python3
Feb 01 07:49:41 np0005604215.localdomain sudo[42347]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:42 np0005604215.localdomain python3[42349]: ansible-ansible.builtin.systemd Invoked with name=nftables state=started enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:49:42 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:49:42 np0005604215.localdomain systemd-sysv-generator[42379]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:49:42 np0005604215.localdomain systemd-rc-local-generator[42376]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:49:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:49:42 np0005604215.localdomain systemd[1]: Starting Netfilter Tables...
Feb 01 07:49:42 np0005604215.localdomain systemd[1]: Finished Netfilter Tables.
Feb 01 07:49:42 np0005604215.localdomain sudo[42347]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:43 np0005604215.localdomain sudo[42437]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhfcsetjbefavdepoyfovxpuvebcwxjj ; /usr/bin/python3
Feb 01 07:49:43 np0005604215.localdomain sudo[42437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:43 np0005604215.localdomain python3[42439]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:43 np0005604215.localdomain sudo[42437]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:43 np0005604215.localdomain sudo[42480]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryqfgaluskybtoxlokjghmumfbozgfyz ; /usr/bin/python3
Feb 01 07:49:43 np0005604215.localdomain sudo[42480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:43 np0005604215.localdomain python3[42482]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932182.985839-74376-100395968078890/source _original_basename=iptables.nft follow=False checksum=ede9860c99075946a7bc827210247aac639bc84a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:43 np0005604215.localdomain sudo[42480]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:43 np0005604215.localdomain sudo[42510]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcfoatuapbjretxszkyzavsytrxuctel ; /usr/bin/python3
Feb 01 07:49:43 np0005604215.localdomain sudo[42510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:44 np0005604215.localdomain python3[42512]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:44 np0005604215.localdomain sudo[42510]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:44 np0005604215.localdomain sudo[42528]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewapifndpgbucdzfjaxumsuuqfnffohw ; /usr/bin/python3
Feb 01 07:49:44 np0005604215.localdomain sudo[42528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:44 np0005604215.localdomain python3[42530]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:44 np0005604215.localdomain sudo[42528]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:45 np0005604215.localdomain sudo[42577]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhkynjxsnxewqbdjsfhxqdenrykmxmwp ; /usr/bin/python3
Feb 01 07:49:45 np0005604215.localdomain sudo[42577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:45 np0005604215.localdomain python3[42579]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:45 np0005604215.localdomain sudo[42577]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:45 np0005604215.localdomain sudo[42620]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsehwwslohvwmtbtiqwswrbpblwsmzff ; /usr/bin/python3
Feb 01 07:49:45 np0005604215.localdomain sudo[42620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:45 np0005604215.localdomain python3[42622]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932184.8879824-74487-153447734538570/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:45 np0005604215.localdomain sudo[42620]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:46 np0005604215.localdomain sudo[42682]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onxcuvfamocrlroygdaddzmvwfmoyijr ; /usr/bin/python3
Feb 01 07:49:46 np0005604215.localdomain sudo[42682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:46 np0005604215.localdomain python3[42684]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-update-jumps.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:46 np0005604215.localdomain sudo[42682]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:46 np0005604215.localdomain sudo[42725]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dywuqxshhyllbowamyukfnfaolnzkcqc ; /usr/bin/python3
Feb 01 07:49:46 np0005604215.localdomain sudo[42725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:46 np0005604215.localdomain python3[42727]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-update-jumps.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932185.8449843-74545-210916059093489/source mode=None follow=False _original_basename=jump-chain.j2 checksum=eec306c3276262a27663d76bd0ea526457445afa backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:46 np0005604215.localdomain sudo[42725]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:46 np0005604215.localdomain sudo[42787]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjcdibbknulxhkwlvusdguunxflhemtf ; /usr/bin/python3
Feb 01 07:49:47 np0005604215.localdomain sudo[42787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:47 np0005604215.localdomain python3[42789]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-flushes.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:47 np0005604215.localdomain sudo[42787]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:47 np0005604215.localdomain sudo[42830]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncqqtfpywqiyhyfpqvgwsdvtmydmlkyh ; /usr/bin/python3
Feb 01 07:49:47 np0005604215.localdomain sudo[42830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:47 np0005604215.localdomain python3[42832]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-flushes.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932186.8493836-74611-201200344791116/source mode=None follow=False _original_basename=flush-chain.j2 checksum=e8e7b8db0d61a7fe393441cc91613f470eb34a6e backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:47 np0005604215.localdomain sudo[42830]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:47 np0005604215.localdomain sudo[42892]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hettkmwzrmcdwbsdrikczcarzzjsbuuu ; /usr/bin/python3
Feb 01 07:49:47 np0005604215.localdomain sudo[42892]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:48 np0005604215.localdomain python3[42894]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-chains.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:48 np0005604215.localdomain sudo[42892]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:48 np0005604215.localdomain sudo[42935]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muantymsfhufxoiqkzhgyqkgvmlcaxwt ; /usr/bin/python3
Feb 01 07:49:48 np0005604215.localdomain sudo[42935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:48 np0005604215.localdomain python3[42937]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-chains.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932187.7543635-74857-56068773496544/source mode=None follow=False _original_basename=chains.j2 checksum=e60ee651f5014e83924f4e901ecc8e25b1906610 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:48 np0005604215.localdomain sudo[42935]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:49 np0005604215.localdomain sudo[42997]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnuedmwoyvtvsuvtwdsdxjnnbyvrnnah ; /usr/bin/python3
Feb 01 07:49:49 np0005604215.localdomain sudo[42997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:49 np0005604215.localdomain python3[42999]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/tripleo-rules.nft follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:49:49 np0005604215.localdomain sudo[42997]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:49 np0005604215.localdomain sudo[43040]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yylngkserkmfiapnpiiellmhouwcupdx ; /usr/bin/python3
Feb 01 07:49:49 np0005604215.localdomain sudo[43040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:49 np0005604215.localdomain python3[43042]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/tripleo-rules.nft src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932188.643439-74907-38474285717260/source mode=None follow=False _original_basename=ruleset.j2 checksum=0444e4206083f91e2fb2aabfa2928244c2db35ed backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:49 np0005604215.localdomain sudo[43040]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:49 np0005604215.localdomain sudo[43070]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mintmaetzsvqangqrnaoydrbrdnwfbay ; /usr/bin/python3
Feb 01 07:49:49 np0005604215.localdomain sudo[43070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:50 np0005604215.localdomain python3[43072]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-chains.nft /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft /etc/nftables/tripleo-jumps.nft | nft -c -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:50 np0005604215.localdomain sudo[43070]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:50 np0005604215.localdomain sudo[43135]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juhwyegsrawtfpwgievxxbavnkvnrxxw ; /usr/bin/python3
Feb 01 07:49:50 np0005604215.localdomain sudo[43135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:50 np0005604215.localdomain python3[43137]: ansible-ansible.builtin.blockinfile Invoked with path=/etc/sysconfig/nftables.conf backup=False validate=nft -c -f %s block=include "/etc/nftables/iptables.nft"
                                                         include "/etc/nftables/tripleo-chains.nft"
                                                         include "/etc/nftables/tripleo-rules.nft"
                                                         include "/etc/nftables/tripleo-jumps.nft"
                                                          state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:49:50 np0005604215.localdomain sudo[43135]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:51 np0005604215.localdomain sudo[43152]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffamyvcpggkzbpmokaigbwdmvgaivlgv ; /usr/bin/python3
Feb 01 07:49:51 np0005604215.localdomain sudo[43152]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:51 np0005604215.localdomain python3[43154]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/tripleo-chains.nft _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:51 np0005604215.localdomain sudo[43152]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:52 np0005604215.localdomain sudo[43169]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keozqxxebwdouuwrpkyrswijkmaaezeb ; /usr/bin/python3
Feb 01 07:49:52 np0005604215.localdomain sudo[43169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:52 np0005604215.localdomain python3[43171]: ansible-ansible.legacy.command Invoked with _raw_params=cat /etc/nftables/tripleo-flushes.nft /etc/nftables/tripleo-rules.nft /etc/nftables/tripleo-update-jumps.nft | nft -f - _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:49:52 np0005604215.localdomain sudo[43169]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:52 np0005604215.localdomain sudo[43188]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzelxlmrabyqyhdopklrcnxmpccpfutv ; /usr/bin/python3
Feb 01 07:49:52 np0005604215.localdomain sudo[43188]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:52 np0005604215.localdomain python3[43190]: ansible-file Invoked with mode=0750 path=/var/log/containers/collectd setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:49:52 np0005604215.localdomain sudo[43188]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:52 np0005604215.localdomain sudo[43204]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqilfhsvgwaafqbpjmoqudvgzilpikxw ; /usr/bin/python3
Feb 01 07:49:52 np0005604215.localdomain sudo[43204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:52 np0005604215.localdomain python3[43206]: ansible-file Invoked with mode=0755 path=/var/lib/container-user-scripts/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:49:52 np0005604215.localdomain sudo[43204]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:53 np0005604215.localdomain sudo[43220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsclrwvprlogijipgeisdsbghkivwoju ; /usr/bin/python3
Feb 01 07:49:53 np0005604215.localdomain sudo[43220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:53 np0005604215.localdomain python3[43222]: ansible-file Invoked with mode=0750 path=/var/log/containers/ceilometer setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:49:53 np0005604215.localdomain sudo[43220]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:53 np0005604215.localdomain sudo[43236]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osyyccdpcuhytxfrccxjkiwekgpziolp ; /usr/bin/python3
Feb 01 07:49:53 np0005604215.localdomain sudo[43236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:53 np0005604215.localdomain python3[43238]: ansible-seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 01 07:49:54 np0005604215.localdomain sudo[43236]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:54 np0005604215.localdomain sudo[43256]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzdcvlfzypotkhfgxqqaoxjohjficgav ; /usr/bin/python3
Feb 01 07:49:54 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=7 res=1
Feb 01 07:49:54 np0005604215.localdomain sudo[43256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:55 np0005604215.localdomain python3[43258]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 01 07:49:55 np0005604215.localdomain kernel: SELinux:  Converting 2702 SID table entries...
Feb 01 07:49:55 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 07:49:55 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 07:49:55 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 07:49:55 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 07:49:55 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 07:49:55 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 07:49:55 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 07:49:55 np0005604215.localdomain sudo[43256]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:55 np0005604215.localdomain sudo[43277]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnlhvdafjqflvlhcombvctvjhetbqpua ; /usr/bin/python3
Feb 01 07:49:55 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Feb 01 07:49:55 np0005604215.localdomain sudo[43277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:56 np0005604215.localdomain python3[43279]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/etc/target(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 01 07:49:56 np0005604215.localdomain kernel: SELinux:  Converting 2702 SID table entries...
Feb 01 07:49:56 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 07:49:56 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 07:49:56 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 07:49:56 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 07:49:56 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 07:49:56 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 07:49:56 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 07:49:57 np0005604215.localdomain sudo[43277]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:57 np0005604215.localdomain sudo[43298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmrjzrnqwdzmhlsdvspgbhtfyxjmaytk ; /usr/bin/python3
Feb 01 07:49:57 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Feb 01 07:49:57 np0005604215.localdomain sudo[43298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:57 np0005604215.localdomain python3[43300]: ansible-community.general.sefcontext Invoked with setype=container_file_t state=present target=/var/lib/iscsi(/.*)? ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 01 07:49:58 np0005604215.localdomain kernel: SELinux:  Converting 2702 SID table entries...
Feb 01 07:49:58 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 07:49:58 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 07:49:58 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 07:49:58 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 07:49:58 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 07:49:58 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 07:49:58 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 07:49:58 np0005604215.localdomain sudo[43298]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:58 np0005604215.localdomain sudo[43319]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-johewzxfrgbfftpjzqnuizoscsrpzdlu ; /usr/bin/python3
Feb 01 07:49:58 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Feb 01 07:49:58 np0005604215.localdomain sudo[43319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:58 np0005604215.localdomain python3[43321]: ansible-file Invoked with path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:49:58 np0005604215.localdomain sudo[43319]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:58 np0005604215.localdomain sudo[43335]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbosqrtwdkaqzsrvjwofotvkypvtcvlv ; /usr/bin/python3
Feb 01 07:49:58 np0005604215.localdomain sudo[43335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:58 np0005604215.localdomain python3[43337]: ansible-file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:49:59 np0005604215.localdomain sudo[43335]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:59 np0005604215.localdomain sudo[43351]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivdckvqxlbtxlhipncdeoytbuczwmraf ; /usr/bin/python3
Feb 01 07:49:59 np0005604215.localdomain sudo[43351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:59 np0005604215.localdomain python3[43353]: ansible-file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:49:59 np0005604215.localdomain sudo[43351]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:59 np0005604215.localdomain sudo[43367]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vymzjyevcutnfvacrwvkxaeffjywrpzk ; /usr/bin/python3
Feb 01 07:49:59 np0005604215.localdomain sudo[43367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:49:59 np0005604215.localdomain python3[43369]: ansible-stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:49:59 np0005604215.localdomain sudo[43367]: pam_unix(sudo:session): session closed for user root
Feb 01 07:49:59 np0005604215.localdomain sudo[43383]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpfmhtwzhyumktlkacivtxsbdevchoqx ; /usr/bin/python3
Feb 01 07:49:59 np0005604215.localdomain sudo[43383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:00 np0005604215.localdomain python3[43385]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-enabled --quiet iscsi.service _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:00 np0005604215.localdomain sudo[43383]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:00 np0005604215.localdomain sudo[43400]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umwiatoqzhtnlbdvyakjlundeofvcbit ; /usr/bin/python3
Feb 01 07:50:00 np0005604215.localdomain sudo[43400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:00 np0005604215.localdomain python3[43402]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:50:04 np0005604215.localdomain sudo[43400]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:04 np0005604215.localdomain sudo[43417]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gggtzokydzywczsbbkbdafxdgnsbwsdy ; /usr/bin/python3
Feb 01 07:50:04 np0005604215.localdomain sudo[43417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:04 np0005604215.localdomain python3[43419]: ansible-file Invoked with path=/etc/modules-load.d state=directory mode=493 owner=root group=root setype=etc_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:04 np0005604215.localdomain sudo[43417]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:04 np0005604215.localdomain sudo[43465]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdtjmbgcoxjocbnhyillzandavxfozjx ; /usr/bin/python3
Feb 01 07:50:04 np0005604215.localdomain sudo[43465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:04 np0005604215.localdomain python3[43467]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:04 np0005604215.localdomain sudo[43465]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:05 np0005604215.localdomain sudo[43508]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nhhrelsmgksakdjtludycjbkkfnyolpu ; /usr/bin/python3
Feb 01 07:50:05 np0005604215.localdomain sudo[43508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:05 np0005604215.localdomain python3[43510]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932204.5064027-75690-235995805173519/source dest=/etc/modules-load.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:05 np0005604215.localdomain sudo[43508]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:05 np0005604215.localdomain sudo[43538]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjktouoqdyrpeczzzdliuyvhjctejlpq ; /usr/bin/python3
Feb 01 07:50:05 np0005604215.localdomain sudo[43538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:05 np0005604215.localdomain python3[43540]: ansible-systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 07:50:05 np0005604215.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 01 07:50:05 np0005604215.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 01 07:50:05 np0005604215.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 01 07:50:05 np0005604215.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 01 07:50:05 np0005604215.localdomain kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Feb 01 07:50:05 np0005604215.localdomain kernel: Bridge firewalling registered
Feb 01 07:50:05 np0005604215.localdomain systemd-modules-load[43543]: Inserted module 'br_netfilter'
Feb 01 07:50:05 np0005604215.localdomain systemd-modules-load[43543]: Module 'msr' is built in
Feb 01 07:50:05 np0005604215.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 01 07:50:05 np0005604215.localdomain sudo[43538]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:06 np0005604215.localdomain sudo[43592]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klkbtwluiasxlkgdcakayizyemiqyelh ; /usr/bin/python3
Feb 01 07:50:06 np0005604215.localdomain sudo[43592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:06 np0005604215.localdomain python3[43594]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-tripleo.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:06 np0005604215.localdomain sudo[43592]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:06 np0005604215.localdomain sudo[43635]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjkqdoebysptdhfngmipcivfcczpyqfx ; /usr/bin/python3
Feb 01 07:50:06 np0005604215.localdomain sudo[43635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:06 np0005604215.localdomain python3[43637]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932205.9383452-75738-184500077012120/source dest=/etc/sysctl.d/99-tripleo.conf mode=420 owner=root group=root setype=etc_t follow=False _original_basename=tripleo-sysctl.conf.j2 checksum=cddb9401fdafaaf28a4a94b98448f98ae93c94c9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:06 np0005604215.localdomain sudo[43635]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:07 np0005604215.localdomain sudo[43665]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsiowmffqfdjbpwdezfajcopvtordmkr ; /usr/bin/python3
Feb 01 07:50:07 np0005604215.localdomain sudo[43665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:07 np0005604215.localdomain python3[43667]: ansible-sysctl Invoked with name=fs.aio-max-nr value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:07 np0005604215.localdomain sudo[43665]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:07 np0005604215.localdomain sudo[43682]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nearxhatfkhiwfbqdkdtwcaynmmqgjxs ; /usr/bin/python3
Feb 01 07:50:07 np0005604215.localdomain sudo[43682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:07 np0005604215.localdomain python3[43684]: ansible-sysctl Invoked with name=fs.inotify.max_user_instances value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:07 np0005604215.localdomain sudo[43682]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:07 np0005604215.localdomain sudo[43700]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fagvwwykwaokforeuaveebwpointfbdv ; /usr/bin/python3
Feb 01 07:50:07 np0005604215.localdomain sudo[43700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:07 np0005604215.localdomain python3[43702]: ansible-sysctl Invoked with name=kernel.pid_max value=1048576 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:07 np0005604215.localdomain sudo[43700]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:08 np0005604215.localdomain sudo[43718]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkfuvyawbzbkmqwlmojliblvopqgejbf ; /usr/bin/python3
Feb 01 07:50:08 np0005604215.localdomain sudo[43718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:08 np0005604215.localdomain python3[43720]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-arptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:08 np0005604215.localdomain sudo[43718]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:08 np0005604215.localdomain sudo[43735]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qseixthwevcqmwxhvgvlgzpxvltlsmum ; /usr/bin/python3
Feb 01 07:50:08 np0005604215.localdomain sudo[43735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:08 np0005604215.localdomain python3[43737]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-ip6tables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:08 np0005604215.localdomain sudo[43735]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:08 np0005604215.localdomain sudo[43752]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqlzmyljuaygzyblzgauqkgnlxhmrqkq ; /usr/bin/python3
Feb 01 07:50:08 np0005604215.localdomain sudo[43752]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:08 np0005604215.localdomain python3[43754]: ansible-sysctl Invoked with name=net.bridge.bridge-nf-call-iptables value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:08 np0005604215.localdomain sudo[43752]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:08 np0005604215.localdomain sudo[43769]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aecduripzbhghbzxnkuajasqpbosalxu ; /usr/bin/python3
Feb 01 07:50:08 np0005604215.localdomain sudo[43769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:09 np0005604215.localdomain python3[43771]: ansible-sysctl Invoked with name=net.ipv4.conf.all.rp_filter value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:09 np0005604215.localdomain sudo[43769]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:09 np0005604215.localdomain sudo[43787]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ickomwbzgfynprjvkapghyjdjgusutbs ; /usr/bin/python3
Feb 01 07:50:09 np0005604215.localdomain sudo[43787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:09 np0005604215.localdomain python3[43789]: ansible-sysctl Invoked with name=net.ipv4.ip_forward value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:09 np0005604215.localdomain sudo[43787]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:09 np0005604215.localdomain sudo[43805]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohyfzwuxbaklvqtpplhfzftmorsutqrl ; /usr/bin/python3
Feb 01 07:50:09 np0005604215.localdomain sudo[43805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:09 np0005604215.localdomain python3[43807]: ansible-sysctl Invoked with name=net.ipv4.ip_local_reserved_ports value=35357,49000-49001 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:09 np0005604215.localdomain sudo[43805]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:09 np0005604215.localdomain sudo[43823]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewcjcixcpolmvnqonpzlyrwzyrlhcyiq ; /usr/bin/python3
Feb 01 07:50:09 np0005604215.localdomain sudo[43823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:10 np0005604215.localdomain python3[43825]: ansible-sysctl Invoked with name=net.ipv4.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:10 np0005604215.localdomain sudo[43823]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:10 np0005604215.localdomain sudo[43841]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jcgmeyigqnjbjgbvewcfxfxhhjbnktvz ; /usr/bin/python3
Feb 01 07:50:10 np0005604215.localdomain sudo[43841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:10 np0005604215.localdomain python3[43843]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh1 value=1024 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:10 np0005604215.localdomain sudo[43841]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:10 np0005604215.localdomain sudo[43859]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwlngndmircosebwpbfwtywycdonphxx ; /usr/bin/python3
Feb 01 07:50:10 np0005604215.localdomain sudo[43859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:10 np0005604215.localdomain python3[43861]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh2 value=2048 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:10 np0005604215.localdomain sudo[43859]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:10 np0005604215.localdomain sudo[43877]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wehiqkqiwwakaiwawxnsvrszbpsgclsi ; /usr/bin/python3
Feb 01 07:50:10 np0005604215.localdomain sudo[43877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:10 np0005604215.localdomain python3[43879]: ansible-sysctl Invoked with name=net.ipv4.neigh.default.gc_thresh3 value=4096 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:10 np0005604215.localdomain sudo[43877]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:11 np0005604215.localdomain sudo[43895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyyvldelknqchbgglxdcovhlpghokyzc ; /usr/bin/python3
Feb 01 07:50:11 np0005604215.localdomain sudo[43895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:11 np0005604215.localdomain python3[43897]: ansible-sysctl Invoked with name=net.ipv6.conf.all.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:11 np0005604215.localdomain sudo[43895]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:11 np0005604215.localdomain sudo[43912]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qppfxlwyyukteooocfiwarxaltnrekfk ; /usr/bin/python3
Feb 01 07:50:11 np0005604215.localdomain sudo[43912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:11 np0005604215.localdomain python3[43914]: ansible-sysctl Invoked with name=net.ipv6.conf.all.forwarding value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:11 np0005604215.localdomain sudo[43912]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:11 np0005604215.localdomain sudo[43929]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhgmzoxopniugatkqvieamabhcpiykia ; /usr/bin/python3
Feb 01 07:50:11 np0005604215.localdomain sudo[43929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:11 np0005604215.localdomain python3[43931]: ansible-sysctl Invoked with name=net.ipv6.conf.default.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:11 np0005604215.localdomain sudo[43929]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:11 np0005604215.localdomain sudo[43946]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxnggphdczobbepynwwzgwgecypookis ; /usr/bin/python3
Feb 01 07:50:11 np0005604215.localdomain sudo[43946]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:12 np0005604215.localdomain python3[43948]: ansible-sysctl Invoked with name=net.ipv6.conf.lo.disable_ipv6 value=0 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:12 np0005604215.localdomain sudo[43946]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:12 np0005604215.localdomain sudo[43963]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zalnylqyxefdpqjrrhycvdgaebqfppqe ; /usr/bin/python3
Feb 01 07:50:12 np0005604215.localdomain sudo[43963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:12 np0005604215.localdomain python3[43965]: ansible-sysctl Invoked with name=net.ipv6.ip_nonlocal_bind value=1 sysctl_set=True state=present sysctl_file=/etc/sysctl.d/99-tripleo.conf reload=False ignoreerrors=False
Feb 01 07:50:12 np0005604215.localdomain sudo[43963]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:12 np0005604215.localdomain sudo[43981]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-illlyfqoscnvdqffoowjacrcygoddycg ; /usr/bin/python3
Feb 01 07:50:12 np0005604215.localdomain sudo[43981]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:12 np0005604215.localdomain python3[43983]: ansible-systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 07:50:12 np0005604215.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 01 07:50:12 np0005604215.localdomain systemd[1]: Stopped Apply Kernel Variables.
Feb 01 07:50:12 np0005604215.localdomain systemd[1]: Stopping Apply Kernel Variables...
Feb 01 07:50:12 np0005604215.localdomain systemd[1]: Starting Apply Kernel Variables...
Feb 01 07:50:12 np0005604215.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 01 07:50:12 np0005604215.localdomain systemd[1]: Finished Apply Kernel Variables.
Feb 01 07:50:12 np0005604215.localdomain sudo[43981]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:13 np0005604215.localdomain sudo[44001]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjgajsikugqkbkvfxdytsnzlelwglhre ; /usr/bin/python3
Feb 01 07:50:13 np0005604215.localdomain sudo[44001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:13 np0005604215.localdomain python3[44003]: ansible-file Invoked with mode=0750 path=/var/log/containers/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:13 np0005604215.localdomain sudo[44001]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:13 np0005604215.localdomain sudo[44017]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yatlkuxwayqslxzqdzzoqgeqmhgwanca ; /usr/bin/python3
Feb 01 07:50:13 np0005604215.localdomain sudo[44017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:13 np0005604215.localdomain python3[44019]: ansible-file Invoked with path=/var/lib/metrics_qdr setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:13 np0005604215.localdomain sudo[44017]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:13 np0005604215.localdomain sudo[44033]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwyvfxgojzsftaqimcocvxouvuqltzwr ; /usr/bin/python3
Feb 01 07:50:13 np0005604215.localdomain sudo[44033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:13 np0005604215.localdomain python3[44035]: ansible-file Invoked with mode=0750 path=/var/log/containers/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:13 np0005604215.localdomain sudo[44033]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:14 np0005604215.localdomain sudo[44049]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcwiwnrhidxpvfkyiaimcoansccwgapz ; /usr/bin/python3
Feb 01 07:50:14 np0005604215.localdomain sudo[44049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:14 np0005604215.localdomain python3[44051]: ansible-stat Invoked with path=/var/lib/nova/instances follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:50:14 np0005604215.localdomain sudo[44049]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:14 np0005604215.localdomain sudo[44065]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtmqykgewyhekmokucdbcckffehdfqsw ; /usr/bin/python3
Feb 01 07:50:14 np0005604215.localdomain sudo[44065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:14 np0005604215.localdomain python3[44067]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:14 np0005604215.localdomain sudo[44065]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:14 np0005604215.localdomain sudo[44081]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqrxtjglvkaugpbptsezeqbkyfipmdbx ; /usr/bin/python3
Feb 01 07:50:14 np0005604215.localdomain sudo[44081]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:14 np0005604215.localdomain python3[44083]: ansible-file Invoked with path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:14 np0005604215.localdomain sudo[44081]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:14 np0005604215.localdomain sudo[44097]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfxexyuqbkrachopzsuloysfpxreyuvt ; /usr/bin/python3
Feb 01 07:50:14 np0005604215.localdomain sudo[44097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:15 np0005604215.localdomain python3[44099]: ansible-file Invoked with path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:15 np0005604215.localdomain sudo[44097]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:15 np0005604215.localdomain sudo[44113]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dglyjhzvsozyjmhnzzgdjaqdhzivlfxz ; /usr/bin/python3
Feb 01 07:50:15 np0005604215.localdomain sudo[44113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:15 np0005604215.localdomain python3[44115]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:15 np0005604215.localdomain sudo[44113]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:15 np0005604215.localdomain sudo[44129]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpkdbrvxpsjgkoqftawzqmsenfnkqkeo ; /usr/bin/python3
Feb 01 07:50:15 np0005604215.localdomain sudo[44129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:15 np0005604215.localdomain python3[44131]: ansible-file Invoked with path=/etc/tmpfiles.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:15 np0005604215.localdomain sudo[44129]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:16 np0005604215.localdomain sudo[44177]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjbhxqskxjnfrhazpnqseppbusiaqhwz ; /usr/bin/python3
Feb 01 07:50:16 np0005604215.localdomain sudo[44177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:16 np0005604215.localdomain python3[44179]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-nova.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:16 np0005604215.localdomain sudo[44177]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:16 np0005604215.localdomain sudo[44220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnqnlwybbjetticymfocvliwodiyvgme ; /usr/bin/python3
Feb 01 07:50:16 np0005604215.localdomain sudo[44220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:16 np0005604215.localdomain python3[44222]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-nova.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932215.8917336-76107-114380676750876/source _original_basename=tmpuhz2h57s follow=False checksum=f834349098718ec09c7562bcb470b717a83ff411 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:16 np0005604215.localdomain sudo[44220]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:16 np0005604215.localdomain sudo[44250]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isoopgrmjmryqwngszgkivekzhodpdqh ; /usr/bin/python3
Feb 01 07:50:16 np0005604215.localdomain sudo[44250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:16 np0005604215.localdomain python3[44252]: ansible-ansible.legacy.command Invoked with _raw_params=systemd-tmpfiles --create _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:16 np0005604215.localdomain sudo[44250]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:17 np0005604215.localdomain sudo[44267]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uaoujcjnmowrdhibomheczcouqnkfwsa ; /usr/bin/python3
Feb 01 07:50:17 np0005604215.localdomain sudo[44267]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:18 np0005604215.localdomain python3[44269]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:18 np0005604215.localdomain sudo[44267]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:18 np0005604215.localdomain sudo[44315]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsqgqldwnwfnuhenhlhbjzzhtjaywvbp ; /usr/bin/python3
Feb 01 07:50:18 np0005604215.localdomain sudo[44315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:18 np0005604215.localdomain python3[44317]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/delay-nova-compute follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:18 np0005604215.localdomain sudo[44315]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:18 np0005604215.localdomain sudo[44358]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qotnqsyyvpxqfuzrlafxkvcqrtfeywpj ; /usr/bin/python3
Feb 01 07:50:18 np0005604215.localdomain sudo[44358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:18 np0005604215.localdomain python3[44360]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/nova/delay-nova-compute mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932218.2639725-76212-243758819772904/source _original_basename=tmpyftewfz3 follow=False checksum=f07ad3e8cf3766b3b3b07ae8278826a0ef3bb5e3 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:18 np0005604215.localdomain sudo[44358]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:19 np0005604215.localdomain sudo[44388]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lptfufaavdyyfeuefixeebteiqhhsogb ; /usr/bin/python3
Feb 01 07:50:19 np0005604215.localdomain sudo[44388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:19 np0005604215.localdomain python3[44390]: ansible-file Invoked with mode=0750 path=/var/log/containers/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:19 np0005604215.localdomain sudo[44388]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:19 np0005604215.localdomain sudo[44404]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igixjefahuicktkefrrhcvretbiktedo ; /usr/bin/python3
Feb 01 07:50:19 np0005604215.localdomain sudo[44404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:19 np0005604215.localdomain python3[44406]: ansible-file Invoked with path=/etc/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:19 np0005604215.localdomain sudo[44404]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:19 np0005604215.localdomain sudo[44420]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdrrbzfymmqswlesrjtullsnnmonhlyh ; /usr/bin/python3
Feb 01 07:50:19 np0005604215.localdomain sudo[44420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:20 np0005604215.localdomain python3[44422]: ansible-file Invoked with path=/etc/libvirt/secrets setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:20 np0005604215.localdomain sudo[44420]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:20 np0005604215.localdomain sudo[44436]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qojxxlapacgvvipggzxdwtbkippgzwdh ; /usr/bin/python3
Feb 01 07:50:20 np0005604215.localdomain sudo[44436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:20 np0005604215.localdomain python3[44438]: ansible-file Invoked with path=/etc/libvirt/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:20 np0005604215.localdomain sudo[44436]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:20 np0005604215.localdomain sudo[44452]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xigcifjkgulsjwvggbqzkzfylnumvtnn ; /usr/bin/python3
Feb 01 07:50:20 np0005604215.localdomain sudo[44452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:20 np0005604215.localdomain python3[44454]: ansible-file Invoked with path=/var/lib/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:20 np0005604215.localdomain sudo[44452]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:20 np0005604215.localdomain sudo[44455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:50:20 np0005604215.localdomain sudo[44455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:50:20 np0005604215.localdomain sudo[44455]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:20 np0005604215.localdomain sudo[44483]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-makrwwjfxucjaeiuptbrxajtkvohzbrq ; /usr/bin/python3
Feb 01 07:50:20 np0005604215.localdomain sudo[44483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:20 np0005604215.localdomain sudo[44484]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 07:50:20 np0005604215.localdomain sudo[44484]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:50:20 np0005604215.localdomain python3[44498]: ansible-file Invoked with path=/var/cache/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:20 np0005604215.localdomain sudo[44483]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:21 np0005604215.localdomain sudo[44514]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ravyqenbxawrlsbnvhnmyosjwdeirsyv ; /usr/bin/python3
Feb 01 07:50:21 np0005604215.localdomain sudo[44514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:21 np0005604215.localdomain sudo[44484]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:21 np0005604215.localdomain python3[44524]: ansible-file Invoked with path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:21 np0005604215.localdomain sudo[44514]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:21 np0005604215.localdomain sudo[44564]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wllloxpxrmhixfialnltpcliynozizqq ; /usr/bin/python3
Feb 01 07:50:21 np0005604215.localdomain sudo[44537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:50:21 np0005604215.localdomain sudo[44564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:21 np0005604215.localdomain sudo[44537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:50:21 np0005604215.localdomain sudo[44537]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:21 np0005604215.localdomain sudo[44568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:50:21 np0005604215.localdomain sudo[44568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:50:21 np0005604215.localdomain python3[44566]: ansible-file Invoked with path=/run/libvirt state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:21 np0005604215.localdomain sudo[44564]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:21 np0005604215.localdomain sudo[44596]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvgwrjhwlqjdgdlzeffzpgkoqeaefyid ; /usr/bin/python3
Feb 01 07:50:21 np0005604215.localdomain sudo[44596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:21 np0005604215.localdomain python3[44598]: ansible-file Invoked with mode=0770 path=/var/log/containers/libvirt/swtpm setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:21 np0005604215.localdomain sudo[44596]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:22 np0005604215.localdomain sudo[44568]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:22 np0005604215.localdomain sudo[44644]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpudiessijmlpzkizbukwcimqvgzhpeq ; /usr/bin/python3
Feb 01 07:50:22 np0005604215.localdomain sudo[44644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:22 np0005604215.localdomain python3[44646]: ansible-group Invoked with gid=107 name=qemu state=present system=False local=False non_unique=False
Feb 01 07:50:22 np0005604215.localdomain groupadd[44647]: group added to /etc/group: name=qemu, GID=107
Feb 01 07:50:22 np0005604215.localdomain groupadd[44647]: group added to /etc/gshadow: name=qemu
Feb 01 07:50:22 np0005604215.localdomain groupadd[44647]: new group: name=qemu, GID=107
Feb 01 07:50:22 np0005604215.localdomain sudo[44644]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:22 np0005604215.localdomain sudo[44653]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:50:22 np0005604215.localdomain sudo[44653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:50:22 np0005604215.localdomain sudo[44653]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:22 np0005604215.localdomain sudo[44681]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rjvweilnpsnywzejpvbylufyhmvkzpaw ; /usr/bin/python3
Feb 01 07:50:22 np0005604215.localdomain sudo[44681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:23 np0005604215.localdomain python3[44683]: ansible-user Invoked with comment=qemu user group=qemu name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604215.localdomain update_password=always groups=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Feb 01 07:50:23 np0005604215.localdomain useradd[44685]: new user: name=qemu, UID=107, GID=107, home=/home/qemu, shell=/sbin/nologin, from=none
Feb 01 07:50:23 np0005604215.localdomain sudo[44681]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:23 np0005604215.localdomain sudo[44705]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cntrpbqkscakckdhnotzhsdropxxclzl ; /usr/bin/python3
Feb 01 07:50:23 np0005604215.localdomain sudo[44705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:23 np0005604215.localdomain python3[44707]: ansible-file Invoked with group=qemu owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None serole=None selevel=None attributes=None
Feb 01 07:50:23 np0005604215.localdomain sudo[44705]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:23 np0005604215.localdomain sudo[44721]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnwqbfltesdbluqxhsbsqcffkjfemsec ; /usr/bin/python3
Feb 01 07:50:23 np0005604215.localdomain sudo[44721]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:23 np0005604215.localdomain python3[44723]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/rpm -q libvirt-daemon _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:23 np0005604215.localdomain sudo[44721]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:24 np0005604215.localdomain sudo[44770]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewahlvpbyxmuxqiskeoyjygfcddkrxxr ; /usr/bin/python3
Feb 01 07:50:24 np0005604215.localdomain sudo[44770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:24 np0005604215.localdomain python3[44772]: ansible-ansible.legacy.stat Invoked with path=/etc/tmpfiles.d/run-libvirt.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:24 np0005604215.localdomain sudo[44770]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:24 np0005604215.localdomain sudo[44813]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhieyzdedpalpxglvixsbqpobdeuebiy ; /usr/bin/python3
Feb 01 07:50:24 np0005604215.localdomain sudo[44813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:24 np0005604215.localdomain python3[44815]: ansible-ansible.legacy.copy Invoked with dest=/etc/tmpfiles.d/run-libvirt.conf src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932223.9916744-76716-5648114303049/source _original_basename=tmpjcbnvtmg follow=False checksum=57f3ff94c666c6aae69ae22e23feb750cf9e8b13 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:24 np0005604215.localdomain sudo[44813]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:24 np0005604215.localdomain sudo[44843]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjuyzernhkzerlbzwznuenrxwqirgnqx ; /usr/bin/python3
Feb 01 07:50:24 np0005604215.localdomain sudo[44843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:25 np0005604215.localdomain python3[44845]: ansible-seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 01 07:50:25 np0005604215.localdomain sudo[44843]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:25 np0005604215.localdomain sudo[44863]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptkcuhfyqqznlvmcirtlavnzewfnqgfd ; /usr/bin/python3
Feb 01 07:50:25 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Feb 01 07:50:25 np0005604215.localdomain sudo[44863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:26 np0005604215.localdomain python3[44865]: ansible-file Invoked with path=/etc/crypto-policies/local.d/gnutls-qemu.config state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:26 np0005604215.localdomain sudo[44863]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:26 np0005604215.localdomain sudo[44879]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kyvuwddtsadqzfwqexhjcbjaxnlsdlja ; /usr/bin/python3
Feb 01 07:50:26 np0005604215.localdomain sudo[44879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:26 np0005604215.localdomain python3[44881]: ansible-file Invoked with path=/run/libvirt setype=virt_var_run_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:26 np0005604215.localdomain sudo[44879]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:26 np0005604215.localdomain sudo[44895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsraodwryuseypbsbrwgebduqnfqlqki ; /usr/bin/python3
Feb 01 07:50:26 np0005604215.localdomain sudo[44895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:26 np0005604215.localdomain python3[44897]: ansible-seboolean Invoked with name=logrotate_read_inside_containers persistent=True state=True ignore_selinux_state=False
Feb 01 07:50:27 np0005604215.localdomain sudo[44895]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:28 np0005604215.localdomain sudo[44915]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcfstlorzkqjmukjjogynpoinzzppqri ; /usr/bin/python3
Feb 01 07:50:28 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Feb 01 07:50:28 np0005604215.localdomain sudo[44915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:28 np0005604215.localdomain python3[44917]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:50:30 np0005604215.localdomain sudo[44915]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:31 np0005604215.localdomain sudo[44932]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-skssojcchhndgxaytzpolscguhmllgsu ; /usr/bin/python3
Feb 01 07:50:31 np0005604215.localdomain sudo[44932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:31 np0005604215.localdomain python3[44934]: ansible-setup Invoked with gather_subset=['!all', '!min', 'network'] filter=['ansible_interfaces'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 07:50:31 np0005604215.localdomain sudo[44932]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:31 np0005604215.localdomain sudo[44993]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqojrlaagdqzuocinalnmkjsjiwgsefl ; /usr/bin/python3
Feb 01 07:50:31 np0005604215.localdomain sudo[44993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:31 np0005604215.localdomain python3[44995]: ansible-file Invoked with path=/etc/containers/networks state=directory recurse=True mode=493 owner=root group=root force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:31 np0005604215.localdomain sudo[44993]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:32 np0005604215.localdomain sudo[45009]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mngvjldznbleensrymiahsnrsviwovso ; /usr/bin/python3
Feb 01 07:50:32 np0005604215.localdomain sudo[45009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:32 np0005604215.localdomain python3[45011]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:32 np0005604215.localdomain sudo[45009]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:32 np0005604215.localdomain sudo[45070]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uofuumwvruskvfjmeuvtzimepbnqqtpu ; /usr/bin/python3
Feb 01 07:50:32 np0005604215.localdomain sudo[45070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:32 np0005604215.localdomain python3[45072]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:32 np0005604215.localdomain sudo[45070]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:33 np0005604215.localdomain sudo[45113]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcwtqqvnxaarsjlalxfwdcodwhznzaaw ; /usr/bin/python3
Feb 01 07:50:33 np0005604215.localdomain sudo[45113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:33 np0005604215.localdomain python3[45115]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932232.4804363-77041-35579151803985/source dest=/etc/containers/networks/podman.json mode=0644 owner=root group=root follow=False _original_basename=podman_network_config.j2 checksum=4a4ec5a7bbea6767597329319374590966ea2f65 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:33 np0005604215.localdomain sudo[45113]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:33 np0005604215.localdomain sudo[45175]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okvalcakjrquurenmvgxoleeuwmpntqe ; /usr/bin/python3
Feb 01 07:50:33 np0005604215.localdomain sudo[45175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:33 np0005604215.localdomain python3[45177]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:33 np0005604215.localdomain sudo[45175]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:34 np0005604215.localdomain sudo[45220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opdvbizxdgbhfmzrwwiqgelmtxxnleco ; /usr/bin/python3
Feb 01 07:50:34 np0005604215.localdomain sudo[45220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:34 np0005604215.localdomain python3[45222]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932233.515498-77132-212304177395739/source dest=/etc/containers/registries.conf owner=root group=root setype=etc_t mode=0644 follow=False _original_basename=registries.conf.j2 checksum=710a00cfb11a4c3eba9c028ef1984a9fea9ba83a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:34 np0005604215.localdomain sudo[45220]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:34 np0005604215.localdomain sudo[45250]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tktcyeelhyumucrmgudddjqmwrrahfoi ; /usr/bin/python3
Feb 01 07:50:34 np0005604215.localdomain sudo[45250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:34 np0005604215.localdomain python3[45252]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=containers option=pids_limit value=4096 backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:34 np0005604215.localdomain sudo[45250]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:34 np0005604215.localdomain sudo[45266]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvezfwljpvhudhudfdpmzfihzzbzktxd ; /usr/bin/python3
Feb 01 07:50:34 np0005604215.localdomain sudo[45266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:34 np0005604215.localdomain python3[45268]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=events_logger value="journald" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:34 np0005604215.localdomain sudo[45266]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:35 np0005604215.localdomain sudo[45282]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwecaxtxoqjpobbwbzqstwlmsyrfgiui ; /usr/bin/python3
Feb 01 07:50:35 np0005604215.localdomain sudo[45282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:35 np0005604215.localdomain python3[45284]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=engine option=runtime value="crun" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:35 np0005604215.localdomain sudo[45282]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:35 np0005604215.localdomain sudo[45298]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-heaiowlzhrczziwbslsvluembxxsxhbw ; /usr/bin/python3
Feb 01 07:50:35 np0005604215.localdomain sudo[45298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:35 np0005604215.localdomain python3[45300]: ansible-ini_file Invoked with path=/etc/containers/containers.conf owner=root group=root setype=etc_t mode=0644 create=True section=network option=network_backend value="netavark" backup=False state=present exclusive=True no_extra_spaces=False allow_no_value=False unsafe_writes=False values=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:35 np0005604215.localdomain sudo[45298]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:36 np0005604215.localdomain sudo[45346]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nidrpxcvbwiybvfnejonjztolnvbqizv ; /usr/bin/python3
Feb 01 07:50:36 np0005604215.localdomain sudo[45346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:36 np0005604215.localdomain python3[45348]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:36 np0005604215.localdomain sudo[45346]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:36 np0005604215.localdomain sudo[45389]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mquasatforvcqzqjvydzmqltrahlyoej ; /usr/bin/python3
Feb 01 07:50:36 np0005604215.localdomain sudo[45389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:36 np0005604215.localdomain python3[45391]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932235.9042277-77250-181579216637399/source _original_basename=tmposo12t2p follow=False checksum=0bfbc70e9a4740c9004b9947da681f723d529c83 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:36 np0005604215.localdomain sudo[45389]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:36 np0005604215.localdomain sudo[45419]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqykacutjpgijgnpikaclnlizzpwvvrl ; /usr/bin/python3
Feb 01 07:50:36 np0005604215.localdomain sudo[45419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:36 np0005604215.localdomain python3[45421]: ansible-file Invoked with mode=0750 path=/var/log/containers/rsyslog setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:36 np0005604215.localdomain sudo[45419]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:37 np0005604215.localdomain sudo[45435]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjwmuydyhhdepyvttbsupnqidnuhmbjs ; /usr/bin/python3
Feb 01 07:50:37 np0005604215.localdomain sudo[45435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:37 np0005604215.localdomain python3[45437]: ansible-file Invoked with path=/var/lib/rsyslog.container setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:37 np0005604215.localdomain sudo[45435]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:37 np0005604215.localdomain sudo[45451]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vispeqxqpidytvablouwvdqlfsgaanbr ; /usr/bin/python3
Feb 01 07:50:37 np0005604215.localdomain sudo[45451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:38 np0005604215.localdomain python3[45453]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:50:40 np0005604215.localdomain sudo[45451]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:41 np0005604215.localdomain sudo[45500]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qitvrhhzhhbwgpzhbwevtbngzrfumxit ; /usr/bin/python3
Feb 01 07:50:41 np0005604215.localdomain sudo[45500]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:41 np0005604215.localdomain python3[45502]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:41 np0005604215.localdomain sudo[45500]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:41 np0005604215.localdomain sudo[45545]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qojcasqkmrdjwvefugmnsurgjnhryatj ; /usr/bin/python3
Feb 01 07:50:41 np0005604215.localdomain sudo[45545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:41 np0005604215.localdomain python3[45547]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932241.1113806-77545-204720109673888/source validate=/usr/sbin/sshd -T -f %s mode=None follow=False _original_basename=sshd_config_block.j2 checksum=913c99ed7d5c33615bfb07a6792a4ef143dcfd2b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:41 np0005604215.localdomain sudo[45545]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:42 np0005604215.localdomain sudo[45576]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zophvllshplpzdzgpsediyecyqrnbdmu ; /usr/bin/python3
Feb 01 07:50:42 np0005604215.localdomain sudo[45576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:42 np0005604215.localdomain python3[45578]: ansible-systemd Invoked with name=sshd state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:50:42 np0005604215.localdomain sshd[1134]: Received signal 15; terminating.
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: sshd.service: Consumed 2.936s CPU time, read 1.9M from disk, written 72.0K to disk.
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 01 07:50:42 np0005604215.localdomain sshd[45582]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 07:50:42 np0005604215.localdomain sshd[45582]: Server listening on 0.0.0.0 port 22.
Feb 01 07:50:42 np0005604215.localdomain sshd[45582]: Server listening on :: port 22.
Feb 01 07:50:42 np0005604215.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 01 07:50:42 np0005604215.localdomain sudo[45576]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:42 np0005604215.localdomain sudo[45596]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eswsutqnxkwzojtdhxroqesuqesgtqlb ; /usr/bin/python3
Feb 01 07:50:42 np0005604215.localdomain sudo[45596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:42 np0005604215.localdomain python3[45598]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:42 np0005604215.localdomain sudo[45596]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:43 np0005604215.localdomain sudo[45614]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqkuntaqcujzkxaznhtkgadurmhebsbu ; /usr/bin/python3
Feb 01 07:50:43 np0005604215.localdomain sudo[45614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:43 np0005604215.localdomain python3[45616]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ntpd.service || systemctl is-enabled ntpd.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:43 np0005604215.localdomain sudo[45614]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:44 np0005604215.localdomain sudo[45632]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rillvvwydnqfxkjhwovhuuccqrxfxjrm ; /usr/bin/python3
Feb 01 07:50:44 np0005604215.localdomain sudo[45632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:44 np0005604215.localdomain python3[45634]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:50:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 07:50:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3396 writes, 16K keys, 3396 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.03 MB/s
                                                          Cumulative WAL: 3396 writes, 201 syncs, 16.90 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3396 writes, 16K keys, 3396 commit groups, 1.0 writes per commit group, ingest: 15.30 MB, 0.03 MB/s
                                                          Interval WAL: 3396 writes, 201 syncs, 16.90 writes per sync, written: 0.01 GB, 0.03 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 7e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 01 07:50:46 np0005604215.localdomain sudo[45632]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:47 np0005604215.localdomain sudo[45681]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxgmypejnevniwtrejinjhcxujfrgkzz ; /usr/bin/python3
Feb 01 07:50:47 np0005604215.localdomain sudo[45681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:47 np0005604215.localdomain python3[45683]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:47 np0005604215.localdomain sudo[45681]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:47 np0005604215.localdomain sudo[45699]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvxhksrzwsxrzxkphqwlxorhymzwkyut ; /usr/bin/python3
Feb 01 07:50:47 np0005604215.localdomain sudo[45699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:47 np0005604215.localdomain python3[45701]: ansible-ansible.legacy.file Invoked with owner=root group=root mode=420 dest=/etc/chrony.conf _original_basename=chrony.conf.j2 recurse=False state=file path=/etc/chrony.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:47 np0005604215.localdomain sudo[45699]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:48 np0005604215.localdomain sudo[45729]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qywojdwchquilartfrrhauvjjsazewkw ; /usr/bin/python3
Feb 01 07:50:48 np0005604215.localdomain sudo[45729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:48 np0005604215.localdomain python3[45731]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:50:49 np0005604215.localdomain sudo[45729]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 07:50:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Cumulative writes: 3247 writes, 16K keys, 3247 commit groups, 1.0 writes per commit group, ingest: 0.01 GB, 0.02 MB/s
                                                          Cumulative WAL: 3247 writes, 139 syncs, 23.36 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 3247 writes, 16K keys, 3247 commit groups, 1.0 writes per commit group, ingest: 14.62 MB, 0.02 MB/s
                                                          Interval WAL: 3247 writes, 139 syncs, 23.36 writes per sync, written: 0.01 GB, 0.02 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 2 last_secs: 8e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 600.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 8 last_secs: 3.9e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 01 07:50:50 np0005604215.localdomain sudo[45779]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijzljjdkhmxmrorajmeqgdzfnntddynw ; /usr/bin/python3
Feb 01 07:50:50 np0005604215.localdomain sudo[45779]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:50 np0005604215.localdomain python3[45781]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/chrony-online.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:50 np0005604215.localdomain sudo[45779]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:50 np0005604215.localdomain sudo[45797]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvhyppdurzqqsuypzgzysggyrelisuru ; /usr/bin/python3
Feb 01 07:50:50 np0005604215.localdomain sudo[45797]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:50 np0005604215.localdomain python3[45799]: ansible-ansible.legacy.file Invoked with dest=/etc/systemd/system/chrony-online.service _original_basename=chrony-online.service recurse=False state=file path=/etc/systemd/system/chrony-online.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:50 np0005604215.localdomain sudo[45797]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:51 np0005604215.localdomain sudo[45827]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llrzbaxxcxijcwybuaxpxklayxaezgfw ; /usr/bin/python3
Feb 01 07:50:51 np0005604215.localdomain sudo[45827]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:51 np0005604215.localdomain python3[45829]: ansible-systemd Invoked with state=started name=chrony-online.service enabled=True daemon-reload=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:50:51 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:50:51 np0005604215.localdomain systemd-rc-local-generator[45853]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:50:51 np0005604215.localdomain systemd-sysv-generator[45858]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:50:51 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:50:51 np0005604215.localdomain systemd[1]: Starting chronyd online sources service...
Feb 01 07:50:51 np0005604215.localdomain chronyc[45870]: 200 OK
Feb 01 07:50:51 np0005604215.localdomain systemd[1]: chrony-online.service: Deactivated successfully.
Feb 01 07:50:51 np0005604215.localdomain systemd[1]: Finished chronyd online sources service.
Feb 01 07:50:51 np0005604215.localdomain sudo[45827]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:52 np0005604215.localdomain sudo[45884]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-excozvhzmmyrxkrjcehkttbovlzctqqs ; /usr/bin/python3
Feb 01 07:50:52 np0005604215.localdomain sudo[45884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:52 np0005604215.localdomain python3[45887]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:52 np0005604215.localdomain chronyd[25933]: System clock was stepped by 0.000039 seconds
Feb 01 07:50:52 np0005604215.localdomain sudo[45884]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:52 np0005604215.localdomain sudo[45902]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-psdfvgatskitmmnnccxoqbxbdqyrbdha ; /usr/bin/python3
Feb 01 07:50:52 np0005604215.localdomain sudo[45902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:52 np0005604215.localdomain python3[45904]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:52 np0005604215.localdomain sudo[45902]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:52 np0005604215.localdomain sudo[45919]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dttfesplxnjmonvgwdtmdqdvbkxmjigf ; /usr/bin/python3
Feb 01 07:50:52 np0005604215.localdomain sudo[45919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:53 np0005604215.localdomain python3[45921]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc makestep _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:53 np0005604215.localdomain chronyd[25933]: System clock was stepped by 0.000000 seconds
Feb 01 07:50:53 np0005604215.localdomain sudo[45919]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:53 np0005604215.localdomain sudo[45936]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-btdyzvplixynpznkofmamvlhlqrwjqmg ; /usr/bin/python3
Feb 01 07:50:53 np0005604215.localdomain sudo[45936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:53 np0005604215.localdomain python3[45938]: ansible-ansible.legacy.command Invoked with _raw_params=chronyc waitsync 30 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:53 np0005604215.localdomain sudo[45936]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:53 np0005604215.localdomain sudo[46039]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwbwtsqbjpbczyylfevtkwbjisvhgdyi ; /usr/bin/python3
Feb 01 07:50:53 np0005604215.localdomain sudo[46039]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:54 np0005604215.localdomain python3[46067]: ansible-timezone Invoked with name=UTC hwclock=None
Feb 01 07:50:54 np0005604215.localdomain systemd[1]: Starting Time & Date Service...
Feb 01 07:50:54 np0005604215.localdomain systemd[1]: Started Time & Date Service.
Feb 01 07:50:54 np0005604215.localdomain sudo[46039]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:54 np0005604215.localdomain sudo[46148]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncugrejipjarwmymqokfthkdezbpnsuz ; /usr/bin/python3
Feb 01 07:50:54 np0005604215.localdomain sudo[46148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:55 np0005604215.localdomain python3[46150]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q --whatprovides tuned tuned-profiles-cpu-partitioning _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:55 np0005604215.localdomain sudo[46148]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:55 np0005604215.localdomain sudo[46165]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngkylgmatomwyhbobfrgdotykwjvjsrz ; PATH=/bin:/usr/bin:/sbin:/usr/sbin /usr/bin/python3
Feb 01 07:50:55 np0005604215.localdomain sudo[46165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:55 np0005604215.localdomain rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 07:50:55 np0005604215.localdomain python3[46167]: ansible-ansible.legacy.command Invoked with _raw_params=which tuned-adm _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:50:55 np0005604215.localdomain sudo[46165]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:56 np0005604215.localdomain sudo[46183]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buixyptndpqpbgftjtsxfygglugebron ; /usr/bin/python3
Feb 01 07:50:56 np0005604215.localdomain sudo[46183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:56 np0005604215.localdomain python3[46185]: ansible-slurp Invoked with src=/etc/tuned/active_profile
Feb 01 07:50:56 np0005604215.localdomain sudo[46183]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:56 np0005604215.localdomain sudo[46199]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cytkesodkfnffosrklqmpkecxvkpbpnw ; /usr/bin/python3
Feb 01 07:50:56 np0005604215.localdomain sudo[46199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:56 np0005604215.localdomain python3[46201]: ansible-stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:50:56 np0005604215.localdomain sudo[46199]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:56 np0005604215.localdomain sudo[46215]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjurnyjobfmpvjtxqlhimzyskxdhzrgu ; /usr/bin/python3
Feb 01 07:50:56 np0005604215.localdomain sudo[46215]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:57 np0005604215.localdomain python3[46217]: ansible-file Invoked with mode=0750 path=/var/log/containers/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:57 np0005604215.localdomain sudo[46215]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:57 np0005604215.localdomain sudo[46231]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hasrcpahuxgfzrcoredvifbjpyxjwmob ; /usr/bin/python3
Feb 01 07:50:57 np0005604215.localdomain sudo[46231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:57 np0005604215.localdomain python3[46233]: ansible-file Invoked with path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:50:57 np0005604215.localdomain sudo[46231]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:57 np0005604215.localdomain sudo[46279]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lurfuntkzjbxespalelrebibfmprpkgb ; /usr/bin/python3
Feb 01 07:50:57 np0005604215.localdomain sudo[46279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:57 np0005604215.localdomain python3[46281]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/neutron-cleanup follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:57 np0005604215.localdomain sudo[46279]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:58 np0005604215.localdomain sudo[46322]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjcrwhtxygafzkazvpexzlfnsmdnyuqa ; /usr/bin/python3
Feb 01 07:50:58 np0005604215.localdomain sudo[46322]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:58 np0005604215.localdomain python3[46324]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/neutron-cleanup force=True mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932257.6321523-78542-211405516631821/source _original_basename=tmp6318g3kl follow=False checksum=f9cc7d1e91fbae49caa7e35eb2253bba146a73b4 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:58 np0005604215.localdomain sudo[46322]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:58 np0005604215.localdomain sudo[46384]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iquxzsxluyznqqhdwcqnackheobttkgy ; /usr/bin/python3
Feb 01 07:50:58 np0005604215.localdomain sudo[46384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:58 np0005604215.localdomain python3[46386]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/neutron-cleanup.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:50:58 np0005604215.localdomain sudo[46384]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:59 np0005604215.localdomain sudo[46427]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlfbzxdszfaqvrckziyokfxnhjmltijw ; /usr/bin/python3
Feb 01 07:50:59 np0005604215.localdomain sudo[46427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:59 np0005604215.localdomain python3[46429]: ansible-ansible.legacy.copy Invoked with dest=/usr/lib/systemd/system/neutron-cleanup.service force=True src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932258.5061057-78600-136063128894773/source _original_basename=tmplligh94w follow=False checksum=6b6cd9f074903a28d054eb530a10c7235d0c39fc backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:50:59 np0005604215.localdomain sudo[46427]: pam_unix(sudo:session): session closed for user root
Feb 01 07:50:59 np0005604215.localdomain sudo[46457]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlrwvpmtkivxbrioepcopciiwlpnwyzg ; /usr/bin/python3
Feb 01 07:50:59 np0005604215.localdomain sudo[46457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:50:59 np0005604215.localdomain python3[46459]: ansible-ansible.legacy.systemd Invoked with enabled=True name=neutron-cleanup daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 01 07:50:59 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:50:59 np0005604215.localdomain systemd-rc-local-generator[46490]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:50:59 np0005604215.localdomain systemd-sysv-generator[46493]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:50:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:51:00 np0005604215.localdomain sudo[46457]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:00 np0005604215.localdomain sudo[46510]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbfrvqpgmmlngyolcqihksbadjksqqoq ; /usr/bin/python3
Feb 01 07:51:00 np0005604215.localdomain sudo[46510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:00 np0005604215.localdomain python3[46512]: ansible-file Invoked with mode=0750 path=/var/log/containers/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:51:00 np0005604215.localdomain sudo[46510]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:00 np0005604215.localdomain sudo[46526]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxhmjzfnlhtuxizzqtrlazvravbzumdn ; /usr/bin/python3
Feb 01 07:51:00 np0005604215.localdomain sudo[46526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:00 np0005604215.localdomain python3[46528]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns add ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:51:00 np0005604215.localdomain sudo[46526]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:00 np0005604215.localdomain sudo[46543]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-deekdfeaawrsfyjlqixoboauvsswglcg ; /usr/bin/python3
Feb 01 07:51:00 np0005604215.localdomain sudo[46543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:01 np0005604215.localdomain python3[46545]: ansible-ansible.legacy.command Invoked with _raw_params=ip netns delete ns_temp _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:51:01 np0005604215.localdomain systemd[1]: run-netns-ns_temp.mount: Deactivated successfully.
Feb 01 07:51:01 np0005604215.localdomain sudo[46543]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:01 np0005604215.localdomain sudo[46560]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvtsabncdbtlfzeipxerktjuoqwuamge ; /usr/bin/python3
Feb 01 07:51:01 np0005604215.localdomain sudo[46560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:01 np0005604215.localdomain python3[46562]: ansible-file Invoked with path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:51:01 np0005604215.localdomain sudo[46560]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:01 np0005604215.localdomain sudo[46576]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jefobzfzujlgnaycwfqlxvcjaksokkme ; /usr/bin/python3
Feb 01 07:51:01 np0005604215.localdomain sudo[46576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:01 np0005604215.localdomain python3[46578]: ansible-file Invoked with path=/var/lib/neutron/kill_scripts state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:51:01 np0005604215.localdomain sudo[46576]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:01 np0005604215.localdomain sudo[46624]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcjfunwstsllapvazeyyngijinyarhnq ; /usr/bin/python3
Feb 01 07:51:02 np0005604215.localdomain sudo[46624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:02 np0005604215.localdomain python3[46626]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:51:02 np0005604215.localdomain sudo[46624]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:02 np0005604215.localdomain sudo[46667]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpiqqdowkabaikluhvbyxmwtshxdqujz ; /usr/bin/python3
Feb 01 07:51:02 np0005604215.localdomain sudo[46667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:02 np0005604215.localdomain python3[46669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=493 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932261.8845768-78805-31797608341399/source _original_basename=tmpnns37bp3 follow=False checksum=2f369fbe8f83639cdfd4efc53e7feb4ee77d1ed7 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:51:02 np0005604215.localdomain sudo[46667]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:22 np0005604215.localdomain sudo[46685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:51:22 np0005604215.localdomain sudo[46685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:51:22 np0005604215.localdomain sudo[46685]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:22 np0005604215.localdomain sudo[46700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:51:22 np0005604215.localdomain sudo[46700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:51:23 np0005604215.localdomain sudo[46700]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:24 np0005604215.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 01 07:51:25 np0005604215.localdomain sudo[46750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:51:25 np0005604215.localdomain sudo[46750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:51:25 np0005604215.localdomain sudo[46750]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:27 np0005604215.localdomain sudo[46778]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-semkygtvtuoebkrkhsydvksfsovidfva ; /usr/bin/python3
Feb 01 07:51:27 np0005604215.localdomain sudo[46778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:28 np0005604215.localdomain python3[46780]: ansible-file Invoked with path=/var/log/containers state=directory setype=container_file_t selevel=s0 mode=488 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 07:51:28 np0005604215.localdomain sudo[46778]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:28 np0005604215.localdomain sudo[46794]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krmhloseqxjlkzhzlkyqkrnvqjrbeoxf ; /usr/bin/python3
Feb 01 07:51:28 np0005604215.localdomain sudo[46794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:28 np0005604215.localdomain python3[46796]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None setype=None attributes=None
Feb 01 07:51:28 np0005604215.localdomain sudo[46794]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:28 np0005604215.localdomain sudo[46810]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-becfonjbcohsecbqvnzthvvtvrtpakhj ; /usr/bin/python3
Feb 01 07:51:28 np0005604215.localdomain sudo[46810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:28 np0005604215.localdomain python3[46812]: ansible-file Invoked with path=/var/lib/tripleo-config state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 07:51:28 np0005604215.localdomain sudo[46810]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:28 np0005604215.localdomain sudo[46826]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxycgjczvyynsjcwgujghaaadfrsebog ; /usr/bin/python3
Feb 01 07:51:29 np0005604215.localdomain sudo[46826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:29 np0005604215.localdomain python3[46828]: ansible-file Invoked with path=/var/lib/container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:51:29 np0005604215.localdomain sudo[46826]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:29 np0005604215.localdomain sudo[46842]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-euiwbrbkuyjztcntndkyiahjobdtbvuy ; /usr/bin/python3
Feb 01 07:51:29 np0005604215.localdomain sudo[46842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:29 np0005604215.localdomain python3[46844]: ansible-file Invoked with path=/var/lib/docker-container-startup-configs.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:51:29 np0005604215.localdomain sudo[46842]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:29 np0005604215.localdomain sudo[46858]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwdemolufgfmhezlzchgmmnmovvmkyhy ; /usr/bin/python3
Feb 01 07:51:29 np0005604215.localdomain sudo[46858]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:29 np0005604215.localdomain python3[46860]: ansible-community.general.sefcontext Invoked with target=/var/lib/container-config-scripts(/.*)? setype=container_file_t state=present ignore_selinux_state=False ftype=a reload=True seuser=None selevel=None
Feb 01 07:51:30 np0005604215.localdomain kernel: SELinux:  Converting 2705 SID table entries...
Feb 01 07:51:30 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 07:51:30 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 07:51:30 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 07:51:30 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 07:51:30 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 07:51:30 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 07:51:30 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 07:51:31 np0005604215.localdomain sudo[46858]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:31 np0005604215.localdomain sudo[46879]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eiemiylmqjqzjbpkpymkesbavchjothe ; /usr/bin/python3
Feb 01 07:51:31 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Feb 01 07:51:31 np0005604215.localdomain sudo[46879]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:31 np0005604215.localdomain python3[46881]: ansible-file Invoked with path=/var/lib/container-config-scripts state=directory setype=container_file_t recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:51:31 np0005604215.localdomain sudo[46879]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:31 np0005604215.localdomain sudo[46895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csncizpbpxkufrkxnozgzlnwoqpzlbay ; /usr/bin/python3
Feb 01 07:51:31 np0005604215.localdomain sudo[46895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:31 np0005604215.localdomain sudo[46895]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:31 np0005604215.localdomain sudo[46943]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzvqdcbtagganbdmiexypakotaratyqw ; /usr/bin/python3
Feb 01 07:51:31 np0005604215.localdomain sudo[46943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:32 np0005604215.localdomain sudo[46943]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:32 np0005604215.localdomain sudo[46986]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsztbqrlyvpmjpznxpksgodbforpfvxo ; /usr/bin/python3
Feb 01 07:51:32 np0005604215.localdomain sudo[46986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:32 np0005604215.localdomain sudo[46986]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:32 np0005604215.localdomain sudo[47016]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhohbmuqscuashaextjokwektiiefgmr ; /usr/bin/python3
Feb 01 07:51:32 np0005604215.localdomain sudo[47016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:33 np0005604215.localdomain python3[47018]: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-config/container-startup-config config_data={'step_1': {'metrics_qdr': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 'metrics_qdr_init_logs': {'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}}, 'step_2': {'create_haproxy_wrapper': {'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, 'create_virtlogd_wrapper': {'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, 'nova_compute_init_log': {'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 
'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, 'nova_virtqemud_init_logs': {'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}}, 'step_3': {'ceilometer_init_log': {'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'collectd': {'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 'iscsid': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 'nova_statedir_owner': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, 'nova_virtlogd_wrapper': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, 'nova_virtnodedevd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 
2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtproxyd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtqemud': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, 'nova_virtsecretd': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, 'nova_virtstoraged': {'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, 'rsyslog': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}}, 'step_4': {'ceilometer_agent_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'ceilometer_agent_ipmi': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 'configure_cms_options': {'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, 'logrotate_crond': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 'nova_libvirt_init_secret': {'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, 'nova_migration_target': {'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 'ovn_controller': 
{'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 'ovn_metadata_agent': {'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 'setup_ovs_manager': {'command': 
['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}}, 'step_5': {'nova_compute': {'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 'nova_wait_for_compute_service': {'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}}}
Feb 01 07:51:33 np0005604215.localdomain sudo[47016]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:33 np0005604215.localdomain rsyslogd[760]: message too long (31243) with configured size 8096, begin of message is: ansible-container_startup_config Invoked with config_base_dir=/var/lib/tripleo-c [v8.2102.0-111.el9 try https://www.rsyslog.com/e/2445 ]
Feb 01 07:51:33 np0005604215.localdomain sudo[47032]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngtqmaqvqzscxlvzjmxwsyxkysabsysv ; /usr/bin/python3
Feb 01 07:51:33 np0005604215.localdomain sudo[47032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:33 np0005604215.localdomain python3[47034]: ansible-file Invoked with path=/var/lib/kolla/config_files state=directory setype=container_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 07:51:33 np0005604215.localdomain sudo[47032]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:33 np0005604215.localdomain sudo[47048]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gqklggqqreltdsiujxgwwfpkduweksrj ; /usr/bin/python3
Feb 01 07:51:33 np0005604215.localdomain sudo[47048]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:34 np0005604215.localdomain python3[47050]: ansible-file Invoked with path=/var/lib/config-data mode=493 state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 07:51:34 np0005604215.localdomain sudo[47048]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:34 np0005604215.localdomain sudo[47064]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxyzzmuzdftfyycayeleadcgpwqxmuup ; /usr/bin/python3
Feb 01 07:51:34 np0005604215.localdomain sudo[47064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:34 np0005604215.localdomain python3[47066]: ansible-tripleo_container_configs Invoked with config_data={'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces ipmi --logfile /var/log/ceilometer/ipmi.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/ceilometer_agent_compute.json': {'command': '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /var/log/ceilometer/compute.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/collectd.json': {'command': '/usr/sbin/collectd -f', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/', 'merge': False, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/etc/collectd.d'}], 'permissions': [{'owner': 'collectd:collectd', 'path': '/var/log/collectd', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/scripts', 'recurse': True}, {'owner': 'collectd:collectd', 'path': '/config-scripts', 'recurse': True}]}, '/var/lib/kolla/config_files/iscsid.json': {'command': '/usr/sbin/iscsid -f', 'config_files': [{'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/'}]}, '/var/lib/kolla/config_files/logrotate-crond.json': {'command': '/usr/sbin/crond -s -n', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}]}, '/var/lib/kolla/config_files/metrics_qdr.json': {'command': '/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/', 'merge': True, 'optional': 
True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-tls/*'}], 'permissions': [{'owner': 'qdrouterd:qdrouterd', 'path': '/var/lib/qdrouterd', 'recurse': True}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/certs/metrics_qdr.crt'}, {'optional': True, 'owner': 'qdrouterd:qdrouterd', 'path': '/etc/pki/tls/private/metrics_qdr.key'}]}, '/var/lib/kolla/config_files/nova-migration-target.json': {'command': 'dumb-init --single-child -- /usr/sbin/sshd -D -p 2022', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ssh/', 'owner': 'root', 'perm': '0600', 'source': '/host-ssh/ssh_host_*_key'}]}, '/var/lib/kolla/config_files/nova_compute.json': {'command': '/var/lib/nova/delay-nova-compute --delay 180 --nova-binary /usr/bin/nova-compute ', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/iscsi/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-iscsid/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}, {'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json': {'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_wait_for_compute_service.py', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'nova:nova', 'path': '/var/log/nova', 'recurse': True}]}, '/var/lib/kolla/config_files/nova_virtlogd.json': {'command': '/usr/local/bin/virtlogd_wrapper', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': 
'/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtnodedevd.json': {'command': '/usr/sbin/virtnodedevd --config /etc/libvirt/virtnodedevd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtproxyd.json': {'command': '/usr/sbin/virtproxyd --config /etc/libvirt/virtproxyd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtqemud.json': {'command': '/usr/sbin/virtqemud --config /etc/libvirt/virtqemud.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtsecretd.json': {'command': '/usr/sbin/virtsecretd --config /etc/libvirt/virtsecretd.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 
'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/nova_virtstoraged.json': {'command': '/usr/sbin/virtstoraged --config /etc/libvirt/virtstoraged.conf', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}, {'dest': '/etc/ceph/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src-ceph/'}], 'permissions': [{'owner': 'nova:nova', 'path': '/etc/ceph/ceph.client.openstack.keyring', 'perm': '0600'}]}, '/var/lib/kolla/config_files/ovn_controller.json': {'command': '/usr/bin/ovn-controller --pidfile --log-file unix:/run/openvswitch/db.sock ', 'permissions': [{'owner': 'root:root', 'path': '/var/log/openvswitch', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/ovn', 'recurse': True}]}, '/var/lib/kolla/config_files/ovn_metadata_agent.json': {'command': '/usr/bin/networking-ovn-metadata-agent --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --log-file=/var/log/neutron/ovn-metadata-agent.log', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': [{'owner': 'neutron:neutron', 'path': '/var/log/neutron', 'recurse': True}, {'owner': 'neutron:neutron', 'path': '/var/lib/neutron', 'recurse': True}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/certs/ovn_metadata.crt', 'perm': '0644'}, {'optional': True, 'owner': 'neutron:neutron', 'path': '/etc/pki/tls/private/ovn_metadata.key', 'perm': '0644'}]}, '/var/lib/kolla/config_files/rsyslog.json': {'command': '/usr/sbin/rsyslogd -n -iNONE', 'config_files': [{'dest': '/', 'merge': True, 'preserve_properties': True, 'source': '/var/lib/kolla/config_files/src/*'}], 'permissions': 
[{'owner': 'root:root', 'path': '/var/lib/rsyslog', 'recurse': True}, {'owner': 'root:root', 'path': '/var/log/rsyslog', 'recurse': True}]}}
Feb 01 07:51:34 np0005604215.localdomain sudo[47064]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:39 np0005604215.localdomain sudo[47112]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxffjmnqcqqdjappvazqpeibhrnctpdq ; /usr/bin/python3
Feb 01 07:51:39 np0005604215.localdomain sudo[47112]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:39 np0005604215.localdomain python3[47114]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:51:39 np0005604215.localdomain sudo[47112]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:39 np0005604215.localdomain sudo[47155]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlvaxspiwgabptwduditbpyynoiqtxcw ; /usr/bin/python3
Feb 01 07:51:39 np0005604215.localdomain sudo[47155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:39 np0005604215.localdomain python3[47157]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932299.1210291-80419-20899947941990/source _original_basename=tmpxinz7x4m follow=False checksum=dfdcc7695edd230e7a2c06fc7b739bfa56506d8f backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:51:39 np0005604215.localdomain sudo[47155]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:40 np0005604215.localdomain sudo[47185]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhzyvxlzargrwqfhfudcysbintgpcvcm ; /usr/bin/python3
Feb 01 07:51:40 np0005604215.localdomain sudo[47185]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:40 np0005604215.localdomain python3[47187]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:51:40 np0005604215.localdomain sudo[47185]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:40 np0005604215.localdomain systemd[35763]: Created slice User Background Tasks Slice.
Feb 01 07:51:40 np0005604215.localdomain systemd[35763]: Starting Cleanup of User's Temporary Files and Directories...
Feb 01 07:51:40 np0005604215.localdomain systemd[35763]: Finished Cleanup of User's Temporary Files and Directories.
Feb 01 07:51:41 np0005604215.localdomain sudo[47236]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ebynvwpkjapiaejiisaxgtyuqxdtminb ; /usr/bin/python3
Feb 01 07:51:41 np0005604215.localdomain sudo[47236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:41 np0005604215.localdomain sudo[47236]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:41 np0005604215.localdomain sudo[47279]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjuqzydnlljwcmerlhvmbgyiiaxvekxw ; /usr/bin/python3
Feb 01 07:51:41 np0005604215.localdomain sudo[47279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:41 np0005604215.localdomain sudo[47279]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:42 np0005604215.localdomain sudo[47309]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-koiokvpooskzcoyqjtpuiioxpuiwfjmt ; /usr/bin/python3
Feb 01 07:51:42 np0005604215.localdomain sudo[47309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:42 np0005604215.localdomain python3[47311]: ansible-file Invoked with path=/var/lib/container-puppet state=directory setype=container_file_t selevel=s0 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 07:51:42 np0005604215.localdomain sudo[47309]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:42 np0005604215.localdomain sudo[47357]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vljwivgduhfbbujaozdjzkgmtmksxjey ; /usr/bin/python3
Feb 01 07:51:42 np0005604215.localdomain sudo[47357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:43 np0005604215.localdomain sudo[47357]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:43 np0005604215.localdomain sudo[47400]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftbsabdvjpysvaxaybmcuizzocnlxzas ; /usr/bin/python3
Feb 01 07:51:43 np0005604215.localdomain sudo[47400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:43 np0005604215.localdomain sudo[47400]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:43 np0005604215.localdomain sudo[47430]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbhfnzbsqlkmowysiacrtgjalgqqasjt ; /usr/bin/python3
Feb 01 07:51:43 np0005604215.localdomain sudo[47430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:43 np0005604215.localdomain python3[47432]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 01 07:51:44 np0005604215.localdomain sudo[47430]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:45 np0005604215.localdomain sudo[47446]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlcrfqrdxpyixplhymsanjnptghwlksh ; /usr/bin/python3
Feb 01 07:51:45 np0005604215.localdomain sudo[47446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:46 np0005604215.localdomain python3[47448]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -q lvm2 _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:51:46 np0005604215.localdomain sudo[47446]: pam_unix(sudo:session): session closed for user root
Feb 01 07:51:46 np0005604215.localdomain sudo[47463]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jaalbezrivnbbgxzmnpkqbqqahfognbx ; /usr/bin/python3
Feb 01 07:51:46 np0005604215.localdomain sudo[47463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:51:47 np0005604215.localdomain python3[47465]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 07:51:51 np0005604215.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Feb 01 07:51:51 np0005604215.localdomain dbus-broker-launch[14398]: Noticed file-system modification, trigger reload.
Feb 01 07:51:51 np0005604215.localdomain dbus-broker-launch[14398]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Feb 01 07:51:51 np0005604215.localdomain dbus-broker-launch[14398]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Feb 01 07:51:51 np0005604215.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Feb 01 07:51:51 np0005604215.localdomain systemd[1]: Reexecuting.
Feb 01 07:51:51 np0005604215.localdomain systemd[1]: systemd 252-14.el9_2.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Feb 01 07:51:51 np0005604215.localdomain systemd[1]: Detected virtualization kvm.
Feb 01 07:51:51 np0005604215.localdomain systemd[1]: Detected architecture x86-64.
Feb 01 07:51:51 np0005604215.localdomain systemd-rc-local-generator[47519]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:51:51 np0005604215.localdomain systemd-sysv-generator[47522]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:51:51 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:51:59 np0005604215.localdomain kernel: SELinux:  Converting 2705 SID table entries...
Feb 01 07:51:59 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 07:51:59 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 07:51:59 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 07:51:59 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 07:51:59 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 07:51:59 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 07:51:59 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 07:51:59 np0005604215.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Feb 01 07:51:59 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Feb 01 07:51:59 np0005604215.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:52:01 np0005604215.localdomain systemd-rc-local-generator[47613]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:52:01 np0005604215.localdomain systemd-sysv-generator[47620]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 07:52:01 np0005604215.localdomain systemd-journald[619]: Journal stopped
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Stopping Journal Service...
Feb 01 07:52:01 np0005604215.localdomain systemd-journald[619]: Received SIGTERM from PID 1 (systemd).
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: systemd-journald.service: Deactivated successfully.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Stopped Journal Service.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: systemd-journald.service: Consumed 1.714s CPU time.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Starting Journal Service...
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: systemd-udevd.service: Deactivated successfully.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: systemd-udevd.service: Consumed 2.848s CPU time.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Starting Rule-based Manager for Device Events and Files...
Feb 01 07:52:01 np0005604215.localdomain systemd-journald[47940]: Journal started
Feb 01 07:52:01 np0005604215.localdomain systemd-journald[47940]: Runtime Journal (/run/log/journal/00836dadc27b01f9fb0a211cca69e688) is 12.2M, max 314.7M, 302.5M free.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Started Journal Service.
Feb 01 07:52:01 np0005604215.localdomain systemd-journald[47940]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.4 (251 of 333 items), suggesting rotation.
Feb 01 07:52:01 np0005604215.localdomain systemd-journald[47940]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 01 07:52:01 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 07:52:01 np0005604215.localdomain systemd-udevd[47944]: Using default interface naming scheme 'rhel-9.0'.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Started Rule-based Manager for Device Events and Files.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:52:01 np0005604215.localdomain systemd-rc-local-generator[48530]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:52:01 np0005604215.localdomain systemd-sysv-generator[48534]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:52:01 np0005604215.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 07:52:02 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 07:52:02 np0005604215.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 01 07:52:02 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.336s CPU time.
Feb 01 07:52:02 np0005604215.localdomain systemd[1]: run-r35a3ea8843aa4d688f7a7b87ffa4fd4f.service: Deactivated successfully.
Feb 01 07:52:02 np0005604215.localdomain systemd[1]: run-rcddb9faf836a4d5baf50b47767ff6a4d.service: Deactivated successfully.
Feb 01 07:52:03 np0005604215.localdomain sudo[47463]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:03 np0005604215.localdomain sudo[48953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfzqxocwjavhpmlehokldvdeszxnqmym ; /usr/bin/python3
Feb 01 07:52:03 np0005604215.localdomain sudo[48953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:52:03 np0005604215.localdomain python3[48955]: ansible-sysctl Invoked with name=vm.unprivileged_userfaultfd reload=True state=present sysctl_file=/etc/sysctl.d/99-tripleo-postcopy.conf sysctl_set=True value=1 ignoreerrors=False
Feb 01 07:52:03 np0005604215.localdomain sudo[48953]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:04 np0005604215.localdomain sudo[48972]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tndefvemuqvrnvjcfrvmqnlbkwuwqycu ; /usr/bin/python3
Feb 01 07:52:04 np0005604215.localdomain sudo[48972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:52:04 np0005604215.localdomain python3[48974]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-active ksm.service || systemctl is-enabled ksm.service _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 07:52:04 np0005604215.localdomain sudo[48972]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:04 np0005604215.localdomain sudo[48990]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rdikudsoyhughpcqjllgnzfqpoiaanqv ; /usr/bin/python3
Feb 01 07:52:04 np0005604215.localdomain sudo[48990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:52:05 np0005604215.localdomain python3[48992]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 01 07:52:05 np0005604215.localdomain python3[48992]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 --format json
Feb 01 07:52:05 np0005604215.localdomain python3[48992]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 -q --tls-verify=false
Feb 01 07:52:12 np0005604215.localdomain podman[49004]: 2026-02-01 07:52:05.215048027 +0000 UTC m=+0.043943973 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 01 07:52:12 np0005604215.localdomain python3[48992]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 591bb9fb46a70e9f840f28502388406078442df6b6701a3c17990ee75e333673 --format json
Feb 01 07:52:12 np0005604215.localdomain sudo[48990]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:12 np0005604215.localdomain sudo[49103]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eymowdgbwgeoazeblxoesgycrldlmonw ; /usr/bin/python3
Feb 01 07:52:12 np0005604215.localdomain sudo[49103]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:52:12 np0005604215.localdomain python3[49105]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 01 07:52:12 np0005604215.localdomain python3[49105]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 --format json
Feb 01 07:52:12 np0005604215.localdomain python3[49105]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 -q --tls-verify=false
Feb 01 07:52:20 np0005604215.localdomain podman[49118]: 2026-02-01 07:52:12.795696836 +0000 UTC m=+0.042923001 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 01 07:52:20 np0005604215.localdomain python3[49105]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect d59b33e7fb841c47a47a12b18fb68b11debd968b4596c63f3177ecc7400fb1bc --format json
Feb 01 07:52:20 np0005604215.localdomain sudo[49103]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:20 np0005604215.localdomain sudo[49218]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgqsprlsdkzmbqsfvlnfhxzrofyufmdp ; /usr/bin/python3
Feb 01 07:52:20 np0005604215.localdomain sudo[49218]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:52:20 np0005604215.localdomain python3[49220]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 01 07:52:20 np0005604215.localdomain python3[49220]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 --format json
Feb 01 07:52:20 np0005604215.localdomain python3[49220]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 -q --tls-verify=false
Feb 01 07:52:25 np0005604215.localdomain sudo[49273]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:52:25 np0005604215.localdomain sudo[49273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:52:25 np0005604215.localdomain sudo[49273]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:25 np0005604215.localdomain sudo[49298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 07:52:25 np0005604215.localdomain sudo[49298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:52:29 np0005604215.localdomain podman[49544]: 2026-02-01 07:52:29.020878774 +0000 UTC m=+0.062107022 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.buildah.version=1.41.4, name=rhceph, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, RELEASE=main, GIT_BRANCH=main, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 07:52:29 np0005604215.localdomain podman[49544]: 2026-02-01 07:52:29.141548033 +0000 UTC m=+0.182776271 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, vcs-type=git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, RELEASE=main, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, com.redhat.component=rhceph-container)
Feb 01 07:52:36 np0005604215.localdomain podman[49234]: 2026-02-01 07:52:20.592689317 +0000 UTC m=+0.041650582 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 07:52:36 np0005604215.localdomain python3[49220]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 6eddd23e1e6adfbfa713a747123707c02f92ffdbf1913da92f171aba1d6d7856 --format json
Feb 01 07:52:36 np0005604215.localdomain sudo[49298]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:36 np0005604215.localdomain sudo[49218]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:36 np0005604215.localdomain sudo[49889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:52:36 np0005604215.localdomain sudo[49889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:52:36 np0005604215.localdomain sudo[49889]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:36 np0005604215.localdomain sudo[49904]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:52:36 np0005604215.localdomain sudo[49904]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:52:36 np0005604215.localdomain sudo[49932]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpqnjjvvperrusokfjivraenmdchdfku ; /usr/bin/python3
Feb 01 07:52:36 np0005604215.localdomain sudo[49932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:52:36 np0005604215.localdomain python3[49934]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 01 07:52:36 np0005604215.localdomain python3[49934]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 --format json
Feb 01 07:52:36 np0005604215.localdomain python3[49934]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 -q --tls-verify=false
Feb 01 07:52:37 np0005604215.localdomain sudo[49904]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:38 np0005604215.localdomain sudo[50017]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:52:38 np0005604215.localdomain sudo[50017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:52:38 np0005604215.localdomain sudo[50017]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:50 np0005604215.localdomain podman[49955]: 2026-02-01 07:52:36.908977849 +0000 UTC m=+0.046318627 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 01 07:52:50 np0005604215.localdomain python3[49934]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 2c8610235afe953aa46efb141a5a988799548b22280d65a7e7ab21889422df37 --format json
Feb 01 07:52:51 np0005604215.localdomain sudo[49932]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:51 np0005604215.localdomain sudo[50289]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijxybuuifzuwjwbiujptuwgcdlcnweyj ; /usr/bin/python3
Feb 01 07:52:51 np0005604215.localdomain sudo[50289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:52:51 np0005604215.localdomain python3[50291]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 01 07:52:51 np0005604215.localdomain python3[50291]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 --format json
Feb 01 07:52:51 np0005604215.localdomain python3[50291]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 -q --tls-verify=false
Feb 01 07:52:57 np0005604215.localdomain podman[50304]: 2026-02-01 07:52:51.523833728 +0000 UTC m=+0.045427520 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 01 07:52:57 np0005604215.localdomain python3[50291]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ab5aab6d0c3ec80926032b7acf4cec1d4710f1c2daccd17ae4daa64399ec237 --format json
Feb 01 07:52:57 np0005604215.localdomain sudo[50289]: pam_unix(sudo:session): session closed for user root
Feb 01 07:52:57 np0005604215.localdomain sudo[50393]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oltnnesqugkgepgopsbgbmtwblrufpqx ; /usr/bin/python3
Feb 01 07:52:57 np0005604215.localdomain sudo[50393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:52:58 np0005604215.localdomain python3[50395]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 01 07:52:58 np0005604215.localdomain python3[50395]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 --format json
Feb 01 07:52:58 np0005604215.localdomain python3[50395]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1 -q --tls-verify=false
Feb 01 07:53:02 np0005604215.localdomain podman[50408]: 2026-02-01 07:52:58.163170149 +0000 UTC m=+0.030859274 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 01 07:53:02 np0005604215.localdomain python3[50395]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 4853142d85dba3766b28d28ae195b26f7242230fe3646e9590a7aee2dc2e0dfa --format json
Feb 01 07:53:02 np0005604215.localdomain sudo[50393]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:02 np0005604215.localdomain sudo[50485]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyhkzyeyieaheijngyqabsfyqywtucpd ; /usr/bin/python3
Feb 01 07:53:02 np0005604215.localdomain sudo[50485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:02 np0005604215.localdomain python3[50487]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 01 07:53:02 np0005604215.localdomain python3[50487]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 --format json
Feb 01 07:53:02 np0005604215.localdomain python3[50487]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1 -q --tls-verify=false
Feb 01 07:53:05 np0005604215.localdomain podman[50500]: 2026-02-01 07:53:03.032631179 +0000 UTC m=+0.044520952 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 01 07:53:05 np0005604215.localdomain python3[50487]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 9ac6ea63c0fb4851145e847f9ced2f20804afc8472907b63a82d5866f5cf608a --format json
Feb 01 07:53:05 np0005604215.localdomain sudo[50485]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:05 np0005604215.localdomain sudo[50576]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfnqkajkvgisbfgsdsfagyxofrwweeka ; /usr/bin/python3
Feb 01 07:53:05 np0005604215.localdomain sudo[50576]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:05 np0005604215.localdomain python3[50578]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 01 07:53:05 np0005604215.localdomain python3[50578]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 --format json
Feb 01 07:53:05 np0005604215.localdomain python3[50578]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1 -q --tls-verify=false
Feb 01 07:53:07 np0005604215.localdomain podman[50591]: 2026-02-01 07:53:05.654741421 +0000 UTC m=+0.037856324 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 01 07:53:07 np0005604215.localdomain python3[50578]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect ba1a08ea1c1207b471b1f02cee16ff456b8a812662cce16906d16de330a66d63 --format json
Feb 01 07:53:07 np0005604215.localdomain sudo[50576]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:07 np0005604215.localdomain sudo[50666]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbicgcaxjeekkvuqvpnzuwgykqwobkrr ; /usr/bin/python3
Feb 01 07:53:07 np0005604215.localdomain sudo[50666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:08 np0005604215.localdomain python3[50668]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 01 07:53:08 np0005604215.localdomain python3[50668]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 --format json
Feb 01 07:53:08 np0005604215.localdomain python3[50668]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1 -q --tls-verify=false
Feb 01 07:53:10 np0005604215.localdomain podman[50681]: 2026-02-01 07:53:08.248677973 +0000 UTC m=+0.042281472 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Feb 01 07:53:10 np0005604215.localdomain python3[50668]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 8576d3a17e57ea28f29435f132f583320941b5aa7bf0aa02e998b09a094d1fe8 --format json
Feb 01 07:53:10 np0005604215.localdomain sudo[50666]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:10 np0005604215.localdomain sudo[50758]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdmrnargaoilxxewrowjnnzpobjczwch ; /usr/bin/python3
Feb 01 07:53:10 np0005604215.localdomain sudo[50758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:11 np0005604215.localdomain python3[50760]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 01 07:53:11 np0005604215.localdomain python3[50760]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 --format json
Feb 01 07:53:11 np0005604215.localdomain python3[50760]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 -q --tls-verify=false
Feb 01 07:53:14 np0005604215.localdomain podman[50772]: 2026-02-01 07:53:11.141874692 +0000 UTC m=+0.042903580 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 01 07:53:14 np0005604215.localdomain python3[50760]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 7fcbf63c0504494c8fcaa07583f909a06486472a0982aeac9554c6fdbeb04c9a --format json
Feb 01 07:53:14 np0005604215.localdomain sudo[50758]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:14 np0005604215.localdomain sudo[50862]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywufzsxcmdoaytmnsvuxvkddvvwxthsy ; /usr/bin/python3
Feb 01 07:53:14 np0005604215.localdomain sudo[50862]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:15 np0005604215.localdomain python3[50864]: ansible-containers.podman.podman_image Invoked with force=True name=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 validate_certs=False tag=latest pull=True push=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'volume': None, 'extra_args': None} push_args={'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'transport': None} path=None auth_file=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None
Feb 01 07:53:15 np0005604215.localdomain python3[50864]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman image ls registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 --format json
Feb 01 07:53:15 np0005604215.localdomain python3[50864]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman pull registry.redhat.io/rhosp-rhel9/openstack-cron:17.1 -q --tls-verify=false
Feb 01 07:53:17 np0005604215.localdomain podman[50876]: 2026-02-01 07:53:15.15680301 +0000 UTC m=+0.043021645 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 01 07:53:17 np0005604215.localdomain python3[50864]: ansible-containers.podman.podman_image PODMAN-IMAGE-DEBUG: /bin/podman inspect 72ddf109f135b64d3116af7b84caaa358dc72e2e60f4c8753fa54fa65b76ba35 --format json
Feb 01 07:53:17 np0005604215.localdomain sudo[50862]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:17 np0005604215.localdomain sudo[50953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttfdajjyucwdlhphfvkckegdelbbjxyl ; /usr/bin/python3
Feb 01 07:53:17 np0005604215.localdomain sudo[50953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:17 np0005604215.localdomain python3[50955]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_1 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:53:17 np0005604215.localdomain sudo[50953]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:18 np0005604215.localdomain sudo[51003]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbvztnofuhhlzazknayghfzsxwbifwbl ; /usr/bin/python3
Feb 01 07:53:18 np0005604215.localdomain sudo[51003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:18 np0005604215.localdomain sudo[51003]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:18 np0005604215.localdomain sudo[51021]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xchegkpcoqhpimgvoiizrmjsxtimgque ; /usr/bin/python3
Feb 01 07:53:18 np0005604215.localdomain sudo[51021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:18 np0005604215.localdomain sudo[51021]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:19 np0005604215.localdomain sudo[51126]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-solsyftacmpzcylspaglfftclcucyyzs ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932398.9788837-83068-65418368192919/async_wrapper.py 353297964674 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932398.9788837-83068-65418368192919/AnsiballZ_command.py _
Feb 01 07:53:19 np0005604215.localdomain sudo[51126]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 01 07:53:19 np0005604215.localdomain ansible-async_wrapper.py[51128]: Invoked with 353297964674 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932398.9788837-83068-65418368192919/AnsiballZ_command.py _
Feb 01 07:53:19 np0005604215.localdomain ansible-async_wrapper.py[51131]: Starting module and watcher
Feb 01 07:53:19 np0005604215.localdomain ansible-async_wrapper.py[51131]: Start watching 51132 (3600)
Feb 01 07:53:19 np0005604215.localdomain ansible-async_wrapper.py[51132]: Start module (51132)
Feb 01 07:53:19 np0005604215.localdomain ansible-async_wrapper.py[51128]: Return async_wrapper task started.
Feb 01 07:53:19 np0005604215.localdomain sudo[51126]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:19 np0005604215.localdomain sudo[51150]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlamknhbdapdyjowsklymqdwrvwvnzxs ; /usr/bin/python3
Feb 01 07:53:19 np0005604215.localdomain sudo[51150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:20 np0005604215.localdomain python3[51152]: ansible-ansible.legacy.async_status Invoked with jid=353297964674.51128 mode=status _async_dir=/tmp/.ansible_async
Feb 01 07:53:20 np0005604215.localdomain sudo[51150]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:    (file: /etc/puppet/hiera.yaml)
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Warning: Undefined variable '::deploy_config_name';
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:    (file & line not available)
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:    (file & line not available)
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.12 seconds
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Exec[directory-create-etc-my.cnf.d]/returns: executed successfully
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/File[/etc/my.cnf.d/tripleo.cnf]/ensure: created
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Notice: /Stage[main]/Tripleo::Profile::Base::Database::Mysql::Client/Augeas[tripleo-mysql-client-conf]/returns: executed successfully
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Notice: Applied catalog in 0.08 seconds
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Application:
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:    Initial environment: production
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:    Converged environment: production
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:          Run mode: user
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Changes:
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:             Total: 3
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Events:
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:           Success: 3
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:             Total: 3
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Resources:
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:           Changed: 3
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:       Out of sync: 3
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:             Total: 10
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Time:
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:          Schedule: 0.00
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:              File: 0.00
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:              Exec: 0.01
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:            Augeas: 0.05
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:    Transaction evaluation: 0.08
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:    Catalog application: 0.08
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:    Config retrieval: 0.16
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:          Last run: 1769932403
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:        Filebucket: 0.00
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:             Total: 0.08
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]: Version:
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:            Config: 1769932403
Feb 01 07:53:23 np0005604215.localdomain puppet-user[51136]:            Puppet: 7.10.0
Feb 01 07:53:23 np0005604215.localdomain ansible-async_wrapper.py[51132]: Module complete (51132)
Feb 01 07:53:24 np0005604215.localdomain ansible-async_wrapper.py[51131]: Done in kid B.
Feb 01 07:53:30 np0005604215.localdomain sudo[51277]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqgmwlcbxrsarwtulfysyllrqjyokkrc ; /usr/bin/python3
Feb 01 07:53:30 np0005604215.localdomain sudo[51277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:30 np0005604215.localdomain python3[51279]: ansible-ansible.legacy.async_status Invoked with jid=353297964674.51128 mode=status _async_dir=/tmp/.ansible_async
Feb 01 07:53:30 np0005604215.localdomain sudo[51277]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:30 np0005604215.localdomain sudo[51293]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjkmewxefhmkryowsdjxvpteahnizmrk ; /usr/bin/python3
Feb 01 07:53:30 np0005604215.localdomain sudo[51293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:30 np0005604215.localdomain python3[51295]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 07:53:30 np0005604215.localdomain sudo[51293]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:31 np0005604215.localdomain sudo[51309]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozjkwdjnafqofqikieromudwtacgakbq ; /usr/bin/python3
Feb 01 07:53:31 np0005604215.localdomain sudo[51309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:31 np0005604215.localdomain python3[51311]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:53:31 np0005604215.localdomain sudo[51309]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:31 np0005604215.localdomain sudo[51357]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cegdzojeklemwftmpidghyqmiacvuami ; /usr/bin/python3
Feb 01 07:53:31 np0005604215.localdomain sudo[51357]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:31 np0005604215.localdomain python3[51359]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:53:31 np0005604215.localdomain sudo[51357]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:32 np0005604215.localdomain sudo[51400]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-okdlxwnglknadzbyorpucifveiiubjto ; /usr/bin/python3
Feb 01 07:53:32 np0005604215.localdomain sudo[51400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:32 np0005604215.localdomain python3[51402]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/container-puppet/puppetlabs/facter.conf setype=svirt_sandbox_file_t selevel=s0 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932411.471384-83431-118320226345827/source _original_basename=tmpzp4p8yt5 follow=False checksum=53908622cb869db5e2e2a68e737aa2ab1a872111 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 07:53:32 np0005604215.localdomain sudo[51400]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:32 np0005604215.localdomain sudo[51430]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujficsjosbufwypsczutqjaayjhvagnf ; /usr/bin/python3
Feb 01 07:53:32 np0005604215.localdomain sudo[51430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:32 np0005604215.localdomain python3[51432]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:53:32 np0005604215.localdomain sudo[51430]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:32 np0005604215.localdomain sudo[51446]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suhllflmbvfwdqlofobzqiuoatmnpvej ; /usr/bin/python3
Feb 01 07:53:32 np0005604215.localdomain sudo[51446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:33 np0005604215.localdomain sudo[51446]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:33 np0005604215.localdomain sudo[51533]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grxuofifahxgfugiyvifaaghncarozkr ; /usr/bin/python3
Feb 01 07:53:33 np0005604215.localdomain sudo[51533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:33 np0005604215.localdomain python3[51535]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 01 07:53:34 np0005604215.localdomain sudo[51533]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:34 np0005604215.localdomain sudo[51552]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fptooinlvabdpofruhtzmohcvxqgdyes ; /usr/bin/python3
Feb 01 07:53:34 np0005604215.localdomain sudo[51552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:34 np0005604215.localdomain python3[51554]: ansible-file Invoked with path=/var/lib/tripleo-config/container-puppet-config mode=448 recurse=True setype=container_file_t force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 07:53:34 np0005604215.localdomain sudo[51552]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:34 np0005604215.localdomain sudo[51568]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdneordrvnsndigohrkmkrbplwhnsewl ; /usr/bin/python3
Feb 01 07:53:34 np0005604215.localdomain sudo[51568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:34 np0005604215.localdomain python3[51570]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=False puppet_config=/var/lib/container-puppet/container-puppet.json short_hostname=np0005604215 step=1 update_config_hash_only=False
Feb 01 07:53:34 np0005604215.localdomain sudo[51568]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:35 np0005604215.localdomain sudo[51584]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lyeoevadcltrxiarlgkuyaosnbgzgdfd ; /usr/bin/python3
Feb 01 07:53:35 np0005604215.localdomain sudo[51584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:35 np0005604215.localdomain python3[51586]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:53:35 np0005604215.localdomain sudo[51584]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:35 np0005604215.localdomain sudo[51600]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xyzckhpnvujhhljmtbtizcsematzczkz ; /usr/bin/python3
Feb 01 07:53:35 np0005604215.localdomain sudo[51600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:36 np0005604215.localdomain python3[51602]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 01 07:53:36 np0005604215.localdomain sudo[51600]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:36 np0005604215.localdomain sudo[51616]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avbjqggkawvepayrznirtbotcvgpcrvp ; /usr/bin/python3
Feb 01 07:53:36 np0005604215.localdomain sudo[51616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:36 np0005604215.localdomain python3[51618]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Feb 01 07:53:37 np0005604215.localdomain sudo[51616]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:37 np0005604215.localdomain sudo[51657]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqalvyllhdetuobotnvuxzuvpnwijodr ; /usr/bin/python3
Feb 01 07:53:37 np0005604215.localdomain sudo[51657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:38 np0005604215.localdomain python3[51659]: ansible-tripleo_container_manage Invoked with config_id=tripleo_puppet_step1 config_dir=/var/lib/tripleo-config/container-puppet-config/step_1 config_patterns=container-puppet-*.json config_overrides={} concurrency=6 log_base_path=/var/log/containers/stdouts debug=False
Feb 01 07:53:38 np0005604215.localdomain podman[51834]: 2026-02-01 07:53:38.363021799 +0000 UTC m=+0.109011335 container create 416498f422383064ad86a44268caa4d4714236e931b56a3198ace870ec017bf2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, config_id=tripleo_puppet_step1, url=https://www.redhat.com, release=1766032510, architecture=x86_64, io.openshift.expose-services=, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, container_name=container-puppet-collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 07:53:38 np0005604215.localdomain podman[51834]: 2026-02-01 07:53:38.284575661 +0000 UTC m=+0.030565207 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 01 07:53:38 np0005604215.localdomain podman[51852]: 2026-02-01 07:53:38.396858937 +0000 UTC m=+0.115232489 container create 11651b540d5228a8dc09a1dd29082efa8101b78da1dee1533bd26f2650e9cd27 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, release=1766032510, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1, io.buildah.version=1.41.5, container_name=container-puppet-metrics_qdr, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Feb 01 07:53:38 np0005604215.localdomain systemd[1]: Started libpod-conmon-416498f422383064ad86a44268caa4d4714236e931b56a3198ace870ec017bf2.scope.
Feb 01 07:53:38 np0005604215.localdomain podman[51836]: 2026-02-01 07:53:38.303060709 +0000 UTC m=+0.041560106 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 07:53:38 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:53:38 np0005604215.localdomain podman[51836]: 2026-02-01 07:53:38.419954665 +0000 UTC m=+0.158454062 container create a702590bfcf1f706e901ffe0737cf048f7386190369052664149dd19c5bfbecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, config_id=tripleo_puppet_step1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, container_name=container-puppet-nova_libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, release=1766032510, build-date=2026-01-12T23:31:49Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 07:53:38 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/193d63b6dd9579507d9f1518ccbcb97a99c18e05e53fbccdc25e375b68ff02d6/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 01 07:53:38 np0005604215.localdomain podman[51864]: 2026-02-01 07:53:38.426360395 +0000 UTC m=+0.137324001 container create e1a6e99a8ee469635366e7a4cea70fd0827e6234b5619dd2b41de273d6a79219 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=container-puppet-crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron)
Feb 01 07:53:38 np0005604215.localdomain systemd[1]: Started libpod-conmon-11651b540d5228a8dc09a1dd29082efa8101b78da1dee1533bd26f2650e9cd27.scope.
Feb 01 07:53:38 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:53:38 np0005604215.localdomain podman[51834]: 2026-02-01 07:53:38.436818487 +0000 UTC m=+0.182808013 container init 416498f422383064ad86a44268caa4d4714236e931b56a3198ace870ec017bf2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, container_name=container-puppet-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, distribution-scope=public, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.5, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude 
tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.13)
Feb 01 07:53:38 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/45fdf082ac490d270b07fd0f17cf89cd8bf1d13e0d604cb75e37ccaf54fab194/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 01 07:53:38 np0005604215.localdomain podman[51855]: 2026-02-01 07:53:38.442923037 +0000 UTC m=+0.152778065 container create 1f16e0e1e2e3a2dd4b5e420bfd21338a28394069efeefa12bd8c15ddab6dbcb9 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_id=tripleo_puppet_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=container-puppet-iscsid, release=1766032510)
Feb 01 07:53:38 np0005604215.localdomain systemd[1]: Started libpod-conmon-a702590bfcf1f706e901ffe0737cf048f7386190369052664149dd19c5bfbecb.scope.
Feb 01 07:53:38 np0005604215.localdomain podman[51834]: 2026-02-01 07:53:38.447983602 +0000 UTC m=+0.193973158 container start 416498f422383064ad86a44268caa4d4714236e931b56a3198ace870ec017bf2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, version=17.1.13, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, architecture=x86_64, container_name=container-puppet-collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_puppet_step1)
Feb 01 07:53:38 np0005604215.localdomain podman[51834]: 2026-02-01 07:53:38.448157008 +0000 UTC m=+0.194146564 container attach 416498f422383064ad86a44268caa4d4714236e931b56a3198ace870ec017bf2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=container-puppet-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 07:53:38 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:53:38 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/892d1779a7f946097f73616f672cd69c2781ff491e090964134e591e5adb1a86/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 01 07:53:38 np0005604215.localdomain podman[51864]: 2026-02-01 07:53:38.358506743 +0000 UTC m=+0.069470389 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 01 07:53:38 np0005604215.localdomain podman[51852]: 2026-02-01 07:53:38.358223484 +0000 UTC m=+0.076597056 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 01 07:53:38 np0005604215.localdomain podman[51855]: 2026-02-01 07:53:38.36158826 +0000 UTC m=+0.071443368 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 01 07:53:38 np0005604215.localdomain podman[51836]: 2026-02-01 07:53:38.462453902 +0000 UTC m=+0.200953299 container init a702590bfcf1f706e901ffe0737cf048f7386190369052664149dd19c5bfbecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=container-puppet-nova_libvirt, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, config_id=tripleo_puppet_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 
'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 01 07:53:38 np0005604215.localdomain podman[51836]: 2026-02-01 07:53:38.479487979 +0000 UTC m=+0.217987366 container start a702590bfcf1f706e901ffe0737cf048f7386190369052664149dd19c5bfbecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., container_name=container-puppet-nova_libvirt, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, io.buildah.version=1.41.5, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 
'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 07:53:38 np0005604215.localdomain podman[51836]: 2026-02-01 07:53:38.479709047 +0000 UTC m=+0.218208444 container attach a702590bfcf1f706e901ffe0737cf048f7386190369052664149dd19c5bfbecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, container_name=container-puppet-nova_libvirt, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude 
tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z)
Feb 01 07:53:39 np0005604215.localdomain sudo[51928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:53:39 np0005604215.localdomain sudo[51928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:53:39 np0005604215.localdomain sudo[51928]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:39 np0005604215.localdomain sudo[51943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:53:39 np0005604215.localdomain sudo[51943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:53:39 np0005604215.localdomain systemd[1]: Started libpod-conmon-e1a6e99a8ee469635366e7a4cea70fd0827e6234b5619dd2b41de273d6a79219.scope.
Feb 01 07:53:39 np0005604215.localdomain systemd[1]: Started libpod-conmon-1f16e0e1e2e3a2dd4b5e420bfd21338a28394069efeefa12bd8c15ddab6dbcb9.scope.
Feb 01 07:53:39 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:53:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4fd9ea2ebfbeb4119560e74e5b0456fd618118c9f72a7ecf288a55a3e1a95413/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 01 07:53:39 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:53:39 np0005604215.localdomain podman[51852]: 2026-02-01 07:53:39.571388383 +0000 UTC m=+1.289761925 container init 11651b540d5228a8dc09a1dd29082efa8101b78da1dee1533bd26f2650e9cd27 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, container_name=container-puppet-metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 07:53:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/311e95bb3bdbe1bb40730cbc80ffa3861fcafc1265b18b49a2e8169fc5d3cbf1/merged/tmp/iscsi.host supports timestamps until 2038 (0x7fffffff)
Feb 01 07:53:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/311e95bb3bdbe1bb40730cbc80ffa3861fcafc1265b18b49a2e8169fc5d3cbf1/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 01 07:53:39 np0005604215.localdomain podman[51864]: 2026-02-01 07:53:39.579367259 +0000 UTC m=+1.290330905 container init e1a6e99a8ee469635366e7a4cea70fd0827e6234b5619dd2b41de273d6a79219 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_puppet_step1, container_name=container-puppet-crond, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git)
Feb 01 07:53:39 np0005604215.localdomain podman[51855]: 2026-02-01 07:53:39.584400162 +0000 UTC m=+1.294255210 container init 1f16e0e1e2e3a2dd4b5e420bfd21338a28394069efeefa12bd8c15ddab6dbcb9 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_puppet_step1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, version=17.1.13, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, container_name=container-puppet-iscsid, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 07:53:39 np0005604215.localdomain podman[51864]: 2026-02-01 07:53:39.589973134 +0000 UTC m=+1.300936790 container start e1a6e99a8ee469635366e7a4cea70fd0827e6234b5619dd2b41de273d6a79219 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, container_name=container-puppet-crond, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_puppet_step1, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 01 07:53:39 np0005604215.localdomain podman[51864]: 2026-02-01 07:53:39.590382408 +0000 UTC m=+1.301346054 container attach e1a6e99a8ee469635366e7a4cea70fd0827e6234b5619dd2b41de273d6a79219 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, architecture=x86_64, managed_by=tripleo_ansible, config_id=tripleo_puppet_step1, name=rhosp-rhel9/openstack-cron, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 07:53:39 np0005604215.localdomain podman[51855]: 2026-02-01 07:53:39.5953464 +0000 UTC m=+1.305201458 container start 1f16e0e1e2e3a2dd4b5e420bfd21338a28394069efeefa12bd8c15ddab6dbcb9 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, container_name=container-puppet-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', 
'/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 07:53:39 np0005604215.localdomain podman[51855]: 2026-02-01 07:53:39.595576428 +0000 UTC m=+1.305431486 container attach 1f16e0e1e2e3a2dd4b5e420bfd21338a28394069efeefa12bd8c15ddab6dbcb9 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, container_name=container-puppet-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, config_id=tripleo_puppet_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 01 07:53:39 np0005604215.localdomain podman[51852]: 2026-02-01 07:53:39.632722161 +0000 UTC m=+1.351095693 container start 11651b540d5228a8dc09a1dd29082efa8101b78da1dee1533bd26f2650e9cd27 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, container_name=container-puppet-metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_id=tripleo_puppet_step1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 07:53:39 np0005604215.localdomain podman[51852]: 2026-02-01 07:53:39.633304141 +0000 UTC m=+1.351677703 container attach 11651b540d5228a8dc09a1dd29082efa8101b78da1dee1533bd26f2650e9cd27 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, 
distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, container_name=container-puppet-metrics_qdr, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_puppet_step1)
Feb 01 07:53:39 np0005604215.localdomain sudo[51943]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:40 np0005604215.localdomain podman[51732]: 2026-02-01 07:53:38.207676956 +0000 UTC m=+0.055603690 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Feb 01 07:53:40 np0005604215.localdomain podman[52110]: 2026-02-01 07:53:40.626726464 +0000 UTC m=+0.085993529 container create 5254c7db874e6a04303dfb1bd6b2faa423e5c8db607a20f3689713c64f535b91 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-central-container, release=1766032510, vcs-type=git, build-date=2026-01-12T23:07:24Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:24Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', 
'/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, container_name=container-puppet-ceilometer, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-central, version=17.1.13, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public)
Feb 01 07:53:40 np0005604215.localdomain systemd[1]: Started libpod-conmon-5254c7db874e6a04303dfb1bd6b2faa423e5c8db607a20f3689713c64f535b91.scope.
Feb 01 07:53:40 np0005604215.localdomain podman[52110]: 2026-02-01 07:53:40.57302674 +0000 UTC m=+0.032293815 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Feb 01 07:53:40 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:53:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d8c1697d3f9451811eabeba845d2774ca9523a4c1f6255791f262d42dbea547b/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 01 07:53:40 np0005604215.localdomain podman[52110]: 2026-02-01 07:53:40.704052184 +0000 UTC m=+0.163319219 container init 5254c7db874e6a04303dfb1bd6b2faa423e5c8db607a20f3689713c64f535b91 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, build-date=2026-01-12T23:07:24Z, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:24Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=container-puppet-ceilometer, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-type=git)
Feb 01 07:53:40 np0005604215.localdomain podman[52110]: 2026-02-01 07:53:40.716583946 +0000 UTC m=+0.175851031 container start 5254c7db874e6a04303dfb1bd6b2faa423e5c8db607a20f3689713c64f535b91 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_puppet_step1, build-date=2026-01-12T23:07:24Z, 
com.redhat.component=openstack-ceilometer-central-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, container_name=container-puppet-ceilometer, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:24Z, name=rhosp-rhel9/openstack-ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, url=https://www.redhat.com, version=17.1.13)
Feb 01 07:53:40 np0005604215.localdomain podman[52110]: 2026-02-01 07:53:40.720066797 +0000 UTC m=+0.179333832 container attach 5254c7db874e6a04303dfb1bd6b2faa423e5c8db607a20f3689713c64f535b91 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, org.opencontainers.image.created=2026-01-12T23:07:24Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-central, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:07:24Z, container_name=container-puppet-ceilometer, config_id=tripleo_puppet_step1, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-central-container)
Feb 01 07:53:41 np0005604215.localdomain ovs-vsctl[52167]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]:    (file: /etc/puppet/hiera.yaml)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: Undefined variable '::deploy_config_name';
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]:    (file & line not available)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]:    (file & line not available)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]:    (file: /etc/puppet/hiera.yaml)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Warning: Undefined variable '::deploy_config_name';
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]:    (file & line not available)
Feb 01 07:53:41 np0005604215.localdomain systemd[1]: tmp-crun.l8tHPv.mount: Deactivated successfully.
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]:    (file & line not available)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52053]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52053]:    (file: /etc/puppet/hiera.yaml)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52053]: Warning: Undefined variable '::deploy_config_name';
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52053]:    (file & line not available)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52053]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52053]:    (file & line not available)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:    (file: /etc/puppet/hiera.yaml)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Warning: Undefined variable '::deploy_config_name';
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:    (file & line not available)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:    (file: /etc/puppet/hiera.yaml)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Warning: Undefined variable '::deploy_config_name';
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:    (file & line not available)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52053]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.10 seconds
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:    (file & line not available)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.08 seconds
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:    (file & line not available)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52053]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[reset-iscsi-initiator-name]/returns: executed successfully
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: Scope(Class[Nova]): The os_region_name parameter is deprecated and will be removed \
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: in a future release. Use nova::cinder::os_region_name instead
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: Scope(Class[Nova]): The catalog_info parameter is deprecated and will be removed \
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: in a future release. Use nova::cinder::catalog_info instead
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52053]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/File[/etc/iscsi/.initiator_reset]/ensure: created
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/File[/etc/logrotate-crond.conf]/ensure: defined content as '{sha256}1c3202f58bd2ae16cb31badcbb7f0d4e6697157b987d1887736ad96bb73d70b0'
Feb 01 07:53:41 np0005604215.localdomain crontab[52462]: (root) LIST (root)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52053]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Exec[sync-iqn-to-host]/returns: executed successfully
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Notice: /Stage[main]/Tripleo::Profile::Base::Logging::Logrotate/Cron[logrotate-crond]/ensure: created
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Notice: Accepting previously invalid value for target type 'Integer'
Feb 01 07:53:41 np0005604215.localdomain crontab[52463]: (root) REPLACE (root)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Notice: Applied catalog in 0.04 seconds
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Application:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:    Initial environment: production
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:    Converged environment: production
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:          Run mode: user
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Changes:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:             Total: 2
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Events:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:           Success: 2
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:             Total: 2
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Resources:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:           Changed: 2
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:       Out of sync: 2
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:           Skipped: 7
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:             Total: 9
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Time:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:              File: 0.01
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:              Cron: 0.01
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:    Transaction evaluation: 0.04
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:    Catalog application: 0.04
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:    Config retrieval: 0.11
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:          Last run: 1769932421
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:             Total: 0.04
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]: Version:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:            Config: 1769932421
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52058]:            Puppet: 7.10.0
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.17 seconds
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/owner: owner changed 'qdrouterd' to 'root'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/group: group changed 'qdrouterd' to 'root'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Notice: /Stage[main]/Qdr::Config/File[/var/lib/qdrouterd]/mode: mode changed '0700' to '0755'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: Unknown variable: '::nova::compute::verify_glance_signatures'. (file: /etc/puppet/modules/nova/manifests/glance.pp, line: 62, column: 41)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Notice: /Stage[main]/Qdr::Config/File[/etc/qpid-dispatch/ssl]/ensure: created
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Notice: /Stage[main]/Qdr::Config/File[qdrouterd.conf]/content: content changed '{sha256}89e10d8896247f992c5f0baf027c25a8ca5d0441be46d8859d9db2067ea74cd3' to '{sha256}2bddea0b0fe879e5c10f974fc2b3f9b8f40891a25c67e254d0f4c621edb22ff7'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd]/ensure: created
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Notice: /Stage[main]/Qdr::Config/File[/var/log/qdrouterd/metrics_qdr.log]/ensure: created
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Notice: Applied catalog in 0.03 seconds
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Application:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:    Initial environment: production
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:    Converged environment: production
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:          Run mode: user
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Changes:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:             Total: 7
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Events:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:           Success: 7
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:             Total: 7
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Resources:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:           Skipped: 13
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:           Changed: 5
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:       Out of sync: 5
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:             Total: 20
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Time:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:              File: 0.01
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:    Transaction evaluation: 0.03
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:    Catalog application: 0.03
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:    Config retrieval: 0.22
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:          Last run: 1769932421
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:             Total: 0.03
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]: Version:
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:            Config: 1769932421
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52051]:            Puppet: 7.10.0
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_base_images'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 44, column: 5)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_original_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 48, column: 5)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: Unknown variable: '::nova::compute::libvirt::remove_unused_resized_minimum_age_seconds'. (file: /etc/puppet/modules/nova/manifests/compute/image_cache.pp, line: 52, column: 5)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.42 seconds
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: Scope(Class[Tripleo::Profile::Base::Nova::Compute]): The keymgr_backend parameter has been deprecated
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: Scope(Class[Nova::Compute]): vcpu_pin_set is deprecated, instead use cpu_dedicated_set or cpu_shared_set.
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: Scope(Class[Nova::Compute]): verify_glance_signatures is deprecated. Use the same parameter in nova::glance
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/content: content changed '{sha256}aea388a73ebafc7e07a81ddb930a91099211f660eee55fbf92c13007a77501e5' to '{sha256}2523d01ee9c3022c0e9f61d896b1474a168e18472aee141cc278e69fe13f41c1'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/owner: owner changed 'collectd' to 'root'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/group: group changed 'collectd' to 'root'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[collectd.conf]/mode: mode changed '0644' to '0640'
Feb 01 07:53:41 np0005604215.localdomain systemd[1]: libpod-e1a6e99a8ee469635366e7a4cea70fd0827e6234b5619dd2b41de273d6a79219.scope: Deactivated successfully.
Feb 01 07:53:41 np0005604215.localdomain systemd[1]: libpod-e1a6e99a8ee469635366e7a4cea70fd0827e6234b5619dd2b41de273d6a79219.scope: Consumed 2.095s CPU time.
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/owner: owner changed 'collectd' to 'root'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52005]: Warning: Scope(Class[Nova::Compute::Libvirt]): nova::compute::libvirt::images_type will be required if rbd ephemeral storage is used.
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/group: group changed 'collectd' to 'root'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[collectd.d]/mode: mode changed '0755' to '0750'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-cpu.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-interface.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-load.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-memory.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/90-default-plugins-syslog.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/apache.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/dns.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ipmi.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mcelog.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/mysql.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-events.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ovs-stats.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/ping.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/pmu.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/rdt.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/sensors.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/snmp.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Config/File[/etc/collectd.d/write_prometheus.conf]/ensure: removed
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Python/File[/usr/lib/python3.9/site-packages]/mode: mode changed '0755' to '0750'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Python/Collectd::Plugin[python]/File[python.load]/ensure: defined content as '{sha256}0163924a0099dd43fe39cb85e836df147fd2cfee8197dc6866d3c384539eb6ee'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Python/Concat[/etc/collectd.d/python-config.conf]/File[/etc/collectd.d/python-config.conf]/ensure: defined content as '{sha256}2e5fb20e60b30f84687fc456a37fc62451000d2d85f5bbc1b3fca3a5eac9deeb'
Feb 01 07:53:41 np0005604215.localdomain systemd[1]: libpod-11651b540d5228a8dc09a1dd29082efa8101b78da1dee1533bd26f2650e9cd27.scope: Deactivated successfully.
Feb 01 07:53:41 np0005604215.localdomain systemd[1]: libpod-11651b540d5228a8dc09a1dd29082efa8101b78da1dee1533bd26f2650e9cd27.scope: Consumed 2.230s CPU time.
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Logfile/Collectd::Plugin[logfile]/File[logfile.load]/ensure: defined content as '{sha256}07bbda08ef9b824089500bdc6ac5a86e7d1ef2ae3ed4ed423c0559fe6361e5af'
Feb 01 07:53:41 np0005604215.localdomain podman[52522]: 2026-02-01 07:53:41.980965194 +0000 UTC m=+0.061843926 container died e1a6e99a8ee469635366e7a4cea70fd0827e6234b5619dd2b41de273d6a79219 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, config_id=tripleo_puppet_step1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, container_name=container-puppet-crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., vcs-type=git)
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Amqp1/Collectd::Plugin[amqp1]/File[amqp1.load]/ensure: defined content as '{sha256}8dd3769945b86c38433504b97f7851a931eb3c94b667298d10a9796a3d020595'
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Ceph/Collectd::Plugin[ceph]/File[ceph.load]/ensure: defined content as '{sha256}c796abffda2e860875295b4fc11cc95c6032b4e13fa8fb128e839a305aa1676c'
Feb 01 07:53:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e1a6e99a8ee469635366e7a4cea70fd0827e6234b5619dd2b41de273d6a79219-userdata-shm.mount: Deactivated successfully.
Feb 01 07:53:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4fd9ea2ebfbeb4119560e74e5b0456fd618118c9f72a7ecf288a55a3e1a95413-merged.mount: Deactivated successfully.
Feb 01 07:53:41 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Cpu/Collectd::Plugin[cpu]/File[cpu.load]/ensure: defined content as '{sha256}67d4c8bf6bf5785f4cb6b596712204d9eacbcebbf16fe289907195d4d3cb0e34'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]: Notice: /Stage[main]/Tripleo::Profile::Base::Iscsid/Augeas[chap_algs in /etc/iscsi/iscsid.conf]/returns: executed successfully
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Df/Collectd::Plugin[df]/File[df.load]/ensure: defined content as '{sha256}edeb4716d96fc9dca2c6adfe07bae70ba08c6af3944a3900581cba0f08f3c4ba'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]: Notice: Applied catalog in 0.52 seconds
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]: Application:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:    Initial environment: production
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:    Converged environment: production
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:          Run mode: user
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]: Changes:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:             Total: 4
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]: Events:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:           Success: 4
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:             Total: 4
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]: Resources:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:           Changed: 4
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:       Out of sync: 4
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:           Skipped: 8
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:             Total: 13
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]: Time:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:              File: 0.00
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:              Exec: 0.05
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:    Config retrieval: 0.13
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:            Augeas: 0.45
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:    Transaction evaluation: 0.51
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:    Catalog application: 0.52
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:          Last run: 1769932422
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:             Total: 0.52
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]: Version:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:            Config: 1769932421
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52053]:            Puppet: 7.10.0
Feb 01 07:53:42 np0005604215.localdomain podman[51852]: 2026-02-01 07:53:42.025569894 +0000 UTC m=+3.743943516 container died 11651b540d5228a8dc09a1dd29082efa8101b78da1dee1533bd26f2650e9cd27 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_puppet_step1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., container_name=container-puppet-metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Disk/Collectd::Plugin[disk]/File[disk.load]/ensure: defined content as '{sha256}1d0cb838278f3226fcd381f0fc2e0e1abaf0d590f4ba7bcb2fc6ec113d3ebde7'
Feb 01 07:53:42 np0005604215.localdomain podman[52522]: 2026-02-01 07:53:42.034983278 +0000 UTC m=+0.115861910 container cleanup e1a6e99a8ee469635366e7a4cea70fd0827e6234b5619dd2b41de273d6a79219 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=container-puppet-crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=container-puppet-crond, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_puppet_step1, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 
'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']})
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: libpod-conmon-e1a6e99a8ee469635366e7a4cea70fd0827e6234b5619dd2b41de273d6a79219.scope: Deactivated successfully.
Feb 01 07:53:42 np0005604215.localdomain python3[51659]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-crond --conmon-pidfile /run/container-puppet-crond.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604215 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=crond --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::logrotate --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-crond --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'crond', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::logrotate'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-crond.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[hugepages.load]/ensure: defined content as '{sha256}9b9f35b65a73da8d4037e4355a23b678f2cf61997ccf7a5e1adf2a7ce6415827'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Hugepages/Collectd::Plugin[hugepages]/File[older_hugepages.load]/ensure: removed
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Interface/Collectd::Plugin[interface]/File[interface.load]/ensure: defined content as '{sha256}b76b315dc312e398940fe029c6dbc5c18d2b974ff7527469fc7d3617b5222046'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Load/Collectd::Plugin[load]/File[load.load]/ensure: defined content as '{sha256}af2403f76aebd2f10202d66d2d55e1a8d987eed09ced5a3e3873a4093585dc31'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Memory/Collectd::Plugin[memory]/File[memory.load]/ensure: defined content as '{sha256}0f270425ee6b05fc9440ee32b9afd1010dcbddd9b04ca78ff693858f7ecb9d0e'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Unixsock/Collectd::Plugin[unixsock]/File[unixsock.load]/ensure: defined content as '{sha256}9d1ec1c51ba386baa6f62d2e019dbd6998ad924bf868b3edc2d24d3dc3c63885'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Uptime/Collectd::Plugin[uptime]/File[uptime.load]/ensure: defined content as '{sha256}f7a26c6369f904d0ca1af59627ebea15f5e72160bcacdf08d217af282b42e5c0'
Feb 01 07:53:42 np0005604215.localdomain podman[52543]: 2026-02-01 07:53:42.098264053 +0000 UTC m=+0.111529481 container cleanup 11651b540d5228a8dc09a1dd29082efa8101b78da1dee1533bd26f2650e9cd27 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=container-puppet-metrics_qdr, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=container-puppet-metrics_qdr, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_puppet_step1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13)
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[virt.load]/ensure: defined content as '{sha256}9a2bcf913f6bf8a962a0ff351a9faea51ae863cc80af97b77f63f8ab68941c62'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: /Stage[main]/Collectd::Plugin::Virt/Collectd::Plugin[virt]/File[older_virt.load]/ensure: removed
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: libpod-conmon-11651b540d5228a8dc09a1dd29082efa8101b78da1dee1533bd26f2650e9cd27.scope: Deactivated successfully.
Feb 01 07:53:42 np0005604215.localdomain python3[51659]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-metrics_qdr --conmon-pidfile /run/container-puppet-metrics_qdr.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604215 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron --env NAME=metrics_qdr --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::qdr
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-metrics_qdr --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron', 'NAME': 'metrics_qdr', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::qdr\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-metrics_qdr.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Notice: Applied catalog in 0.31 seconds
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Application:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:    Initial environment: production
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:    Converged environment: production
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:          Run mode: user
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Changes:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:             Total: 43
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Events:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:           Success: 43
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:             Total: 43
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Resources:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:           Skipped: 14
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:           Changed: 38
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:       Out of sync: 38
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:             Total: 82
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Time:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:    Concat fragment: 0.00
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:       Concat file: 0.00
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:              File: 0.14
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:    Transaction evaluation: 0.30
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:    Catalog application: 0.31
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:    Config retrieval: 0.49
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:          Last run: 1769932422
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:             Total: 0.31
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]: Version:
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:            Config: 1769932421
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52007]:            Puppet: 7.10.0
Feb 01 07:53:42 np0005604215.localdomain sudo[52567]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:53:42 np0005604215.localdomain sudo[52567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:53:42 np0005604215.localdomain sudo[52567]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: libpod-1f16e0e1e2e3a2dd4b5e420bfd21338a28394069efeefa12bd8c15ddab6dbcb9.scope: Deactivated successfully.
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: libpod-1f16e0e1e2e3a2dd4b5e420bfd21338a28394069efeefa12bd8c15ddab6dbcb9.scope: Consumed 2.490s CPU time.
Feb 01 07:53:42 np0005604215.localdomain podman[51855]: 2026-02-01 07:53:42.299597393 +0000 UTC m=+4.009452441 container died 1f16e0e1e2e3a2dd4b5e420bfd21338a28394069efeefa12bd8c15ddab6dbcb9 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, container_name=container-puppet-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_id=tripleo_puppet_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1f16e0e1e2e3a2dd4b5e420bfd21338a28394069efeefa12bd8c15ddab6dbcb9-userdata-shm.mount: Deactivated successfully.
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-45fdf082ac490d270b07fd0f17cf89cd8bf1d13e0d604cb75e37ccaf54fab194-merged.mount: Deactivated successfully.
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11651b540d5228a8dc09a1dd29082efa8101b78da1dee1533bd26f2650e9cd27-userdata-shm.mount: Deactivated successfully.
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-311e95bb3bdbe1bb40730cbc80ffa3861fcafc1265b18b49a2e8169fc5d3cbf1-merged.mount: Deactivated successfully.
Feb 01 07:53:42 np0005604215.localdomain podman[52689]: 2026-02-01 07:53:42.402692022 +0000 UTC m=+0.091157938 container cleanup 1f16e0e1e2e3a2dd4b5e420bfd21338a28394069efeefa12bd8c15ddab6dbcb9 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=container-puppet-iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, distribution-scope=public, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, container_name=container-puppet-iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid)
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: libpod-conmon-1f16e0e1e2e3a2dd4b5e420bfd21338a28394069efeefa12bd8c15ddab6dbcb9.scope: Deactivated successfully.
Feb 01 07:53:42 np0005604215.localdomain python3[51659]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-iscsid --conmon-pidfile /run/container-puppet-iscsid.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604215 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,iscsid_config --env NAME=iscsid --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::iscsid
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-iscsid --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,iscsid_config', 'NAME': 'iscsid', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::iscsid\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/iscsi:/tmp/iscsi.host:z', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-iscsid.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/iscsi:/tmp/iscsi.host:z --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 01 07:53:42 np0005604215.localdomain podman[52722]: 2026-02-01 07:53:42.426260166 +0000 UTC m=+0.062725907 container create 734d51f4d1bb1d3b860c53442f7df52821e3416d9570482daf00b7891c713035 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, container_name=container-puppet-rsyslog, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:09Z, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_puppet_step1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: Started libpod-conmon-734d51f4d1bb1d3b860c53442f7df52821e3416d9570482daf00b7891c713035.scope.
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:53:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1783ac4e59af83bfa6c705cb913a4e3f5e5d835b34fd8ada82ce7a661d9e5a58/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: libpod-416498f422383064ad86a44268caa4d4714236e931b56a3198ace870ec017bf2.scope: Deactivated successfully.
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: libpod-416498f422383064ad86a44268caa4d4714236e931b56a3198ace870ec017bf2.scope: Consumed 2.744s CPU time.
Feb 01 07:53:42 np0005604215.localdomain podman[52722]: 2026-02-01 07:53:42.397516313 +0000 UTC m=+0.033982064 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 1.24 seconds
Feb 01 07:53:42 np0005604215.localdomain podman[52755]: 2026-02-01 07:53:42.469895091 +0000 UTC m=+0.046051370 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 01 07:53:42 np0005604215.localdomain podman[52722]: 2026-02-01 07:53:42.574860105 +0000 UTC m=+0.211325876 container init 734d51f4d1bb1d3b860c53442f7df52821e3416d9570482daf00b7891c713035 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, io.openshift.expose-services=, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, build-date=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=container-puppet-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510)
Feb 01 07:53:42 np0005604215.localdomain podman[52722]: 2026-02-01 07:53:42.583058919 +0000 UTC m=+0.219524660 container start 734d51f4d1bb1d3b860c53442f7df52821e3416d9570482daf00b7891c713035 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, build-date=2026-01-12T22:10:09Z, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=container-puppet-rsyslog, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, tcib_managed=true, 
url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:09Z, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 01 07:53:42 np0005604215.localdomain podman[52722]: 2026-02-01 07:53:42.5836757 +0000 UTC m=+0.220141511 container attach 734d51f4d1bb1d3b860c53442f7df52821e3416d9570482daf00b7891c713035 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, build-date=2026-01-12T22:10:09Z, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, container_name=container-puppet-rsyslog, version=17.1.13, name=rhosp-rhel9/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 07:53:42 np0005604215.localdomain podman[52755]: 2026-02-01 07:53:42.604040512 +0000 UTC m=+0.180196781 container create 5bd5b1296ba45171c0a35cb096a0e87dd874626ccc80ef771650178655270e5f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', 
'/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_puppet_step1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=container-puppet-ovn_controller, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 01 07:53:42 np0005604215.localdomain podman[51834]: 2026-02-01 07:53:42.606332812 +0000 UTC m=+4.352322368 container died 416498f422383064ad86a44268caa4d4714236e931b56a3198ace870ec017bf2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=container-puppet-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_puppet_step1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd)
Feb 01 07:53:42 np0005604215.localdomain podman[52809]: 2026-02-01 07:53:42.640543452 +0000 UTC m=+0.079417212 container cleanup 416498f422383064ad86a44268caa4d4714236e931b56a3198ace870ec017bf2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=container-puppet-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, container_name=container-puppet-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1)
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: libpod-conmon-416498f422383064ad86a44268caa4d4714236e931b56a3198ace870ec017bf2.scope: Deactivated successfully.
Feb 01 07:53:42 np0005604215.localdomain python3[51659]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-collectd --conmon-pidfile /run/container-puppet-collectd.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604215 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,collectd_client_config,exec --env NAME=collectd --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::metrics::collectd --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-collectd --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,collectd_client_config,exec', 'NAME': 'collectd', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::metrics::collectd'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-collectd.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: Started libpod-conmon-5bd5b1296ba45171c0a35cb096a0e87dd874626ccc80ef771650178655270e5f.scope.
Feb 01 07:53:42 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:53:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f62902336a91aed0e6d89cda1611500b3d6fe7b4bddf84b8ce31199c37cfaf6/merged/etc/sysconfig/modules supports timestamps until 2038 (0x7fffffff)
Feb 01 07:53:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5f62902336a91aed0e6d89cda1611500b3d6fe7b4bddf84b8ce31199c37cfaf6/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 01 07:53:42 np0005604215.localdomain podman[52755]: 2026-02-01 07:53:42.723248448 +0000 UTC m=+0.299404737 container init 5bd5b1296ba45171c0a35cb096a0e87dd874626ccc80ef771650178655270e5f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=container-puppet-ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, release=1766032510, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red 
Hat, Inc., vcs-type=git, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 07:53:42 np0005604215.localdomain podman[52755]: 2026-02-01 07:53:42.732784187 +0000 UTC m=+0.308940456 container start 5bd5b1296ba45171c0a35cb096a0e87dd874626ccc80ef771650178655270e5f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., container_name=container-puppet-ovn_controller, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, release=1766032510, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller)
Feb 01 07:53:42 np0005604215.localdomain podman[52755]: 2026-02-01 07:53:42.733956578 +0000 UTC m=+0.310112857 container attach 5bd5b1296ba45171c0a35cb096a0e87dd874626ccc80ef771650178655270e5f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_puppet_step1, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=container-puppet-ovn_controller)
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File[/etc/nova/migration/identity]/content: content changed '{sha256}86610d84e745a3992358ae0b747297805d075492e5114c666fa08f8aecce7da0' to '{sha256}41fd6c6f800884fc5582fcd6978c5fdf9efd895ea286512b024eb4dc5635dca8'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Client/File_line[nova_ssh_port]/ensure: created
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/File[/etc/sasl2/libvirt.conf]/content: content changed '{sha256}78510a0d6f14b269ddeb9f9638dfdfba9f976d370ee2ec04ba25352a8af6df35' to '{sha256}6d7bcae773217a30c0772f75d0d1b6d21f5d64e72853f5e3d91bb47799dbb7fe'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]: Warning: Empty environment setting 'TLS_PASSWORD'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]:    (file: /etc/puppet/modules/tripleo/manifests/profile/base/nova/libvirt.pp, line: 182)
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Libvirt/Exec[set libvirt sasl credentials]/returns: executed successfully
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File[/etc/nova/migration/authorized_keys]/content: content changed '{sha256}0d05a8832f36c0517b84e9c3ad11069d531c7d2be5297661e5552fd29e3a5e47' to '{sha256}ebc7fc3dcb9777cbffecb2db809cb7f56024c1a98bdd34554dbaaa8469bb0cdf'
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Tripleo::Profile::Base::Nova::Migration::Target/File_line[nova_migration_logindefs]/ensure: created
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/never_download_image_if_on_rbd]/ensure: created
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52157]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52157]:    (file: /etc/puppet/hiera.yaml)
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52157]: Warning: Undefined variable '::deploy_config_name';
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52157]:    (file & line not available)
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52157]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52157]:    (file & line not available)
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Workarounds/Nova_config[workarounds/disable_compute_service_check_for_ffu]/ensure: created
Feb 01 07:53:42 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ssl_only]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/my_ip]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/host]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/cpu_allocation_ratio]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/ram_allocation_ratio]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/disk_allocation_ratio]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::cache_backend'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 145, column: 39)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::memcache_servers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 146, column: 39)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::cache_tls_enabled'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 147, column: 39)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::cache_tls_cafile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 148, column: 39)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::cache_tls_certfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 149, column: 39)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::cache_tls_keyfile'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 150, column: 39)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::cache_tls_allowed_ciphers'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 151, column: 39)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::manage_backend_package'. (file: /etc/puppet/modules/ceilometer/manifests/cache.pp, line: 152, column: 39)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/dhcp_domain]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[vif_plug_ovs/ovsdb_connection]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[notifications/notification_format]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_password'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 63, column: 25)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_url'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 68, column: 25)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_region'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 69, column: 28)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 70, column: 25)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_tenant_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 71, column: 29)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_cacert'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 72, column: 23)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_endpoint_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 73, column: 26)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_user_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 74, column: 33)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_project_domain_name'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 75, column: 36)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Warning: Unknown variable: '::ceilometer::agent::auth::auth_type'. (file: /etc/puppet/modules/ceilometer/manifests/agent/service_credentials.pp, line: 76, column: 26)
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/state_path]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/service_down_time]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/rootwrap_config]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[DEFAULT/report_interval]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[notifications/notify_on_state_change]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Nova_config[cinder/cross_az_attach]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.38 seconds
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Glance/Nova_config[glance/valid_interfaces]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_type]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/auth_url]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/http_timeout]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[DEFAULT/host]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[publisher/telemetry_secret]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/password]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-193d63b6dd9579507d9f1518ccbcb97a99c18e05e53fbccdc25e375b68ff02d6-merged.mount: Deactivated successfully.
Feb 01 07:53:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-416498f422383064ad86a44268caa4d4714236e931b56a3198ace870ec017bf2-userdata-shm.mount: Deactivated successfully.
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_domain_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Ceilometer_config[hardware/readonly_user_password]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/project_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_url]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/region_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/user_domain_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/username]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/username]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/password]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/region_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Placement/Nova_config[placement/valid_interfaces]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/interface]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/user_domain_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/project_domain_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/password]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Service_credentials/Ceilometer_config[service_credentials/auth_type]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[compute/instance_discovery_method]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[DEFAULT/polling_namespaces]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_type]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[polling/tenant_name_discovery]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/auth_url]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Agent::Polling/Ceilometer_config[coordination/backend_url]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/region_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/backend]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/project_domain_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/enabled]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/username]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/memcache_servers]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/user_domain_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/os_region_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Cache/Oslo::Cache[ceilometer_config]/Ceilometer_config[cache/tls_enabled]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cinder/Nova_config[cinder/catalog_info]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/manager_interval]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Rabbit[ceilometer_config]/Ceilometer_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_base_images]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_original_minimum_age_seconds]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/remove_unused_resized_minimum_age_seconds]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Image_cache/Nova_config[image_cache/precache_concurrency]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/rpc_address_prefix]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Amqp[ceilometer_config]/Ceilometer_config[oslo_messaging_amqp/notify_address_prefix]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/project_domain_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Vendordata/Nova_config[vendordata_dynamic_auth/user_domain_name]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Provider/Nova_config[compute/provider_config_location]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Provider/File[/etc/nova/provider_config]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/driver]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Notifications[ceilometer_config]/Ceilometer_config[oslo_messaging_notifications/topics]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer/Oslo::Messaging::Default[ceilometer_config]/Ceilometer_config[DEFAULT/transport_url]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/debug]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: /Stage[main]/Ceilometer::Logging/Oslo::Log[ceilometer_config]/Ceilometer_config[DEFAULT/log_dir]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Notice: Applied catalog in 0.50 seconds
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Application:
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:    Initial environment: production
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:    Converged environment: production
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:          Run mode: user
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Changes:
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:             Total: 31
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Events:
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:           Success: 31
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:             Total: 31
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Resources:
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:           Skipped: 22
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:           Changed: 31
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:       Out of sync: 31
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:             Total: 151
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Time:
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:           Package: 0.03
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:    Ceilometer config: 0.39
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:    Config retrieval: 0.45
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:    Transaction evaluation: 0.49
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:    Catalog application: 0.50
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:          Last run: 1769932423
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:         Resources: 0.00
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:             Total: 0.50
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]: Version:
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:            Config: 1769932422
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52157]:            Puppet: 7.10.0
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/use_cow_images]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/mkisofs_cmd]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/force_raw_images]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_host_memory_mb]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/reserved_huge_pages]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/resume_guests_state_on_host_boot]/ensure: created
Feb 01 07:53:43 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[key_manager/backend]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/sync_power_state_interval]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/consecutive_build_service_disable_threshold]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/live_migration_wait_for_vif_plug]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[compute/max_disk_devices_to_attach]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Vncproxy::Common/Nova_config[vnc/novncproxy_base_url]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/server_proxyclient_address]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[vnc/enabled]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[spice/enabled]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute/Nova_config[DEFAULT/instance_usage_audit_period]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain systemd[1]: libpod-5254c7db874e6a04303dfb1bd6b2faa423e5c8db607a20f3689713c64f535b91.scope: Deactivated successfully.
Feb 01 07:53:44 np0005604215.localdomain systemd[1]: libpod-5254c7db874e6a04303dfb1bd6b2faa423e5c8db607a20f3689713c64f535b91.scope: Consumed 3.169s CPU time.
Feb 01 07:53:44 np0005604215.localdomain podman[52110]: 2026-02-01 07:53:44.20678463 +0000 UTC m=+3.666051715 container died 5254c7db874e6a04303dfb1bd6b2faa423e5c8db607a20f3689713c64f535b91 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:24Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:24Z, container_name=container-puppet-ceilometer, description=Red Hat OpenStack Platform 17.1 ceilometer-central, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-central, com.redhat.component=openstack-ceilometer-central-container, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, architecture=x86_64, config_id=tripleo_puppet_step1, vcs-type=git, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_is_fatal]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[DEFAULT/vif_plugging_timeout]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/default_floating_pool]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/timeout]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_name]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/project_domain_name]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/region_name]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/username]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain systemd[1]: tmp-crun.vEJE4z.mount: Deactivated successfully.
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/user_domain_name]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5254c7db874e6a04303dfb1bd6b2faa423e5c8db607a20f3689713c64f535b91-userdata-shm.mount: Deactivated successfully.
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/password]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_url]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/valid_interfaces]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain podman[53013]: 2026-02-01 07:53:44.332425478 +0000 UTC m=+0.116357198 container cleanup 5254c7db874e6a04303dfb1bd6b2faa423e5c8db607a20f3689713c64f535b91 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1, name=container-puppet-ceilometer, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-central, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:24Z, io.buildah.version=1.41.5, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 ceilometer-central, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:24Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-central-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-central, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-central, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-central, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, container_name=container-puppet-ceilometer, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-central, vendor=Red Hat, Inc.)
Feb 01 07:53:44 np0005604215.localdomain systemd[1]: libpod-conmon-5254c7db874e6a04303dfb1bd6b2faa423e5c8db607a20f3689713c64f535b91.scope: Deactivated successfully.
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/ovs_bridge]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain python3[51659]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ceilometer --conmon-pidfile /run/container-puppet-ceilometer.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604215 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config --env NAME=ceilometer --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                         include tripleo::profile::base::ceilometer::agent::polling
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ceilometer --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,ceilometer_config,ceilometer_config', 'NAME': 'ceilometer', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::ceilometer::agent::polling\ninclude tripleo::profile::base::ceilometer::agent::polling\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ceilometer.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ceilometer-central:17.1
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/extension_sync_interval]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Network::Neutron/Nova_config[neutron/auth_type]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-d8c1697d3f9451811eabeba845d2774ca9523a4c1f6255791f262d42dbea547b-merged.mount: Deactivated successfully.
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_uri]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_tunnelled]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_inbound_addr]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_post_copy]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Migration::Libvirt/Nova_config[libvirt/live_migration_permit_auto_converge]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tls]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Migration::Libvirt/Virtproxyd_config[listen_tcp]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_user]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/rbd_secret_uuid]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Rbd/File[/etc/nova/secret.xml]/ensure: defined content as '{sha256}f9d8b60f125f93c01d13e9bc67ee58f1fd06cc57ef5fbe63b5478e0790417593'
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_type]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]:    (file: /etc/puppet/hiera.yaml)
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]: Warning: Undefined variable '::deploy_config_name';
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]:    (file & line not available)
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_pool]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:    (file: /etc/puppet/hiera.yaml)
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Warning: Undefined variable '::deploy_config_name';
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:    (file & line not available)
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_ceph_conf]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_store_name]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]:    (file & line not available)
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_poll_interval]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:    (file & line not available)
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Rbd/Nova_config[libvirt/images_rbd_glance_copy_timeout]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/compute_driver]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[DEFAULT/preallocate_images]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[vnc/server_listen]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/virt_type]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_mode]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_password]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_key]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/inject_partition]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_disk_discard]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/hw_machine_type]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/enabled_perf_events]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/rx_queue_size]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/tx_queue_size]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.24 seconds
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.22 seconds
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/file_backed_memory]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/volume_use_multipath]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/num_pcie_ports]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/mem_stats_period_seconds]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/pmem_namespaces]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/swtpm_enabled]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain ovs-vsctl[53164]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote=tcp:172.17.0.103:6642,tcp:172.17.0.104:6642,tcp:172.17.0.105:6642
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain ovs-vsctl[53166]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-type=geneve
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-type]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/cpu_model_extra_flags]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Notice: /Stage[main]/Rsyslog::Base/File[/etc/rsyslog.conf]/content: content changed '{sha256}d6f679f6a4eb6f33f9fc20c846cb30bef93811e1c86bc4da1946dc3100b826c3' to '{sha256}7963bd801fadd49a17561f4d3f80738c3f504b413b11c443432d8303138041f2'
Feb 01 07:53:44 np0005604215.localdomain ovs-vsctl[53168]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-ip=172.19.0.108
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt/Nova_config[libvirt/disk_cachemodes]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_filters]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-ip]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtlogd/Virtlogd_config[log_outputs]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Notice: /Stage[main]/Rsyslog::Config::Global/Rsyslog::Component::Global_config[MaxMessageSize]/Rsyslog::Generate_concat[rsyslog::concat::global_config::MaxMessageSize]/Concat[/etc/rsyslog.d/00_rsyslog.conf]/File[/etc/rsyslog.d/00_rsyslog.conf]/ensure: defined content as '{sha256}a291d5cc6d5884a978161f4c7b5831d43edd07797cc590bae366e7f150b8643b'
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_filters]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtproxyd/Virtproxyd_config[log_outputs]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Notice: /Stage[main]/Rsyslog::Config::Templates/Rsyslog::Component::Template[rsyslog-node-index]/Rsyslog::Generate_concat[rsyslog::concat::template::rsyslog-node-index]/Concat[/etc/rsyslog.d/50_openstack_logs.conf]/File[/etc/rsyslog.d/50_openstack_logs.conf]/ensure: defined content as '{sha256}d9ddd6486f0577337caa69e7107b3a4c217ac8a894483a5e6ed8bdfdb439e8bc'
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_filters]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtqemud/Virtqemud_config[log_outputs]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Notice: Applied catalog in 0.11 seconds
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Application:
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:    Initial environment: production
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:    Converged environment: production
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:          Run mode: user
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Changes:
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:             Total: 3
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Events:
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:           Success: 3
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:             Total: 3
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Resources:
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:           Skipped: 11
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:           Changed: 3
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:       Out of sync: 3
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:             Total: 25
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Time:
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:       Concat file: 0.00
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:    Concat fragment: 0.00
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:              File: 0.02
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:    Transaction evaluation: 0.10
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:    Catalog application: 0.11
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:    Config retrieval: 0.27
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:          Last run: 1769932424
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:             Total: 0.11
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]: Version:
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:            Config: 1769932424
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52835]:            Puppet: 7.10.0
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_filters]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtnodedevd/Virtnodedevd_config[log_outputs]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_filters]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain ovs-vsctl[53171]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:hostname=np0005604215.localdomain
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtstoraged/Virtstoraged_config[log_outputs]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_filters]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:hostname]/value: value changed 'np0005604215.novalocal' to 'np0005604215.localdomain'
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Virtsecretd/Virtsecretd_config[log_outputs]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_group]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_ro]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[auth_unix_rw]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_ro_perms]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtnodedevd_config[unix_sock_rw_perms]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_group]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_ro]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[auth_unix_rw]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_ro_perms]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtproxyd_config[unix_sock_rw_perms]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_group]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_ro]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[auth_unix_rw]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_ro_perms]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtqemud_config[unix_sock_rw_perms]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_group]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain ovs-vsctl[53173]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge=br-int
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_ro]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[auth_unix_rw]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_ro_perms]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtsecretd_config[unix_sock_rw_perms]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_group]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_ro]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[auth_unix_rw]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_ro_perms]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Config/Virtstoraged_config[unix_sock_rw_perms]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain ovs-vsctl[53179]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-remote-probe-interval=60000
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-remote-probe-interval]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain ovs-vsctl[53183]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-openflow-probe-interval=60
Feb 01 07:53:44 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-openflow-probe-interval]/ensure: created
Feb 01 07:53:44 np0005604215.localdomain ovs-vsctl[53187]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-monitor-all=true
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-monitor-all]/ensure: created
Feb 01 07:53:45 np0005604215.localdomain ovs-vsctl[53190]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-ofctrl-wait-before-clear=8000
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-ofctrl-wait-before-clear]/ensure: created
Feb 01 07:53:45 np0005604215.localdomain ovs-vsctl[53192]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-encap-tos=0
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-encap-tos]/ensure: created
Feb 01 07:53:45 np0005604215.localdomain ovs-vsctl[53195]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-chassis-mac-mappings=datacentre:fa:16:3e:44:83:a4
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-chassis-mac-mappings]/ensure: created
Feb 01 07:53:45 np0005604215.localdomain ovs-vsctl[53208]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-bridge-mappings=datacentre:br-ex
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-bridge-mappings]/ensure: created
Feb 01 07:53:45 np0005604215.localdomain ovs-vsctl[53212]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:ovn-match-northd-version=false
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:ovn-match-northd-version]/ensure: created
Feb 01 07:53:45 np0005604215.localdomain ovs-vsctl[53219]: ovs|00001|vsctl|INFO|Called as /usr/bin/ovs-vsctl set Open_vSwitch . external_ids:garp-max-timeout-sec=0
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Notice: /Stage[main]/Ovn::Controller/Vs_config[external_ids:garp-max-timeout-sec]/ensure: created
Feb 01 07:53:45 np0005604215.localdomain systemd[1]: libpod-734d51f4d1bb1d3b860c53442f7df52821e3416d9570482daf00b7891c713035.scope: Deactivated successfully.
Feb 01 07:53:45 np0005604215.localdomain systemd[1]: libpod-734d51f4d1bb1d3b860c53442f7df52821e3416d9570482daf00b7891c713035.scope: Consumed 2.517s CPU time.
Feb 01 07:53:45 np0005604215.localdomain podman[52722]: 2026-02-01 07:53:45.200161023 +0000 UTC m=+2.836626794 container died 734d51f4d1bb1d3b860c53442f7df52821e3416d9570482daf00b7891c713035 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_puppet_step1, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, release=1766032510, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=container-puppet-rsyslog, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:09Z, version=17.1.13, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-rsyslog, distribution-scope=public)
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Notice: Applied catalog in 0.50 seconds
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Application:
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:    Initial environment: production
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:    Converged environment: production
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:          Run mode: user
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Changes:
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:             Total: 14
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Events:
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:           Success: 14
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:             Total: 14
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Resources:
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:           Skipped: 12
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:           Changed: 14
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:       Out of sync: 14
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:             Total: 29
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Time:
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:              Exec: 0.02
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:    Config retrieval: 0.27
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:         Vs config: 0.44
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:    Transaction evaluation: 0.49
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:    Catalog application: 0.50
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:          Last run: 1769932425
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:             Total: 0.50
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]: Version:
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:            Config: 1769932424
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52892]:            Puppet: 7.10.0
Feb 01 07:53:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-734d51f4d1bb1d3b860c53442f7df52821e3416d9570482daf00b7891c713035-userdata-shm.mount: Deactivated successfully.
Feb 01 07:53:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1783ac4e59af83bfa6c705cb913a4e3f5e5d835b34fd8ada82ce7a661d9e5a58-merged.mount: Deactivated successfully.
Feb 01 07:53:45 np0005604215.localdomain podman[53227]: 2026-02-01 07:53:45.306070179 +0000 UTC m=+0.092797434 container cleanup 734d51f4d1bb1d3b860c53442f7df52821e3416d9570482daf00b7891c713035 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=container-puppet-rsyslog, config_id=tripleo_puppet_step1, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, name=rhosp-rhel9/openstack-rsyslog, io.k8s.display-name=Red 
Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:09Z, com.redhat.component=openstack-rsyslog-container, container_name=container-puppet-rsyslog, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 01 07:53:45 np0005604215.localdomain systemd[1]: libpod-conmon-734d51f4d1bb1d3b860c53442f7df52821e3416d9570482daf00b7891c713035.scope: Deactivated successfully.
Feb 01 07:53:45 np0005604215.localdomain python3[51659]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-rsyslog --conmon-pidfile /run/container-puppet-rsyslog.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604215 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment --env NAME=rsyslog --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::logging::rsyslog --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-rsyslog --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,rsyslog::generate_concat,concat::fragment', 'NAME': 'rsyslog', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::logging::rsyslog'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-rsyslog.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Compute::Libvirt::Qemu/Augeas[qemu-conf-limits]/returns: executed successfully
Feb 01 07:53:45 np0005604215.localdomain systemd[1]: libpod-5bd5b1296ba45171c0a35cb096a0e87dd874626ccc80ef771650178655270e5f.scope: Deactivated successfully.
Feb 01 07:53:45 np0005604215.localdomain systemd[1]: libpod-5bd5b1296ba45171c0a35cb096a0e87dd874626ccc80ef771650178655270e5f.scope: Consumed 2.914s CPU time.
Feb 01 07:53:45 np0005604215.localdomain podman[52755]: 2026-02-01 07:53:45.702715561 +0000 UTC m=+3.278871860 container died 5bd5b1296ba45171c0a35cb096a0e87dd874626ccc80ef771650178655270e5f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_puppet_step1, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, architecture=x86_64, container_name=container-puppet-ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true)
Feb 01 07:53:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-5f62902336a91aed0e6d89cda1611500b3d6fe7b4bddf84b8ce31199c37cfaf6-merged.mount: Deactivated successfully.
Feb 01 07:53:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5bd5b1296ba45171c0a35cb096a0e87dd874626ccc80ef771650178655270e5f-userdata-shm.mount: Deactivated successfully.
Feb 01 07:53:45 np0005604215.localdomain podman[53300]: 2026-02-01 07:53:45.82373732 +0000 UTC m=+0.114620518 container cleanup 5bd5b1296ba45171c0a35cb096a0e87dd874626ccc80ef771650178655270e5f (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=container-puppet-ovn_controller, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, version=17.1.13, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_puppet_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=container-puppet-ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1)
Feb 01 07:53:45 np0005604215.localdomain systemd[1]: libpod-conmon-5bd5b1296ba45171c0a35cb096a0e87dd874626ccc80ef771650178655270e5f.scope: Deactivated successfully.
Feb 01 07:53:45 np0005604215.localdomain python3[51659]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-ovn_controller --conmon-pidfile /run/container-puppet-ovn_controller.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604215 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,vs_config,exec --env NAME=ovn_controller --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::agents::ovn
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-ovn_controller --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,vs_config,exec', 'NAME': 'ovn_controller', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::agents::ovn\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/etc/sysconfig/modules:/etc/sysconfig/modules', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-ovn_controller.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /etc/sysconfig/modules:/etc/sysconfig/modules --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Migration::Qemu/Augeas[qemu-conf-migration-ports]/returns: executed successfully
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/debug]/ensure: created
Feb 01 07:53:45 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Logging/Oslo::Log[nova_config]/Nova_config[DEFAULT/log_dir]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/backend]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/enabled]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/memcache_servers]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Cache/Oslo::Cache[nova_config]/Nova_config[cache/tls_enabled]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Oslo::Messaging::Rabbit[nova_config]/Nova_config[oslo_messaging_rabbit/ssl]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Oslo::Messaging::Default[nova_config]/Nova_config[DEFAULT/transport_url]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/driver]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Oslo::Messaging::Notifications[nova_config]/Nova_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova/Oslo::Concurrency[nova_config]/Nova_config[oslo_concurrency/lock_path]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_type]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/region_name]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/auth_url]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/username]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/password]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/user_domain_name]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_name]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/project_domain_name]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Nova::Keystone::Service_user/Keystone::Resource::Service_user[nova_config]/Nova_config[service_user/send_service_user_token]/ensure: created
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: /Stage[main]/Ssh::Server::Config/Concat[/etc/ssh/sshd_config]/File[/etc/ssh/sshd_config]/ensure: defined content as '{sha256}66a7ab6cc1a19ea5002a5aaa2cfb2f196778c89c859d0afac926fe3fac9c75a4'
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Notice: Applied catalog in 4.08 seconds
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Application:
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Initial environment: production
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Converged environment: production
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:          Run mode: user
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Changes:
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:             Total: 183
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Events:
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:           Success: 183
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:             Total: 183
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Resources:
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:           Changed: 183
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:       Out of sync: 183
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:           Skipped: 57
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:             Total: 487
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Time:
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:       Concat file: 0.00
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Concat fragment: 0.00
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:            Anchor: 0.00
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:         File line: 0.00
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Virtlogd config: 0.00
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Virtnodedevd config: 0.01
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Virtsecretd config: 0.01
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Virtqemud config: 0.01
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:              Exec: 0.01
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Virtstoraged config: 0.01
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:           Package: 0.02
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:              File: 0.02
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Virtproxyd config: 0.03
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:            Augeas: 0.91
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Config retrieval: 1.48
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:          Last run: 1769932426
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:       Nova config: 2.84
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Transaction evaluation: 4.07
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:    Catalog application: 4.08
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:         Resources: 0.00
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:             Total: 4.08
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]: Version:
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:            Config: 1769932421
Feb 01 07:53:46 np0005604215.localdomain puppet-user[52005]:            Puppet: 7.10.0
Feb 01 07:53:47 np0005604215.localdomain systemd[1]: libpod-a702590bfcf1f706e901ffe0737cf048f7386190369052664149dd19c5bfbecb.scope: Deactivated successfully.
Feb 01 07:53:47 np0005604215.localdomain systemd[1]: libpod-a702590bfcf1f706e901ffe0737cf048f7386190369052664149dd19c5bfbecb.scope: Consumed 8.111s CPU time.
Feb 01 07:53:47 np0005604215.localdomain podman[51836]: 2026-02-01 07:53:47.692069905 +0000 UTC m=+9.430569352 container died a702590bfcf1f706e901ffe0737cf048f7386190369052664149dd19c5bfbecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, version=17.1.13, config_id=tripleo_puppet_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': 
['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, container_name=container-puppet-nova_libvirt, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 01 07:53:47 np0005604215.localdomain systemd[1]: tmp-crun.JbqPYh.mount: Deactivated successfully.
Feb 01 07:53:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a702590bfcf1f706e901ffe0737cf048f7386190369052664149dd19c5bfbecb-userdata-shm.mount: Deactivated successfully.
Feb 01 07:53:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-892d1779a7f946097f73616f672cd69c2781ff491e090964134e591e5adb1a86-merged.mount: Deactivated successfully.
Feb 01 07:53:47 np0005604215.localdomain podman[53372]: 2026-02-01 07:53:47.819617399 +0000 UTC m=+0.117114975 container cleanup a702590bfcf1f706e901ffe0737cf048f7386190369052664149dd19c5bfbecb (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=container-puppet-nova_libvirt, container_name=container-puppet-nova_libvirt, distribution-scope=public, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_puppet_step1, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 01 07:53:47 np0005604215.localdomain systemd[1]: libpod-conmon-a702590bfcf1f706e901ffe0737cf048f7386190369052664149dd19c5bfbecb.scope: Deactivated successfully.
Feb 01 07:53:47 np0005604215.localdomain python3[51659]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-nova_libvirt --conmon-pidfile /run/container-puppet-nova_libvirt.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604215 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password --env NAME=nova_libvirt --env STEP_CONFIG=include ::tripleo::packages
                                                         # TODO(emilien): figure how to deal with libvirt profile.
                                                         # We'll probably treat it like we do with Neutron plugins.
                                                         # Until then, just include it in the default nova-compute role.
                                                         include tripleo::profile::base::nova::compute::libvirt
                                                         
                                                         include tripleo::profile::base::nova::libvirt
                                                         
                                                         include tripleo::profile::base::nova::compute::libvirt_guests
                                                         
                                                         include tripleo::profile::base::sshd
                                                         include tripleo::profile::base::nova::migration::target --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-nova_libvirt --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,nova_config,libvirtd_config,virtlogd_config,virtproxyd_config,virtqemud_config,virtnodedevd_config,virtsecretd_config,virtstoraged_config,nova_config,file,libvirt_tls_password,libvirtd_config,nova_config,file,libvirt_tls_password', 'NAME': 'nova_libvirt', 'STEP_CONFIG': "include ::tripleo::packages\n# TODO(emilien): figure how to deal with libvirt profile.\n# We'll probably treat it like we do with Neutron plugins.\n# Until then, just include it in the default nova-compute role.\ninclude tripleo::profile::base::nova::compute::libvirt\n\ninclude tripleo::profile::base::nova::libvirt\n\ninclude tripleo::profile::base::nova::compute::libvirt_guests\n\ninclude tripleo::profile::base::sshd\ninclude tripleo::profile::base::nova::migration::target"}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', 
'/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-nova_libvirt.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 07:53:52 np0005604215.localdomain podman[52893]: 2026-02-01 07:53:42.814258019 +0000 UTC m=+0.034796972 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 01 07:53:52 np0005604215.localdomain podman[53472]: 2026-02-01 07:53:52.667882905 +0000 UTC m=+0.102146038 container create 5031d7afe8cffd929b5248a1201872d9cbbd95beef630a2d68870efd023aaa19 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, build-date=2026-01-12T22:57:35Z, container_name=container-puppet-neutron, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:57:35Z, vcs-type=git, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-server, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-server, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-neutron-server-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_puppet_step1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-server, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 07:53:52 np0005604215.localdomain podman[53472]: 2026-02-01 07:53:52.603368397 +0000 UTC m=+0.037631560 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 01 07:53:52 np0005604215.localdomain systemd[1]: Started libpod-conmon-5031d7afe8cffd929b5248a1201872d9cbbd95beef630a2d68870efd023aaa19.scope.
Feb 01 07:53:52 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:53:52 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ef459e28ad8635c7a92e994211ce7b874f14e5a38aca9f947ab317c65716a008/merged/var/lib/config-data supports timestamps until 2038 (0x7fffffff)
Feb 01 07:53:52 np0005604215.localdomain podman[53472]: 2026-02-01 07:53:52.785884158 +0000 UTC m=+0.220147301 container init 5031d7afe8cffd929b5248a1201872d9cbbd95beef630a2d68870efd023aaa19 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, version=17.1.13, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-server, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, container_name=container-puppet-neutron, tcib_managed=true, summary=Red Hat 
OpenStack Platform 17.1 neutron-server, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_puppet_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T22:57:35Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-server-container, build-date=2026-01-12T22:57:35Z, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20260112.1)
Feb 01 07:53:52 np0005604215.localdomain podman[53472]: 2026-02-01 07:53:52.833985188 +0000 UTC m=+0.268248341 container start 5031d7afe8cffd929b5248a1201872d9cbbd95beef630a2d68870efd023aaa19 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, config_id=tripleo_puppet_step1, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:57:35Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=container-puppet-neutron, vendor=Red Hat, Inc., tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', 
'/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-neutron-server-container, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:57:35Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-server)
Feb 01 07:53:52 np0005604215.localdomain podman[53472]: 2026-02-01 07:53:52.83490662 +0000 UTC m=+0.269169773 container attach 5031d7afe8cffd929b5248a1201872d9cbbd95beef630a2d68870efd023aaa19 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-server, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-server, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:57:35Z, io.openshift.expose-services=, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, build-date=2026-01-12T22:57:35Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, container_name=container-puppet-neutron, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-server, vcs-type=git, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-server-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_puppet_step1)
Feb 01 07:53:54 np0005604215.localdomain puppet-user[53502]: Error: Facter: error while resolving custom fact "haproxy_version": undefined method `strip' for nil:NilClass
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]:    (file: /etc/puppet/hiera.yaml)
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Warning: Undefined variable '::deploy_config_name';
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]:    (file & line not available)
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]:    (file & line not available)
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Warning: Unknown variable: 'dhcp_agents_per_net'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/neutron.pp, line: 154, column: 37)
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.61 seconds
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/auth_strategy]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/core_plugin]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/host]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dns_domain]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/dhcp_agent_notification]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/allow_overlapping_ips]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/global_physnet_mtu]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/vlan_transparent]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Neutron_config[agent/root_helper]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Neutron_config[agent/report_interval]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Neutron_config[DEFAULT/service_plugins]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/debug]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_host]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/nova_metadata_protocol]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_proxy_shared_secret]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/metadata_workers]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/state_path]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[DEFAULT/hwol_qos_enabled]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[agent/root_helper]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovs/ovsdb_connection_timeout]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovsdb_probe_interval]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_nb_connection]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Agents::Ovn_metadata/Ovn_metadata_agent_config[ovn/ovn_sb_connection]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/transport_url]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Default[neutron_config]/Neutron_config[DEFAULT/control_exchange]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Oslo::Concurrency[neutron_config]/Neutron_config[oslo_concurrency/lock_path]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/driver]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Notifications[neutron_config]/Neutron_config[oslo_messaging_notifications/transport_url]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_in_pthread]/ensure: created
Feb 01 07:53:55 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron/Oslo::Messaging::Rabbit[neutron_config]/Neutron_config[oslo_messaging_rabbit/heartbeat_timeout_threshold]/ensure: created
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/debug]/ensure: created
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]: Notice: /Stage[main]/Neutron::Logging/Oslo::Log[neutron_config]/Neutron_config[DEFAULT/log_dir]/ensure: created
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]: Notice: Applied catalog in 0.51 seconds
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]: Application:
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:    Initial environment: production
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:    Converged environment: production
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:          Run mode: user
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]: Changes:
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:             Total: 33
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]: Events:
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:           Success: 33
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:             Total: 33
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]: Resources:
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:           Skipped: 21
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:           Changed: 33
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:       Out of sync: 33
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:             Total: 155
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]: Time:
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:         Resources: 0.00
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:    Ovn metadata agent config: 0.01
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:    Neutron config: 0.44
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:    Transaction evaluation: 0.50
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:    Catalog application: 0.51
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:    Config retrieval: 0.67
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:          Last run: 1769932436
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:             Total: 0.51
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]: Version:
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:            Config: 1769932435
Feb 01 07:53:56 np0005604215.localdomain puppet-user[53502]:            Puppet: 7.10.0
Feb 01 07:53:56 np0005604215.localdomain systemd[1]: libpod-5031d7afe8cffd929b5248a1201872d9cbbd95beef630a2d68870efd023aaa19.scope: Deactivated successfully.
Feb 01 07:53:56 np0005604215.localdomain systemd[1]: libpod-5031d7afe8cffd929b5248a1201872d9cbbd95beef630a2d68870efd023aaa19.scope: Consumed 3.668s CPU time.
Feb 01 07:53:56 np0005604215.localdomain podman[53472]: 2026-02-01 07:53:56.737535641 +0000 UTC m=+4.171798814 container died 5031d7afe8cffd929b5248a1201872d9cbbd95beef630a2d68870efd023aaa19 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, description=Red Hat OpenStack Platform 17.1 neutron-server, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:57:35Z, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:57:35Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=container-puppet-neutron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-server, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_id=tripleo_puppet_step1, name=rhosp-rhel9/openstack-neutron-server, io.buildah.version=1.41.5, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 
'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, com.redhat.component=openstack-neutron-server-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, vcs-type=git, version=17.1.13)
Feb 01 07:53:56 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5031d7afe8cffd929b5248a1201872d9cbbd95beef630a2d68870efd023aaa19-userdata-shm.mount: Deactivated successfully.
Feb 01 07:53:56 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ef459e28ad8635c7a92e994211ce7b874f14e5a38aca9f947ab317c65716a008-merged.mount: Deactivated successfully.
Feb 01 07:53:56 np0005604215.localdomain podman[53614]: 2026-02-01 07:53:56.884187094 +0000 UTC m=+0.137782357 container cleanup 5031d7afe8cffd929b5248a1201872d9cbbd95beef630a2d68870efd023aaa19 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1, name=container-puppet-neutron, release=1766032510, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-server, config_id=tripleo_puppet_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-server, summary=Red Hat OpenStack Platform 17.1 neutron-server, architecture=x86_64, com.redhat.component=openstack-neutron-server-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-server, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-server, config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-server, org.opencontainers.image.created=2026-01-12T22:57:35Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=container-puppet-neutron, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:57:35Z, version=17.1.13)
Feb 01 07:53:56 np0005604215.localdomain systemd[1]: libpod-conmon-5031d7afe8cffd929b5248a1201872d9cbbd95beef630a2d68870efd023aaa19.scope: Deactivated successfully.
Feb 01 07:53:56 np0005604215.localdomain python3[51659]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name container-puppet-neutron --conmon-pidfile /run/container-puppet-neutron.pid --detach=False --entrypoint /var/lib/container-puppet/container-puppet.sh --env STEP=6 --env NET_HOST=true --env DEBUG=true --env HOSTNAME=np0005604215 --env NO_ARCHIVE= --env PUPPET_TAGS=file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config --env NAME=neutron --env STEP_CONFIG=include ::tripleo::packages
                                                         include tripleo::profile::base::neutron::ovn_metadata
                                                          --label config_id=tripleo_puppet_step1 --label container_name=container-puppet-neutron --label managed_by=tripleo_ansible --label config_data={'security_opt': ['label=disable'], 'user': 0, 'detach': False, 'recreate': True, 'entrypoint': '/var/lib/container-puppet/container-puppet.sh', 'environment': {'STEP': 6, 'NET_HOST': 'true', 'DEBUG': 'true', 'HOSTNAME': 'np0005604215', 'NO_ARCHIVE': '', 'PUPPET_TAGS': 'file,file_line,concat,augeas,cron,neutron_config,ovn_metadata_agent_config', 'NAME': 'neutron', 'STEP_CONFIG': 'include ::tripleo::packages\ninclude tripleo::profile::base::neutron::ovn_metadata\n'}, 'net': ['host'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1', 'volumes': ['/dev/log:/dev/log:rw', '/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/config-data:/var/lib/config-data:rw', '/var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro', '/var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro', '/var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/container-puppet-neutron.log --network host --security-opt label=disable --user 0 --volume /dev/log:/dev/log:rw --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro 
--volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/config-data:/var/lib/config-data:rw --volume /var/lib/container-puppet/container-puppet.sh:/var/lib/container-puppet/container-puppet.sh:ro --volume /var/lib/container-puppet/puppetlabs/facter.conf:/etc/puppetlabs/facter/facter.conf:ro --volume /var/lib/container-puppet/puppetlabs:/opt/puppetlabs:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-server:17.1
Feb 01 07:53:57 np0005604215.localdomain sudo[51657]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:57 np0005604215.localdomain sudo[53666]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbcqzqzvmqislcvztiaavbwmeovpmhpx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:53:57 np0005604215.localdomain sudo[53666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:57 np0005604215.localdomain python3[53668]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:53:57 np0005604215.localdomain sudo[53666]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:57 np0005604215.localdomain sudo[53682]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nelvgeiltznpgvhwnqgjlbldkflhvyap ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:53:57 np0005604215.localdomain sudo[53682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:58 np0005604215.localdomain sudo[53682]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:58 np0005604215.localdomain sudo[53698]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kogtbmqtijzuxzevwxqrpvgtsuremsta ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:53:58 np0005604215.localdomain sudo[53698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:58 np0005604215.localdomain python3[53700]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:53:58 np0005604215.localdomain sudo[53698]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:58 np0005604215.localdomain sudo[53748]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdtjuqddtxttactcsmdrujnnflyvljca ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:53:58 np0005604215.localdomain sudo[53748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:59 np0005604215.localdomain python3[53750]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:53:59 np0005604215.localdomain sudo[53748]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:59 np0005604215.localdomain sudo[53791]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iiifasumagrzflkhtyyualvqijyjyqer ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:53:59 np0005604215.localdomain sudo[53791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:59 np0005604215.localdomain python3[53793]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932438.858567-84168-192413286051373/source dest=/usr/libexec/tripleo-container-shutdown mode=0700 owner=root group=root _original_basename=tripleo-container-shutdown follow=False checksum=7d67b1986212f5548057505748cd74cfcf9c0d35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:53:59 np0005604215.localdomain sudo[53791]: pam_unix(sudo:session): session closed for user root
Feb 01 07:53:59 np0005604215.localdomain sudo[53853]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnopwnwwlmucyxjhrjvfjcpdngstezqi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:53:59 np0005604215.localdomain sudo[53853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:53:59 np0005604215.localdomain python3[53855]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:53:59 np0005604215.localdomain sudo[53853]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:00 np0005604215.localdomain sudo[53896]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuwuzybtuarzppsqqwlbfevnhfbllsdy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:00 np0005604215.localdomain sudo[53896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:00 np0005604215.localdomain python3[53898]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932439.686717-84168-123115628839968/source dest=/usr/libexec/tripleo-start-podman-container mode=0700 owner=root group=root _original_basename=tripleo-start-podman-container follow=False checksum=536965633b8d3b1ce794269ffb07be0105a560a0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:54:00 np0005604215.localdomain sudo[53896]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:00 np0005604215.localdomain sudo[53958]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drkhcxcavqajuzltmzcifhqdldghxcnp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:00 np0005604215.localdomain sudo[53958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:00 np0005604215.localdomain python3[53960]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:54:00 np0005604215.localdomain sudo[53958]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:01 np0005604215.localdomain sudo[54001]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onfklrliqcrtqjinlwplxhaqjxiujuba ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:01 np0005604215.localdomain sudo[54001]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:01 np0005604215.localdomain python3[54003]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932440.6059961-84228-207305042306292/source dest=/usr/lib/systemd/system/tripleo-container-shutdown.service mode=0644 owner=root group=root _original_basename=tripleo-container-shutdown-service follow=False checksum=66c1d41406ba8714feb9ed0a35259a7a57ef9707 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:54:01 np0005604215.localdomain sudo[54001]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:01 np0005604215.localdomain sudo[54063]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndhvhaoiltjnznkfjkvntquurovvglck ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:01 np0005604215.localdomain sudo[54063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:01 np0005604215.localdomain python3[54065]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:54:01 np0005604215.localdomain sudo[54063]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:01 np0005604215.localdomain sudo[54106]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hrvsizahlcubqdzwungucpafmmxmhcaj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:01 np0005604215.localdomain sudo[54106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:02 np0005604215.localdomain python3[54108]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932441.5017838-84259-12409701200441/source dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset mode=0644 owner=root group=root _original_basename=91-tripleo-container-shutdown-preset follow=False checksum=bccb1207dcbcfaa5ca05f83c8f36ce4c2460f081 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:54:02 np0005604215.localdomain sudo[54106]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:02 np0005604215.localdomain sudo[54136]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtxawhvnzymqyjlehrmofrnlhkxwismf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:02 np0005604215.localdomain sudo[54136]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:02 np0005604215.localdomain python3[54138]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:54:02 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:54:02 np0005604215.localdomain systemd-rc-local-generator[54159]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:54:02 np0005604215.localdomain systemd-sysv-generator[54164]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:54:02 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:54:03 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:54:03 np0005604215.localdomain systemd-rc-local-generator[54204]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:54:03 np0005604215.localdomain systemd-sysv-generator[54207]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:54:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:54:03 np0005604215.localdomain systemd[1]: Starting TripleO Container Shutdown...
Feb 01 07:54:03 np0005604215.localdomain systemd[1]: Finished TripleO Container Shutdown.
Feb 01 07:54:03 np0005604215.localdomain sudo[54136]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:03 np0005604215.localdomain sudo[54260]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-keunnvfmzkwikqovwggiznwvqjgaeimb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:03 np0005604215.localdomain sudo[54260]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:03 np0005604215.localdomain python3[54262]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:54:03 np0005604215.localdomain sudo[54260]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:03 np0005604215.localdomain sudo[54303]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrqhaxwkfkufwmcfodjcxrbvvlpeeflw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:03 np0005604215.localdomain sudo[54303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:04 np0005604215.localdomain python3[54305]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932443.450053-84304-257668070714577/source dest=/usr/lib/systemd/system/netns-placeholder.service mode=0644 owner=root group=root _original_basename=netns-placeholder-service follow=False checksum=8e9c6d5ce3a6e7f71c18780ec899f32f23de4c71 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:54:04 np0005604215.localdomain sudo[54303]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:04 np0005604215.localdomain sudo[54365]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvjynqddlkdjqrqmukyjyociubikbuld ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:04 np0005604215.localdomain sudo[54365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:04 np0005604215.localdomain python3[54367]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:54:04 np0005604215.localdomain sudo[54365]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:04 np0005604215.localdomain sudo[54408]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqbmtodxcbhnkzamdlsjmerbxstfnydj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:04 np0005604215.localdomain sudo[54408]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:04 np0005604215.localdomain python3[54410]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932444.2926428-84324-204530046180919/source dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset mode=0644 owner=root group=root _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:54:04 np0005604215.localdomain sudo[54408]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:05 np0005604215.localdomain sudo[54438]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wunlxuvdzchlmqcwngotirftjpxhdrmk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:05 np0005604215.localdomain sudo[54438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:05 np0005604215.localdomain python3[54440]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:54:05 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:54:05 np0005604215.localdomain systemd-rc-local-generator[54465]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:54:05 np0005604215.localdomain systemd-sysv-generator[54468]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:54:05 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:54:05 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:54:05 np0005604215.localdomain systemd-rc-local-generator[54504]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:54:05 np0005604215.localdomain systemd-sysv-generator[54507]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:54:05 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:54:06 np0005604215.localdomain systemd[1]: Starting Create netns directory...
Feb 01 07:54:06 np0005604215.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 01 07:54:06 np0005604215.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 01 07:54:06 np0005604215.localdomain systemd[1]: Finished Create netns directory.
Feb 01 07:54:06 np0005604215.localdomain sudo[54438]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:06 np0005604215.localdomain sudo[54532]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nwpxfrdohcgzgrfwnnfaybfausfnpivx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:06 np0005604215.localdomain sudo[54532]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for metrics_qdr, new hash: b8acc88e7150a91ea5eddde509e925f2
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for collectd, new hash: d31718fcd17fdeee6489534105191c7a
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for iscsid, new hash: 848fbaed99314033c0982eb0cffd8af7
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtlogd_wrapper, new hash: 1296029e90a465a2201c8dc6f8be17e7
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtnodedevd, new hash: 1296029e90a465a2201c8dc6f8be17e7
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtproxyd, new hash: 1296029e90a465a2201c8dc6f8be17e7
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtqemud, new hash: 1296029e90a465a2201c8dc6f8be17e7
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtsecretd, new hash: 1296029e90a465a2201c8dc6f8be17e7
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for nova_virtstoraged, new hash: 1296029e90a465a2201c8dc6f8be17e7
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for rsyslog, new hash: 52a7bad153b9a3530edb4c6869c1fe7c
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_compute, new hash: 63e53a2f3cd2422147592f2c2c6c2f61
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for ceilometer_agent_ipmi, new hash: 63e53a2f3cd2422147592f2c2c6c2f61
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for logrotate_crond, new hash: 53ed83bb0cae779ff95edb2002262c6f
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for nova_libvirt_init_secret, new hash: 1296029e90a465a2201c8dc6f8be17e7
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for nova_migration_target, new hash: 1296029e90a465a2201c8dc6f8be17e7
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for ovn_metadata_agent, new hash: 08ca8fb8877681656a098784127ead43
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for nova_compute, new hash: 848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7
Feb 01 07:54:06 np0005604215.localdomain python3[54534]: ansible-container_puppet_config [WARNING] Config change detected for nova_wait_for_compute_service, new hash: 1296029e90a465a2201c8dc6f8be17e7
Feb 01 07:54:06 np0005604215.localdomain sudo[54532]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:06 np0005604215.localdomain sudo[54548]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlosaeqmijglelkhcswqbuxpukwuqcmx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:06 np0005604215.localdomain sudo[54548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:07 np0005604215.localdomain sudo[54548]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:07 np0005604215.localdomain sudo[54591]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrwijxollfomtmlmbfiiuybgaozbuoty ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:07 np0005604215.localdomain sudo[54591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:07 np0005604215.localdomain python3[54593]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step1 config_dir=/var/lib/tripleo-config/container-startup-config/step_1 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 01 07:54:08 np0005604215.localdomain podman[54632]: 2026-02-01 07:54:08.205938049 +0000 UTC m=+0.074491652 container create b37b394a6b121871a6e7de0a6a690655cf4345f3120333fa8002d7236b9b6583 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr_init_logs, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 01 07:54:08 np0005604215.localdomain systemd[1]: Started libpod-conmon-b37b394a6b121871a6e7de0a6a690655cf4345f3120333fa8002d7236b9b6583.scope.
Feb 01 07:54:08 np0005604215.localdomain podman[54632]: 2026-02-01 07:54:08.163695731 +0000 UTC m=+0.032249374 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 01 07:54:08 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:54:08 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac18d148f1ccb0eaa519a008e32625aabf00d458250cb02e5015187c1942ecc7/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Feb 01 07:54:08 np0005604215.localdomain podman[54632]: 2026-02-01 07:54:08.284336895 +0000 UTC m=+0.152890479 container init b37b394a6b121871a6e7de0a6a690655cf4345f3120333fa8002d7236b9b6583 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.5, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr_init_logs, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 07:54:08 np0005604215.localdomain podman[54632]: 2026-02-01 07:54:08.293402358 +0000 UTC m=+0.161955941 container start b37b394a6b121871a6e7de0a6a690655cf4345f3120333fa8002d7236b9b6583 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=metrics_qdr_init_logs, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 07:54:08 np0005604215.localdomain podman[54632]: 2026-02-01 07:54:08.293543803 +0000 UTC m=+0.162097386 container attach b37b394a6b121871a6e7de0a6a690655cf4345f3120333fa8002d7236b9b6583 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, container_name=metrics_qdr_init_logs, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, distribution-scope=public, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1)
Feb 01 07:54:08 np0005604215.localdomain systemd[1]: libpod-b37b394a6b121871a6e7de0a6a690655cf4345f3120333fa8002d7236b9b6583.scope: Deactivated successfully.
Feb 01 07:54:08 np0005604215.localdomain podman[54632]: 2026-02-01 07:54:08.304566874 +0000 UTC m=+0.173120477 container died b37b394a6b121871a6e7de0a6a690655cf4345f3120333fa8002d7236b9b6583 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, tcib_managed=true, release=1766032510, config_id=tripleo_step1, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container)
Feb 01 07:54:08 np0005604215.localdomain podman[54653]: 2026-02-01 07:54:08.387412904 +0000 UTC m=+0.067633906 container cleanup b37b394a6b121871a6e7de0a6a690655cf4345f3120333fa8002d7236b9b6583 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr_init_logs, container_name=metrics_qdr_init_logs, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible)
Feb 01 07:54:08 np0005604215.localdomain systemd[1]: libpod-conmon-b37b394a6b121871a6e7de0a6a690655cf4345f3120333fa8002d7236b9b6583.scope: Deactivated successfully.
Feb 01 07:54:08 np0005604215.localdomain python3[54593]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr_init_logs --conmon-pidfile /run/metrics_qdr_init_logs.pid --detach=False --label config_id=tripleo_step1 --label container_name=metrics_qdr_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R qdrouterd:qdrouterd /var/log/qdrouterd'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'none', 'privileged': False, 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr_init_logs.log --network none --privileged=False --user root --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1 /bin/bash -c chown -R qdrouterd:qdrouterd /var/log/qdrouterd
Feb 01 07:54:08 np0005604215.localdomain podman[54727]: 2026-02-01 07:54:08.870763919 +0000 UTC m=+0.089395646 container create 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step1, version=17.1.13, managed_by=tripleo_ansible)
Feb 01 07:54:08 np0005604215.localdomain systemd[1]: Started libpod-conmon-75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.scope.
Feb 01 07:54:08 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 07:54:08 np0005604215.localdomain podman[54727]: 2026-02-01 07:54:08.827162705 +0000 UTC m=+0.045794472 image pull  registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 01 07:54:08 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f747231ffc56e15c128dac75ec633f161eee676530b28d17cb7b8d0be7728054/merged/var/log/qdrouterd supports timestamps until 2038 (0x7fffffff)
Feb 01 07:54:08 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f747231ffc56e15c128dac75ec633f161eee676530b28d17cb7b8d0be7728054/merged/var/lib/qdrouterd supports timestamps until 2038 (0x7fffffff)
Feb 01 07:54:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:54:08 np0005604215.localdomain podman[54727]: 2026-02-01 07:54:08.961808853 +0000 UTC m=+0.180440620 container init 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, container_name=metrics_qdr, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, url=https://www.redhat.com, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 07:54:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:54:08 np0005604215.localdomain sudo[54748]: qdrouterd : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 07:54:08 np0005604215.localdomain podman[54727]: 2026-02-01 07:54:08.989774818 +0000 UTC m=+0.208406535 container start 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, distribution-scope=public, 
config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container)
Feb 01 07:54:08 np0005604215.localdomain sudo[54748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42465)
Feb 01 07:54:08 np0005604215.localdomain python3[54593]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name metrics_qdr --conmon-pidfile /run/metrics_qdr.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=b8acc88e7150a91ea5eddde509e925f2 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step1 --label container_name=metrics_qdr --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/metrics_qdr.log --network host --privileged=False --user qdrouterd --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro --volume /var/lib/metrics_qdr:/var/lib/qdrouterd:z --volume /var/log/containers/metrics_qdr:/var/log/qdrouterd:z registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1
Feb 01 07:54:09 np0005604215.localdomain sudo[54748]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:09 np0005604215.localdomain podman[54749]: 2026-02-01 07:54:09.096087748 +0000 UTC m=+0.096809153 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=starting, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, version=17.1.13, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z)
Feb 01 07:54:09 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ac18d148f1ccb0eaa519a008e32625aabf00d458250cb02e5015187c1942ecc7-merged.mount: Deactivated successfully.
Feb 01 07:54:09 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b37b394a6b121871a6e7de0a6a690655cf4345f3120333fa8002d7236b9b6583-userdata-shm.mount: Deactivated successfully.
Feb 01 07:54:09 np0005604215.localdomain sudo[54591]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:09 np0005604215.localdomain podman[54749]: 2026-02-01 07:54:09.314808148 +0000 UTC m=+0.315529543 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z)
Feb 01 07:54:09 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:54:09 np0005604215.localdomain sudo[54818]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-thmxongwxekmqpddxkclhjhljoqlucvg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:09 np0005604215.localdomain sudo[54818]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:09 np0005604215.localdomain python3[54820]: ansible-file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:54:09 np0005604215.localdomain sudo[54818]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:09 np0005604215.localdomain sudo[54834]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmplhmjrifvuqbjsgweutrzgrcljbdcy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:09 np0005604215.localdomain sudo[54834]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:09 np0005604215.localdomain python3[54836]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_metrics_qdr_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 07:54:09 np0005604215.localdomain sudo[54834]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:10 np0005604215.localdomain sudo[54895]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zswhyuxjgqyidxiwudcxxxpjhrorfrkg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:10 np0005604215.localdomain sudo[54895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:10 np0005604215.localdomain python3[54897]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932449.9111855-84498-102374918916503/source dest=/etc/systemd/system/tripleo_metrics_qdr.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:54:10 np0005604215.localdomain sudo[54895]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:10 np0005604215.localdomain sudo[54911]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgkkbvwbdscuqmyjstvqltxzmnzeuvhq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:10 np0005604215.localdomain sudo[54911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:10 np0005604215.localdomain python3[54913]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 07:54:10 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:54:10 np0005604215.localdomain systemd-rc-local-generator[54938]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:54:10 np0005604215.localdomain systemd-sysv-generator[54941]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:54:10 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:54:11 np0005604215.localdomain sudo[54911]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:11 np0005604215.localdomain sudo[54963]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejiemsrxcduyqhwyixgegboupyyrqqar ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 07:54:11 np0005604215.localdomain sudo[54963]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:11 np0005604215.localdomain python3[54965]: ansible-systemd Invoked with state=restarted name=tripleo_metrics_qdr.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 07:54:11 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 07:54:11 np0005604215.localdomain systemd-rc-local-generator[54991]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 07:54:11 np0005604215.localdomain systemd-sysv-generator[54994]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 07:54:11 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 07:54:11 np0005604215.localdomain systemd[1]: Starting metrics_qdr container...
Feb 01 07:54:12 np0005604215.localdomain systemd[1]: Started metrics_qdr container.
Feb 01 07:54:12 np0005604215.localdomain sudo[54963]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:12 np0005604215.localdomain sudo[55043]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wawtohwvpnjvbamcfbnicacqbpvrjuqm ; /usr/bin/python3
Feb 01 07:54:12 np0005604215.localdomain sudo[55043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:12 np0005604215.localdomain python3[55045]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks1.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:54:12 np0005604215.localdomain sudo[55043]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:12 np0005604215.localdomain sudo[55091]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvfoccmnilnrwzabpnpkzspdoixfuate ; /usr/bin/python3
Feb 01 07:54:12 np0005604215.localdomain sudo[55091]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:13 np0005604215.localdomain sudo[55091]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:13 np0005604215.localdomain sudo[55134]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilcdxdikwnednenxhqcuufpgqdkqursp ; /usr/bin/python3
Feb 01 07:54:13 np0005604215.localdomain sudo[55134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:13 np0005604215.localdomain sudo[55134]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:13 np0005604215.localdomain sudo[55164]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llmdipnnnnaamfubzbiqywtbpywdjfuf ; /usr/bin/python3
Feb 01 07:54:13 np0005604215.localdomain sudo[55164]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:13 np0005604215.localdomain python3[55166]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks1.json short_hostname=np0005604215 step=1 update_config_hash_only=False
Feb 01 07:54:13 np0005604215.localdomain sudo[55164]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:14 np0005604215.localdomain sudo[55180]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cbmmyvswdpxejahvxqavjcgypmsqwwha ; /usr/bin/python3
Feb 01 07:54:14 np0005604215.localdomain sudo[55180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:14 np0005604215.localdomain python3[55182]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:54:14 np0005604215.localdomain sudo[55180]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:14 np0005604215.localdomain sudo[55196]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dwyqlouhfsevhxynervortyjncfxkcqg ; /usr/bin/python3
Feb 01 07:54:14 np0005604215.localdomain sudo[55196]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:54:14 np0005604215.localdomain python3[55198]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_1 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 01 07:54:14 np0005604215.localdomain sudo[55196]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:54:39 np0005604215.localdomain systemd[1]: tmp-crun.MkuwSG.mount: Deactivated successfully.
Feb 01 07:54:39 np0005604215.localdomain podman[55199]: 2026-02-01 07:54:39.870384519 +0000 UTC m=+0.085370166 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, container_name=metrics_qdr, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd)
Feb 01 07:54:40 np0005604215.localdomain podman[55199]: 2026-02-01 07:54:40.028732795 +0000 UTC m=+0.243718362 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 07:54:40 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:54:42 np0005604215.localdomain sudo[55228]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:54:42 np0005604215.localdomain sudo[55228]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:54:42 np0005604215.localdomain sudo[55228]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:42 np0005604215.localdomain sudo[55243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:54:42 np0005604215.localdomain sudo[55243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:54:42 np0005604215.localdomain sudo[55243]: pam_unix(sudo:session): session closed for user root
Feb 01 07:54:43 np0005604215.localdomain sudo[55290]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:54:43 np0005604215.localdomain sudo[55290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:54:43 np0005604215.localdomain sudo[55290]: pam_unix(sudo:session): session closed for user root
Feb 01 07:55:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:55:10 np0005604215.localdomain systemd[1]: tmp-crun.oV8ac2.mount: Deactivated successfully.
Feb 01 07:55:10 np0005604215.localdomain podman[55305]: 2026-02-01 07:55:10.863717331 +0000 UTC m=+0.081324264 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 07:55:11 np0005604215.localdomain podman[55305]: 2026-02-01 07:55:11.079070023 +0000 UTC m=+0.296676956 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, config_id=tripleo_step1, maintainer=OpenStack TripleO Team)
Feb 01 07:55:11 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:55:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:55:41 np0005604215.localdomain podman[55334]: 2026-02-01 07:55:41.862357457 +0000 UTC m=+0.077890232 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, config_id=tripleo_step1)
Feb 01 07:55:42 np0005604215.localdomain podman[55334]: 2026-02-01 07:55:42.080728408 +0000 UTC m=+0.296261163 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z)
Feb 01 07:55:42 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:55:43 np0005604215.localdomain sudo[55364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:55:43 np0005604215.localdomain sudo[55364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:55:43 np0005604215.localdomain sudo[55364]: pam_unix(sudo:session): session closed for user root
Feb 01 07:55:43 np0005604215.localdomain sudo[55379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:55:43 np0005604215.localdomain sudo[55379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:55:44 np0005604215.localdomain sudo[55379]: pam_unix(sudo:session): session closed for user root
Feb 01 07:55:45 np0005604215.localdomain sudo[55426]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:55:45 np0005604215.localdomain sudo[55426]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:55:45 np0005604215.localdomain sudo[55426]: pam_unix(sudo:session): session closed for user root
Feb 01 07:56:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:56:12 np0005604215.localdomain podman[55441]: 2026-02-01 07:56:12.863074921 +0000 UTC m=+0.077756127 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 07:56:13 np0005604215.localdomain podman[55441]: 2026-02-01 07:56:13.077230564 +0000 UTC m=+0.291911740 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, version=17.1.13, maintainer=OpenStack TripleO Team)
Feb 01 07:56:13 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:56:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:56:43 np0005604215.localdomain podman[55470]: 2026-02-01 07:56:43.870803784 +0000 UTC m=+0.085293088 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 07:56:44 np0005604215.localdomain podman[55470]: 2026-02-01 07:56:44.075907171 +0000 UTC m=+0.290396435 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 07:56:44 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:56:45 np0005604215.localdomain sudo[55499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:56:45 np0005604215.localdomain sudo[55499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:56:45 np0005604215.localdomain sudo[55499]: pam_unix(sudo:session): session closed for user root
Feb 01 07:56:45 np0005604215.localdomain sudo[55514]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:56:45 np0005604215.localdomain sudo[55514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:56:45 np0005604215.localdomain sudo[55514]: pam_unix(sudo:session): session closed for user root
Feb 01 07:56:46 np0005604215.localdomain sudo[55560]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:56:46 np0005604215.localdomain sudo[55560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:56:46 np0005604215.localdomain sudo[55560]: pam_unix(sudo:session): session closed for user root
Feb 01 07:57:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:57:14 np0005604215.localdomain podman[55575]: 2026-02-01 07:57:14.853157832 +0000 UTC m=+0.068043751 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5)
Feb 01 07:57:15 np0005604215.localdomain podman[55575]: 2026-02-01 07:57:15.028471261 +0000 UTC m=+0.243357160 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, io.openshift.expose-services=, version=17.1.13, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 07:57:15 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:57:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:57:45 np0005604215.localdomain systemd[1]: tmp-crun.mMD4YQ.mount: Deactivated successfully.
Feb 01 07:57:45 np0005604215.localdomain podman[55605]: 2026-02-01 07:57:45.861194997 +0000 UTC m=+0.080154276 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13)
Feb 01 07:57:46 np0005604215.localdomain podman[55605]: 2026-02-01 07:57:46.076150972 +0000 UTC m=+0.295110251 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, 
com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=metrics_qdr, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Feb 01 07:57:46 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:57:46 np0005604215.localdomain sudo[55634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:57:46 np0005604215.localdomain sudo[55634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:57:46 np0005604215.localdomain sudo[55634]: pam_unix(sudo:session): session closed for user root
Feb 01 07:57:46 np0005604215.localdomain sudo[55649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:57:46 np0005604215.localdomain sudo[55649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:57:47 np0005604215.localdomain sudo[55649]: pam_unix(sudo:session): session closed for user root
Feb 01 07:57:48 np0005604215.localdomain sudo[55697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:57:48 np0005604215.localdomain sudo[55697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:57:48 np0005604215.localdomain sudo[55697]: pam_unix(sudo:session): session closed for user root
Feb 01 07:58:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:58:16 np0005604215.localdomain systemd[1]: tmp-crun.zFcmCU.mount: Deactivated successfully.
Feb 01 07:58:16 np0005604215.localdomain podman[55712]: 2026-02-01 07:58:16.895539929 +0000 UTC m=+0.110887424 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, vendor=Red Hat, Inc., container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 07:58:17 np0005604215.localdomain podman[55712]: 2026-02-01 07:58:17.108855483 +0000 UTC m=+0.324202958 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, config_id=tripleo_step1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd)
Feb 01 07:58:17 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:58:39 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 19 pg[2.0( empty local-lis/les=0/0 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [2,1,3] r=0 lpr=19 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:58:40 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 20 pg[2.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=0/0 les/c/f=0/0/0 sis=19) [2,1,3] r=0 lpr=19 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:58:41 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 20 pg[3.0( empty local-lis/les=0/0 n=0 ec=20/20 lis/c=0/0 les/c/f=0/0/0 sis=20) [1,2,0] r=1 lpr=20 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:58:43 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 22 pg[4.0( empty local-lis/les=0/0 n=0 ec=22/22 lis/c=0/0 les/c/f=0/0/0 sis=22) [3,5,1] r=1 lpr=22 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:58:45 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 24 pg[5.0( empty local-lis/les=0/0 n=0 ec=24/24 lis/c=0/0 les/c/f=0/0/0 sis=24) [4,3,2] r=2 lpr=24 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:58:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:58:47 np0005604215.localdomain systemd[1]: tmp-crun.utZO7s.mount: Deactivated successfully.
Feb 01 07:58:47 np0005604215.localdomain podman[55739]: 2026-02-01 07:58:47.865255547 +0000 UTC m=+0.087092091 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, release=1766032510, io.openshift.expose-services=)
Feb 01 07:58:48 np0005604215.localdomain podman[55739]: 2026-02-01 07:58:48.058689465 +0000 UTC m=+0.280525999 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 
17.1 qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, release=1766032510, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Feb 01 07:58:48 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:58:48 np0005604215.localdomain sudo[55767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 07:58:48 np0005604215.localdomain sudo[55767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:58:48 np0005604215.localdomain sudo[55767]: pam_unix(sudo:session): session closed for user root
Feb 01 07:58:48 np0005604215.localdomain sudo[55783]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 07:58:48 np0005604215.localdomain sudo[55783]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:58:48 np0005604215.localdomain sudo[55783]: pam_unix(sudo:session): session closed for user root
Feb 01 07:58:49 np0005604215.localdomain sudo[55830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:58:49 np0005604215.localdomain sudo[55830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:58:49 np0005604215.localdomain sudo[55830]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:01 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 30 pg[6.0( empty local-lis/les=0/0 n=0 ec=30/30 lis/c=0/0 les/c/f=0/0/0 sis=30) [0,5,1] r=1 lpr=30 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:02 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 32 pg[7.0( empty local-lis/les=0/0 n=0 ec=32/32 lis/c=0/0 les/c/f=0/0/0 sis=32) [5,1,3] r=0 lpr=32 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:04 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 33 pg[7.0( empty local-lis/les=32/33 n=0 ec=32/32 lis/c=0/0 les/c/f=0/0/0 sis=32) [5,1,3] r=0 lpr=32 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:05 np0005604215.localdomain sudo[55846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:59:05 np0005604215.localdomain sudo[55846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:59:05 np0005604215.localdomain sudo[55846]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:07 np0005604215.localdomain sudo[55861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:59:07 np0005604215.localdomain sudo[55861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:59:07 np0005604215.localdomain sudo[55861]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:08 np0005604215.localdomain sudo[55876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 07:59:08 np0005604215.localdomain sudo[55876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 07:59:08 np0005604215.localdomain sudo[55876]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:59:18 np0005604215.localdomain podman[55891]: 2026-02-01 07:59:18.866761368 +0000 UTC m=+0.081706842 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 07:59:19 np0005604215.localdomain podman[55891]: 2026-02-01 07:59:19.056803439 +0000 UTC m=+0.271748913 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 07:59:19 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:59:19 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 36 pg[2.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=36 pruub=9.109359741s) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active pruub 1122.711425781s@ mbc={}] start_peering_interval up [2,1,3] -> [2,1,3], acting [2,1,3] -> [2,1,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:19 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 36 pg[2.0( empty local-lis/les=19/20 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=36 pruub=9.109359741s) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown pruub 1122.711425781s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1f( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1b( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1c( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1d( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1a( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1e( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.8( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.9( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.6( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.5( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.2( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.4( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.7( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.3( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.a( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.b( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.d( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.e( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.c( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.f( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.12( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.11( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.13( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.10( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.14( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.15( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.16( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.17( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.18( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.19( empty local-lis/les=19/20 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.0( empty local-lis/les=36/37 n=0 ec=19/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.15( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.14( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.16( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.13( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.17( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.12( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.18( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.e( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.d( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.11( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.b( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.3( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.c( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.10( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.7( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.a( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.2( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.19( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.9( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.6( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1a( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1d( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.8( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1f( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1e( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1c( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.4( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.f( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.1b( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:20 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 37 pg[2.5( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=19/19 les/c/f=20/20/0 sis=36) [2,1,3] r=0 lpr=36 pi=[19,36)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:21 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.0 deep-scrub starts
Feb 01 07:59:21 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.0 deep-scrub ok
Feb 01 07:59:21 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 38 pg[3.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38 pruub=8.304828644s) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 active pruub 1123.930908203s@ mbc={}] start_peering_interval up [1,2,0] -> [1,2,0], acting [1,2,0] -> [1,2,0], acting_primary 1 -> 1, up_primary 1 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:21 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 38 pg[3.0( empty local-lis/les=20/21 n=0 ec=20/20 lis/c=20/20 les/c/f=21/21/0 sis=38 pruub=8.302339554s) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.930908203s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:21 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 38 pg[4.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=38 pruub=10.063297272s) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 active pruub 1121.191040039s@ mbc={}] start_peering_interval up [3,5,1] -> [3,5,1], acting [3,5,1] -> [3,5,1], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:21 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 38 pg[4.0( empty local-lis/les=22/23 n=0 ec=22/22 lis/c=22/22 les/c/f=23/23/0 sis=38 pruub=10.060767174s) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1121.191040039s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.16( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.18( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.17( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.15( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.14( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.13( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.11( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.10( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.f( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.d( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.c( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.b( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.e( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.2( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.3( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.19( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.1( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.4( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.9( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.1a( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.5( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.a( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.6( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.7( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.8( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.1b( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.1e( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.1f( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.1c( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.12( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 39 pg[4.1d( empty local-lis/les=22/23 n=0 ec=38/22 lis/c=22/22 les/c/f=23/23/0 sis=38) [3,5,1] r=1 lpr=38 pi=[22,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.1c( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.1a( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.1d( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.1f( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.1e( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.1b( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.8( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.9( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.7( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.5( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.3( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.6( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.4( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.1( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.b( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.a( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.d( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.c( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.f( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.e( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.10( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.11( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.13( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.12( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.15( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.17( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.14( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.16( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.19( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.2( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:22 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 39 pg[3.18( empty local-lis/les=20/21 n=0 ec=38/20 lis/c=20/20 les/c/f=21/21/0 sis=38) [1,2,0] r=1 lpr=38 pi=[20,38)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:23 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.15 scrub starts
Feb 01 07:59:23 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.15 scrub ok
Feb 01 07:59:23 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 40 pg[5.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=10.047298431s) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 active pruub 1127.717285156s@ mbc={}] start_peering_interval up [4,3,2] -> [4,3,2], acting [4,3,2] -> [4,3,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:23 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 40 pg[5.0( empty local-lis/les=24/25 n=0 ec=24/24 lis/c=24/24 les/c/f=25/25/0 sis=40 pruub=10.044237137s) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1127.717285156s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:23 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 40 pg[6.0( empty local-lis/les=30/31 n=0 ec=30/30 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=10.134173393s) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 active pruub 1123.288208008s@ mbc={}] start_peering_interval up [0,5,1] -> [0,5,1], acting [0,5,1] -> [0,5,1], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:23 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 40 pg[6.0( empty local-lis/les=30/31 n=0 ec=30/30 lis/c=30/30 les/c/f=31/31/0 sis=40 pruub=10.130561829s) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1123.288208008s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.14 scrub starts
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.14 scrub ok
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.15( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.1a( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.14( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.16( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.12( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.17( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.10( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.11( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.13( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.d( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.c( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.f( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.e( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.1( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.2( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.9( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.6( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.3( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.1b( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.18( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.b( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.7( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.4( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.a( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.19( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.8( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.5( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.1e( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.1d( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.1c( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 41 pg[6.1f( empty local-lis/les=30/31 n=0 ec=40/30 lis/c=30/30 les/c/f=31/31/0 sis=40) [0,5,1] r=1 lpr=40 pi=[30,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.1a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.19( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.18( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.1b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.1d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.1( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.2( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.5( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.3( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.7( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.1c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.6( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.4( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.d( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.a( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.9( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.b( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.8( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.c( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.17( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.16( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.14( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.15( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.13( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.12( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.11( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.1f( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.1e( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:24 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 41 pg[5.10( empty local-lis/les=24/25 n=0 ec=40/24 lis/c=24/24 les/c/f=25/25/0 sis=40) [4,3,2] r=2 lpr=40 pi=[24,40)/1 crt=0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:25 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.17 scrub starts
Feb 01 07:59:25 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.17 scrub ok
Feb 01 07:59:25 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 42 pg[7.0( v 34'39 (0'0,34'39] local-lis/les=32/33 n=22 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=10.773263931s) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 34'38 mlcod 34'38 active pruub 1125.998168945s@ mbc={}] start_peering_interval up [5,1,3] -> [5,1,3], acting [5,1,3] -> [5,1,3], acting_primary 5 -> 5, up_primary 5 -> 5, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:25 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 42 pg[7.0( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42 pruub=10.773263931s) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 34'38 mlcod 0'0 unknown pruub 1125.998168945s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.18 scrub starts
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.18 scrub ok
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.5( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.9( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.4( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.6( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.a( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.7( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.2( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.3( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.1( v 34'39 (0'0,34'39] local-lis/les=32/33 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.8( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.f( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.e( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.d( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.c( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.b( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=32/33 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.1( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.0( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=32/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 34'38 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.3( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.4( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.2( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.a( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.b( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.9( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.c( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.8( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.5( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 43 pg[7.6( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=32/32 les/c/f=33/33/0 sis=42) [5,1,3] r=0 lpr=42 pi=[32,42)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:26 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 7.0 scrub starts
Feb 01 07:59:27 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 7.0 scrub ok
Feb 01 07:59:31 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Feb 01 07:59:31 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.19( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.671681404s) [3,2,4] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.646362305s@ mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.748115540s) [0,2,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.722778320s@ mbc={}] start_peering_interval up [4,3,2] -> [0,2,4], acting [4,3,2] -> [0,2,4], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.747766495s) [2,4,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.722412109s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,3], acting [4,3,2] -> [2,4,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.706328392s) [3,1,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.680908203s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,5], acting [1,2,0] -> [3,1,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.706067085s) [0,2,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.680786133s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,1], acting [1,2,0] -> [0,2,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.18( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.706243515s) [3,1,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.680908203s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.747766495s) [2,4,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1142.722412109s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.747913361s) [0,2,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.722778320s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.19( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.706022263s) [0,2,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.680786133s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.19( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.671585083s) [3,2,4] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.646362305s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.18( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.669023514s) [4,2,3] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.644409180s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,3], acting [2,1,3] -> [4,2,3], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.18( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.668961525s) [4,2,3] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.644409180s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.17( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.668678284s) [5,1,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.644042969s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,3], acting [2,1,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.747649193s) [2,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.723022461s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.697118759s) [2,3,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.672607422s@ mbc={}] start_peering_interval up [1,2,0] -> [2,3,4], acting [1,2,0] -> [2,3,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.10( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.747649193s) [2,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1142.723022461s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.17( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.668617249s) [5,1,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.644042969s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.16( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.697118759s) [2,3,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.672607422s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.747053146s) [2,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.722656250s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.11( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.747053146s) [2,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1142.722656250s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.16( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667881012s) [5,1,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.643554688s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,0], acting [2,1,3] -> [5,1,0], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.697459221s) [0,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.673217773s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,1], acting [1,2,0] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.747257233s) [1,5,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.723022461s@ mbc={}] start_peering_interval up [4,3,2] -> [1,5,3], acting [4,3,2] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.17( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.697418213s) [0,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.673217773s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.12( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.747172356s) [1,5,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.723022461s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.15( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.664317131s) [1,0,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.640258789s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,2], acting [2,1,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.704445839s) [2,4,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.680419922s@ mbc={}] start_peering_interval up [1,2,0] -> [2,4,0], acting [1,2,0] -> [2,4,0], acting_primary 1 -> 2, up_primary 1 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.15( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.664267540s) [1,0,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.640258789s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.16( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667556763s) [5,1,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.643554688s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.14( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.704445839s) [2,4,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.680419922s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.745916367s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.722045898s@ mbc={}] start_peering_interval up [4,3,2] -> [4,0,5], acting [4,3,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.13( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.745870590s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.722045898s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.14( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667304039s) [2,4,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.643554688s@ mbc={}] start_peering_interval up [2,1,3] -> [2,4,0], acting [2,1,3] -> [2,4,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.745551109s) [3,4,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.721801758s@ mbc={}] start_peering_interval up [4,3,2] -> [3,4,5], acting [4,3,2] -> [3,4,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.14( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.745512009s) [3,4,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.721801758s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.14( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667304039s) [2,4,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.643554688s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.13( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667317390s) [1,5,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.643554688s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,3], acting [2,1,3] -> [1,5,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.695111275s) [0,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.671508789s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,1], acting [1,2,0] -> [0,5,1], acting_primary 1 -> 0, up_primary 1 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.13( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667275429s) [1,5,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.643554688s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.745864868s) [5,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.722290039s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,1], acting [4,3,2] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.12( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.695067406s) [0,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.671508789s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.695174217s) [2,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.671630859s@ mbc={}] start_peering_interval up [1,2,0] -> [2,3,1], acting [1,2,0] -> [2,3,1], acting_primary 1 -> 2, up_primary 1 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.15( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.745836258s) [5,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.722290039s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.745570183s) [5,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.722167969s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,1], acting [4,3,2] -> [5,3,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.13( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.695174217s) [2,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.671630859s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.12( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667712212s) [4,3,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.644165039s@ mbc={}] start_peering_interval up [2,1,3] -> [4,3,2], acting [2,1,3] -> [4,3,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.16( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.745529175s) [5,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.722167969s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.11( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.668106079s) [5,3,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.644775391s@ mbc={}] start_peering_interval up [2,1,3] -> [5,3,4], acting [2,1,3] -> [5,3,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.12( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667649269s) [4,3,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.644165039s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.694184303s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.670898438s@ mbc={}] start_peering_interval up [1,2,0] -> [5,1,3], acting [1,2,0] -> [5,1,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.11( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.668056488s) [5,3,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.644775391s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.10( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.694154739s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.670898438s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.746485710s) [3,1,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.723266602s@ mbc={}] start_peering_interval up [4,3,2] -> [3,1,5], acting [4,3,2] -> [3,1,5], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.17( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.746417046s) [3,1,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.723266602s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.694130898s) [4,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.671142578s@ mbc={}] start_peering_interval up [1,2,0] -> [4,5,0], acting [1,2,0] -> [4,5,0], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.11( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.694103241s) [4,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.671142578s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.744627953s) [1,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.721557617s@ mbc={}] start_peering_interval up [4,3,2] -> [1,0,5], acting [4,3,2] -> [1,0,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.10( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.668325424s) [4,0,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.645385742s@ mbc={}] start_peering_interval up [2,1,3] -> [4,0,5], acting [2,1,3] -> [4,0,5], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.693974495s) [1,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.671020508s@ mbc={}] start_peering_interval up [1,2,0] -> [1,5,0], acting [1,2,0] -> [1,5,0], acting_primary 1 -> 1, up_primary 1 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.f( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.672636986s) [4,2,0] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.649780273s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,0], acting [2,1,3] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.8( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.744574547s) [1,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.721557617s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.693933487s) [1,5,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.671020508s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.f( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.672598839s) [4,2,0] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.649780273s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.10( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.668267250s) [4,0,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.645385742s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743574142s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.720825195s@ mbc={}] start_peering_interval up [4,3,2] -> [5,4,0], acting [4,3,2] -> [5,4,0], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.9( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743518829s) [5,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.720825195s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743288040s) [0,1,2] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.720581055s@ mbc={}] start_peering_interval up [4,3,2] -> [0,1,2], acting [4,3,2] -> [0,1,2], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.693218231s) [2,1,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.670410156s@ mbc={}] start_peering_interval up [1,2,0] -> [2,1,0], acting [1,2,0] -> [2,1,0], acting_primary 1 -> 2, up_primary 1 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743253708s) [0,1,2] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.720581055s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.d( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667409897s) [1,5,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.644775391s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,3], acting [2,1,3] -> [1,5,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.693218231s) [2,1,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.670410156s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.d( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667363167s) [1,5,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.644775391s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.695250511s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.672729492s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743683815s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.721191406s@ mbc={}] start_peering_interval up [4,3,2] -> [4,0,5], acting [4,3,2] -> [4,0,5], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.e( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666829109s) [3,4,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.644409180s@ mbc={}] start_peering_interval up [2,1,3] -> [3,4,2], acting [2,1,3] -> [3,4,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743658066s) [4,0,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.721191406s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.695183754s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.672729492s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.e( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666769981s) [3,4,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.644409180s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.703211784s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.680908203s@ mbc={}] start_peering_interval up [1,2,0] -> [5,1,3], acting [1,2,0] -> [5,1,3], acting_primary 1 -> 5, up_primary 1 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.745508194s) [3,2,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.723144531s@ mbc={}] start_peering_interval up [4,3,2] -> [3,2,4], acting [4,3,2] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.745478630s) [3,2,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.723144531s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.703177452s) [5,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.680908203s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.b( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667142868s) [1,5,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.644897461s@ mbc={}] start_peering_interval up [2,1,3] -> [1,5,0], acting [2,1,3] -> [1,5,0], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.b( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667075157s) [1,5,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.644897461s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.702480316s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.680419922s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.702448845s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.680419922s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743301392s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.721313477s@ mbc={}] start_peering_interval up [4,3,2] -> [4,5,0], acting [4,3,2] -> [4,5,0], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.a( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667696953s) [1,3,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.645629883s@ mbc={}] start_peering_interval up [2,1,3] -> [1,3,2], acting [2,1,3] -> [1,3,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743267059s) [4,5,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.721313477s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.a( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667664528s) [1,3,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.645629883s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.702930450s) [3,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.681030273s@ mbc={}] start_peering_interval up [1,2,0] -> [3,5,1], acting [1,2,0] -> [3,5,1], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.702900887s) [3,5,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.681030273s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743289948s) [1,3,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.721435547s@ mbc={}] start_peering_interval up [4,3,2] -> [1,3,5], acting [4,3,2] -> [1,3,5], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.4( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.743255615s) [1,3,5] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.721435547s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.702823639s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.681152344s@ mbc={}] start_peering_interval up [1,2,0] -> [3,4,5], acting [1,2,0] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.3( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666821480s) [5,3,1] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.645263672s@ mbc={}] start_peering_interval up [2,1,3] -> [5,3,1], acting [2,1,3] -> [5,3,1], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.2( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.702794075s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.681152344s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.741458893s) [5,3,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.719848633s@ mbc={}] start_peering_interval up [4,3,2] -> [5,3,4], acting [4,3,2] -> [5,3,4], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.7( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.741419792s) [5,3,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.719848633s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.3( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666716576s) [5,3,1] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.645263672s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.692336082s) [0,2,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.670898438s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,4], acting [1,2,0] -> [0,2,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.692304611s) [0,2,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.670898438s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.741690636s) [3,5,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.720336914s@ mbc={}] start_peering_interval up [4,3,2] -> [3,5,4], acting [4,3,2] -> [3,5,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.6( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.741659164s) [3,5,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.720336914s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667568207s) [3,4,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.646362305s@ mbc={}] start_peering_interval up [2,1,3] -> [3,4,5], acting [2,1,3] -> [3,4,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.7( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666659355s) [5,1,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.645507812s@ mbc={}] start_peering_interval up [2,1,3] -> [5,1,3], acting [2,1,3] -> [5,1,3], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.667483330s) [3,4,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.646362305s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.7( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666622162s) [5,1,3] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.645507812s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.692347527s) [0,2,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.671264648s@ mbc={}] start_peering_interval up [1,2,0] -> [0,2,4], acting [1,2,0] -> [0,2,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.6( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.692308426s) [0,2,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.671264648s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.2( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666785240s) [5,0,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.645874023s@ mbc={}] start_peering_interval up [2,1,3] -> [5,0,4], acting [2,1,3] -> [5,0,4], acting_primary 2 -> 5, up_primary 2 -> 5, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.741279602s) [0,2,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.720458984s@ mbc={}] start_peering_interval up [4,3,2] -> [0,2,1], acting [4,3,2] -> [0,2,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.691709518s) [2,0,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.670898438s@ mbc={}] start_peering_interval up [1,2,0] -> [2,0,4], acting [1,2,0] -> [2,0,4], acting_primary 1 -> 2, up_primary 1 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.2( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666743279s) [5,0,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.645874023s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.5( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.741243362s) [0,2,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.720458984s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.3( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.691709518s) [2,0,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1140.670898438s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.740839958s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.720092773s@ mbc={}] start_peering_interval up [4,3,2] -> [0,5,1], acting [4,3,2] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.3( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.740790367s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.720092773s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.4( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.668323517s) [3,1,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.647583008s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,2], acting [2,1,3] -> [3,1,2], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.691550255s) [5,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.670898438s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,4], acting [1,2,0] -> [5,3,4], acting_primary 1 -> 5, up_primary 1 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.4( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.668286324s) [3,1,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.647583008s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.740085602s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.719482422s@ mbc={}] start_peering_interval up [4,3,2] -> [5,0,1], acting [4,3,2] -> [5,0,1], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.5( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.691515923s) [5,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.670898438s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.2( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.740049362s) [5,0,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.719482422s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.5( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.670497894s) [1,0,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.650024414s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,2], acting [2,1,3] -> [1,0,2], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.5( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.670463562s) [1,0,2] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.650024414s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.693242073s) [3,1,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.672851562s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,2], acting [1,2,0] -> [3,1,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.740189552s) [2,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.719726562s@ mbc={}] start_peering_interval up [4,3,2] -> [2,3,1], acting [4,3,2] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.4( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.693207741s) [3,1,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.672851562s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.6( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666841507s) [3,1,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.646484375s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,5], acting [2,1,3] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.740189552s) [2,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1142.719726562s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.6( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666806221s) [3,1,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.646484375s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.692721367s) [3,1,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.672485352s@ mbc={}] start_peering_interval up [1,2,0] -> [3,1,2], acting [1,2,0] -> [3,1,2], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739383698s) [5,1,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.719360352s@ mbc={}] start_peering_interval up [4,3,2] -> [5,1,3], acting [4,3,2] -> [5,1,3], acting_primary 4 -> 5, up_primary 4 -> 5, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.8( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666863441s) [2,1,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.646850586s@ mbc={}] start_peering_interval up [2,1,3] -> [2,1,0], acting [2,1,3] -> [2,1,0], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.693044662s) [4,2,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.673095703s@ mbc={}] start_peering_interval up [1,2,0] -> [4,2,3], acting [1,2,0] -> [4,2,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.f( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739315987s) [5,1,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.719360352s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.8( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666863441s) [2,1,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.646850586s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738803864s) [4,0,2] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.718994141s@ mbc={}] start_peering_interval up [4,3,2] -> [4,0,2], acting [4,3,2] -> [4,0,2], acting_primary 4 -> 4, up_primary 4 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.e( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738779068s) [4,0,2] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.718994141s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.9( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666206360s) [3,5,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.646484375s@ mbc={}] start_peering_interval up [2,1,3] -> [3,5,4], acting [2,1,3] -> [3,5,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.9( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666164398s) [3,5,4] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.646484375s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.9( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.693004608s) [4,2,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.673095703s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.7( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.692625999s) [3,1,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.672485352s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.692666054s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.673095703s@ mbc={}] start_peering_interval up [1,2,0] -> [4,0,5], acting [1,2,0] -> [4,0,5], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738787651s) [3,2,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.719238281s@ mbc={}] start_peering_interval up [4,3,2] -> [3,2,4], acting [4,3,2] -> [3,2,4], acting_primary 4 -> 3, up_primary 4 -> 3, role 2 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.8( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.692623138s) [4,0,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.673095703s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1d( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738755226s) [3,2,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.719238281s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.689872742s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.670410156s@ mbc={}] start_peering_interval up [1,2,0] -> [4,5,3], acting [1,2,0] -> [4,5,3], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1b( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.689837456s) [4,5,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.670410156s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1a( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.665906906s) [2,4,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.646606445s@ mbc={}] start_peering_interval up [2,1,3] -> [2,4,3], acting [2,1,3] -> [2,4,3], acting_primary 2 -> 2, up_primary 2 -> 2, role 0 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739544868s) [2,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.720214844s@ mbc={}] start_peering_interval up [4,3,2] -> [2,4,0], acting [4,3,2] -> [2,4,0], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.689888000s) [4,3,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.670654297s@ mbc={}] start_peering_interval up [1,2,0] -> [4,3,2], acting [1,2,0] -> [4,3,2], acting_primary 1 -> 4, up_primary 1 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1a( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.665906906s) [2,4,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.646606445s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1b( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.669133186s) [1,2,3] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.649902344s@ mbc={}] start_peering_interval up [2,1,3] -> [1,2,3], acting [2,1,3] -> [1,2,3], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1c( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.739544868s) [2,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1142.720214844s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1b( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.669072151s) [1,2,3] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.649902344s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738558769s) [2,0,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.719482422s@ mbc={}] start_peering_interval up [4,3,2] -> [2,0,4], acting [4,3,2] -> [2,0,4], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.700650215s) [1,2,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.681762695s@ mbc={}] start_peering_interval up [1,2,0] -> [1,2,3], acting [1,2,0] -> [1,2,3], acting_primary 1 -> 1, up_primary 1 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1c( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666586876s) [4,2,0] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.647583008s@ mbc={}] start_peering_interval up [2,1,3] -> [4,2,0], acting [2,1,3] -> [4,2,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1b( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.738558769s) [2,0,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1142.719482422s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1d( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.700602531s) [1,2,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.681762695s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1c( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.666532516s) [4,2,0] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.647583008s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.737504959s) [1,5,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.718872070s@ mbc={}] start_peering_interval up [4,3,2] -> [1,5,3], acting [4,3,2] -> [1,5,3], acting_primary 4 -> 1, up_primary 4 -> 1, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.699731827s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.681030273s@ mbc={}] start_peering_interval up [1,2,0] -> [5,3,1], acting [1,2,0] -> [5,3,1], acting_primary 1 -> 5, up_primary 1 -> 5, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.1a( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.737471581s) [1,5,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.718872070s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1d( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.665313721s) [4,5,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.646728516s@ mbc={}] start_peering_interval up [2,1,3] -> [4,5,0], acting [2,1,3] -> [4,5,0], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1c( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.699681282s) [5,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.681030273s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1d( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.665266991s) [4,5,0] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.646728516s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1a( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.689131737s) [4,3,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.670654297s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.737303734s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.718872070s@ mbc={}] start_peering_interval up [4,3,2] -> [0,5,1], acting [4,3,2] -> [0,5,1], acting_primary 4 -> 0, up_primary 4 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.19( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.737273216s) [0,5,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1142.718872070s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.737103462s) [2,1,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1142.718750000s@ mbc={}] start_peering_interval up [4,3,2] -> [2,1,3], acting [4,3,2] -> [2,1,3], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.700087547s) [0,5,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.681762695s@ mbc={}] start_peering_interval up [1,2,0] -> [0,5,4], acting [1,2,0] -> [0,5,4], acting_primary 1 -> 0, up_primary 1 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[5.18( empty local-lis/les=40/41 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.737103462s) [2,1,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1142.718750000s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.700148582s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1140.681884766s@ mbc={}] start_peering_interval up [1,2,0] -> [3,4,5], acting [1,2,0] -> [3,4,5], acting_primary 1 -> 3, up_primary 1 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1e( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.700118065s) [3,4,5] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.681884766s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[3.1f( empty local-lis/les=38/39 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.700004578s) [0,5,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1140.681762695s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1f( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.664992332s) [0,2,4] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.646972656s@ mbc={}] start_peering_interval up [2,1,3] -> [0,2,4], acting [2,1,3] -> [0,2,4], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1f( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.664954185s) [0,2,4] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.646972656s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.c( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.663228035s) [1,0,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.645385742s@ mbc={}] start_peering_interval up [2,1,3] -> [1,0,5], acting [2,1,3] -> [1,0,5], acting_primary 2 -> 1, up_primary 2 -> 1, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.c( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.663178444s) [1,0,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.645385742s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1e( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.664872169s) [3,1,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.647216797s@ mbc={}] start_peering_interval up [2,1,3] -> [3,1,5], acting [2,1,3] -> [3,1,5], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[2.1e( empty local-lis/les=36/37 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44 pruub=11.664809227s) [3,1,5] r=-1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.647216797s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.18( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.1b( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.1a( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.19( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.733100891s) [5,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.197143555s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,1], acting [0,5,1] -> [5,3,1], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.19( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.733100891s) [5,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.197143555s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.684711456s) [2,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.149291992s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,1], acting [3,5,1] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.684578896s) [2,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.149291992s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.1c( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.9( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.7( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.15( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.d( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,1,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.3( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,3,1] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.8( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,1,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.2( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,0,4] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.e( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,4,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.19( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.677124977s) [1,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144897461s@ mbc={}] start_peering_interval up [3,5,1] -> [1,3,2], acting [3,5,1] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.7( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,1,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[7.7( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1b( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.728588104s) [1,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.197143555s@ mbc={}] start_peering_interval up [0,5,1] -> [1,2,0], acting [0,5,1] -> [1,2,0], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.19( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.676439285s) [1,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.144897461s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1b( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.728481293s) [1,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.197143555s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.d( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.2( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,0,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.5( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[7.d( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.c( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[7.1( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.a( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.f( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,1,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.16( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,1,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[7.3( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.17( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,1,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.15( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[7.5( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.16( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.11( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,3,4] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.10( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,1,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.18( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.669281960s) [2,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.143798828s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,1], acting [3,5,1] -> [2,3,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1a( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.729084969s) [5,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.203613281s@ mbc={}] start_peering_interval up [0,5,1] -> [5,4,0], acting [0,5,1] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1a( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.729084969s) [5,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.203613281s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.18( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.669144630s) [2,3,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.143798828s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.17( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665369034s) [3,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.140380859s@ mbc={}] start_peering_interval up [3,5,1] -> [3,2,4], acting [3,5,1] -> [3,2,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.15( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.721194267s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196044922s@ mbc={}] start_peering_interval up [0,5,1] -> [2,4,0], acting [0,5,1] -> [2,4,0], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.13( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,1,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[7.f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.15( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.721135139s) [2,4,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.196044922s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.16( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665118217s) [0,2,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.140258789s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,1], acting [3,5,1] -> [0,2,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.16( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665058136s) [0,2,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.140258789s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.15( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.668439865s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.143798828s@ mbc={}] start_peering_interval up [3,5,1] -> [4,3,2], acting [3,5,1] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.14( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.720371246s) [3,5,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.195922852s@ mbc={}] start_peering_interval up [0,5,1] -> [3,5,4], acting [0,5,1] -> [3,5,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.15( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.668400764s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.143798828s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.17( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665307999s) [3,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.140380859s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.14( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.667908669s) [4,0,5] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.143798828s@ mbc={}] start_peering_interval up [3,5,1] -> [4,0,5], acting [3,5,1] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.14( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.720163345s) [3,5,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.195922852s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.14( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.667826653s) [4,0,5] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.143798828s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.16( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.720002174s) [0,5,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.195922852s@ mbc={}] start_peering_interval up [0,5,1] -> [0,5,4], acting [0,5,1] -> [0,5,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.17( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.720026016s) [1,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.195800781s@ mbc={}] start_peering_interval up [0,5,1] -> [1,0,2], acting [0,5,1] -> [1,0,2], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.16( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.719942093s) [0,5,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.195922852s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672893524s) [1,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.149291992s@ mbc={}] start_peering_interval up [3,5,1] -> [1,3,2], acting [3,5,1] -> [1,3,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.17( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.719747543s) [1,0,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.195800781s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672736168s) [1,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.149291992s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.11( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.719455719s) [3,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196044922s@ mbc={}] start_peering_interval up [0,5,1] -> [3,1,2], acting [0,5,1] -> [3,1,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.11( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.719401360s) [3,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.196044922s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.13( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.668066978s) [2,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144775391s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,3], acting [3,5,1] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.12( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672833443s) [0,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.149658203s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.12( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.672800064s) [0,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.149658203s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.13( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.668010712s) [2,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.144775391s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.d( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,1,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.10( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.719079018s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.195922852s@ mbc={}] start_peering_interval up [0,5,1] -> [0,1,2], acting [0,5,1] -> [0,1,2], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.10( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.718941689s) [0,1,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.195922852s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.11( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.667149544s) [3,5,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144287109s@ mbc={}] start_peering_interval up [3,5,1] -> [3,5,4], acting [3,5,1] -> [3,5,4], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.11( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.667105675s) [3,5,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.144287109s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.13( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.719029427s) [3,4,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196289062s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,2], acting [0,5,1] -> [3,4,2], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.13( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.718981743s) [3,4,2] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.196289062s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.12( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.718306541s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.195800781s@ mbc={}] start_peering_interval up [0,5,1] -> [4,2,0], acting [0,5,1] -> [4,2,0], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.12( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.718264580s) [4,2,0] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.195800781s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.667136192s) [3,4,5] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144775391s@ mbc={}] start_peering_interval up [3,5,1] -> [3,4,5], acting [3,5,1] -> [3,4,5], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.d( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.718710899s) [2,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196533203s@ mbc={}] start_peering_interval up [0,5,1] -> [2,3,1], acting [0,5,1] -> [2,3,1], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.667104721s) [3,4,5] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.144775391s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.d( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.718682289s) [2,3,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.196533203s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.666660309s) [2,4,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144897461s@ mbc={}] start_peering_interval up [3,5,1] -> [2,4,0], acting [3,5,1] -> [2,4,0], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.10( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.666120529s) [3,4,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144531250s@ mbc={}] start_peering_interval up [3,5,1] -> [3,4,2], acting [3,5,1] -> [3,4,2], acting_primary 3 -> 3, up_primary 3 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.666480064s) [2,4,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.144897461s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.c( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.718157768s) [3,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196899414s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,4], acting [0,5,1] -> [3,2,4], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665976524s) [2,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144775391s@ mbc={}] start_peering_interval up [3,5,1] -> [2,1,3], acting [3,5,1] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.789257050s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1132.268066406s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665924072s) [2,1,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.144775391s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.c( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.718091965s) [3,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.196899414s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.a( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,0,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.f( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.717349052s) [3,4,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196289062s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.788927078s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1132.268066406s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.788862228s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1132.268066406s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.f( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.717156410s) [3,4,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.196289062s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.10( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665730476s) [3,4,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.144531250s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.e( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.717579842s) [5,3,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.197143555s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.e( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.717579842s) [5,3,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.197143555s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.788775444s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1132.268066406s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[7.9( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665179253s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144897461s@ mbc={}] start_peering_interval up [3,5,1] -> [0,2,4], acting [3,5,1] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.9( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.717197418s) [0,1,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.197143555s@ mbc={}] start_peering_interval up [0,5,1] -> [0,1,5], acting [0,5,1] -> [0,1,5], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.b( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665058136s) [0,2,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.144897461s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[7.b( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.9( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.717163086s) [0,1,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.197143555s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665095329s) [5,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144653320s@ mbc={}] start_peering_interval up [3,5,1] -> [5,3,1], acting [3,5,1] -> [5,3,1], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.c( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665095329s) [5,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1136.144653320s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.3( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.786544800s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1132.267822266s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.2( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.715110779s) [5,3,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196533203s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.3( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.786461830s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1132.267822266s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.2( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.715110779s) [5,3,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.196533203s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.3( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.663132668s) [1,5,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144775391s@ mbc={}] start_peering_interval up [3,5,1] -> [1,5,3], acting [3,5,1] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.714653015s) [1,5,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196533203s@ mbc={}] start_peering_interval up [0,5,1] -> [1,5,3], acting [0,5,1] -> [1,5,3], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.1( v 34'39 (0'0,34'39] local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.785576820s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1132.267333984s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.3( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.663085938s) [1,5,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.144775391s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.1( v 34'39 (0'0,34'39] local-lis/les=42/43 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.785248756s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1132.267333984s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662741661s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144897461s@ mbc={}] start_peering_interval up [3,5,1] -> [4,2,0], acting [3,5,1] -> [4,2,0], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.3( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.714677811s) [5,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.197021484s@ mbc={}] start_peering_interval up [0,5,1] -> [5,4,0], acting [0,5,1] -> [5,4,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.714594841s) [1,5,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.196533203s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.3( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.714677811s) [5,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.197021484s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.4( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664740562s) [0,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.147338867s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.2( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662286758s) [1,5,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.144897461s@ mbc={}] start_peering_interval up [3,5,1] -> [1,5,3], acting [3,5,1] -> [1,5,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.4( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664683342s) [0,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.147338867s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.2( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662236214s) [1,5,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.144897461s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.9( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664308548s) [1,0,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.147338867s@ mbc={}] start_peering_interval up [3,5,1] -> [1,0,2], acting [3,5,1] -> [1,0,2], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.6( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.713985443s) [3,5,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.197021484s@ mbc={}] start_peering_interval up [0,5,1] -> [3,5,1], acting [0,5,1] -> [3,5,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.6( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.713943481s) [3,5,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.197021484s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.b( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.713064194s) [3,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196166992s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,1], acting [0,5,1] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.9( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.664274216s) [1,0,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.147338867s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.b( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.713011742s) [3,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.196166992s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.18( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.713712692s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.197021484s@ mbc={}] start_peering_interval up [0,5,1] -> [0,2,4], acting [0,5,1] -> [0,2,4], acting_primary 0 -> 0, up_primary 0 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665072441s) [2,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.148437500s@ mbc={}] start_peering_interval up [3,5,1] -> [2,3,4], acting [3,5,1] -> [2,3,4], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.18( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.713678360s) [0,2,4] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.197021484s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.665037155s) [2,3,4] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.148437500s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.784646988s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1132.267822266s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.7( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.712370872s) [5,3,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196777344s@ mbc={}] start_peering_interval up [0,5,1] -> [5,3,4], acting [0,5,1] -> [5,3,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.5( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662937164s) [5,1,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.147460938s@ mbc={}] start_peering_interval up [3,5,1] -> [5,1,0], acting [3,5,1] -> [5,1,0], acting_primary 3 -> 5, up_primary 3 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.9( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.786987305s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1132.271484375s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.784045219s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1132.267822266s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.5( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662937164s) [5,1,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1136.147460938s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.7( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.712370872s) [5,3,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.196777344s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.9( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.786813736s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1132.271484375s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662644386s) [2,0,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.147827148s@ mbc={}] start_peering_interval up [3,5,1] -> [2,0,1], acting [3,5,1] -> [2,0,1], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.a( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662581444s) [2,0,1] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.147827148s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.5( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.786248207s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1132.271728516s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.8( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.710806847s) [2,1,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196166992s@ mbc={}] start_peering_interval up [0,5,1] -> [2,1,3], acting [0,5,1] -> [2,1,3], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.4( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.710494995s) [3,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196289062s@ mbc={}] start_peering_interval up [0,5,1] -> [3,2,1], acting [0,5,1] -> [3,2,1], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.5( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.785942078s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1132.271728516s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.4( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.710417747s) [3,2,1] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.196289062s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662599564s) [4,2,0] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.144897461s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.8( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.710174561s) [2,1,3] r=-1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.196166992s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.5( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.710013390s) [5,1,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196533203s@ mbc={}] start_peering_interval up [0,5,1] -> [5,1,0], acting [0,5,1] -> [5,1,0], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.6( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662632942s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.149291992s@ mbc={}] start_peering_interval up [3,5,1] -> [4,3,2], acting [3,5,1] -> [4,3,2], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.5( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.710013390s) [5,1,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.196533203s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.7( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662511826s) [0,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.149047852s@ mbc={}] start_peering_interval up [3,5,1] -> [0,1,2], acting [3,5,1] -> [0,1,2], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.7( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662325859s) [0,1,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.149047852s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.6( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.662568092s) [4,3,2] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.149291992s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.a( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.709159851s) [5,0,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196289062s@ mbc={}] start_peering_interval up [0,5,1] -> [5,0,4], acting [0,5,1] -> [5,0,4], acting_primary 0 -> 5, up_primary 0 -> 5, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1f( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.709859848s) [3,4,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.197143555s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1f( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.709807396s) [3,4,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.197143555s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.a( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.709159851s) [5,0,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown pruub 1138.196289062s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.b( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.784202576s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1132.271484375s@ mbc={}] start_peering_interval up [5,1,3] -> [2,1,3], acting [5,1,3] -> [2,1,3], acting_primary 5 -> 2, up_primary 5 -> 2, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.8( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.661916733s) [1,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.149536133s@ mbc={}] start_peering_interval up [3,5,1] -> [1,2,3], acting [3,5,1] -> [1,2,3], acting_primary 3 -> 1, up_primary 3 -> 1, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1e( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.709301949s) [4,5,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.197021484s@ mbc={}] start_peering_interval up [0,5,1] -> [4,5,3], acting [0,5,1] -> [4,5,3], acting_primary 0 -> 4, up_primary 0 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.8( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.661812782s) [1,2,3] r=-1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.149536133s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.661817551s) [4,5,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.149658203s@ mbc={}] start_peering_interval up [3,5,1] -> [4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1d( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.661761284s) [4,5,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.149658203s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1c( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.707850456s) [1,3,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.195800781s@ mbc={}] start_peering_interval up [0,5,1] -> [1,3,5], acting [0,5,1] -> [1,3,5], acting_primary 0 -> 1, up_primary 0 -> 1, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1c( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.707735062s) [1,3,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.195800781s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.661364555s) [0,5,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.149658203s@ mbc={}] start_peering_interval up [3,5,1] -> [0,5,1], acting [3,5,1] -> [0,5,1], acting_primary 3 -> 0, up_primary 3 -> 0, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1e( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.661241531s) [0,5,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.149658203s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1e( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.709239006s) [4,5,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.197021484s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1d( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.707995415s) [3,4,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active pruub 1138.196899414s@ mbc={}] start_peering_interval up [0,5,1] -> [3,4,5], acting [0,5,1] -> [3,4,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[7.b( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44 pruub=9.784082413s) [2,1,3] r=-1 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1132.271484375s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.660345078s) [4,5,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active pruub 1136.149414062s@ mbc={}] start_peering_interval up [3,5,1] -> [4,5,3], acting [3,5,1] -> [4,5,3], acting_primary 3 -> 4, up_primary 3 -> 4, role 1 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[6.1d( empty local-lis/les=40/41 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44 pruub=15.707566261s) [3,4,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1138.196899414s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:32 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[4.1f( empty local-lis/les=38/39 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44 pruub=13.660030365s) [4,5,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown NOTIFY pruub 1136.149414062s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.15( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,3,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.12( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,2,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.1( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,2,0] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.6( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,3,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.1d( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,5,0] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.1b( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.8( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,0,5] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.19( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,3,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.1c( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,3,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.b( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.8( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,2,3] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.11( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [4,5,0] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.1b( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,2,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.13( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,0,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.9( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,0,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.10( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [4,0,5] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.d( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [4,5,0] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.17( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,1,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.17( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,0,2] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.b( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,5,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.9( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [3,5,4] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.2( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.c( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.6( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [3,1,5] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.1( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [3,4,5] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.4( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.b( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,2,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.10( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,1,2] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.6( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,5,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.1e( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,5] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.16( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.1e( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [3,1,5] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.12( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,1,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.7( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,1,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.4( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,1,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.14( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.1a( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,5,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.4( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,3,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.18( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,2,4] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.18( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,1,5] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.b( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,2,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.b( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,5,0] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[5.1b( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,0,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.11( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,1,2] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.12( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,5,3] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[2.8( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [2,1,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.19( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.17( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,2,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[4.e( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,4,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.1f( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,4] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[3.3( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,0,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[6.15( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.3( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [0,5,1] r=1 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[5.8( empty local-lis/les=0/0 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [1,0,5] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[4.10( empty local-lis/les=0/0 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [3,4,2] r=2 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.12( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 44 pg[6.13( empty local-lis/les=0/0 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [3,4,2] r=2 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.17( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [0,5,1] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[6.d( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.c( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,0,5] r=2 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[5.1( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[7.1( v 34'39 (0'0,34'39] local-lis/les=44/45 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[2.14( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [2,4,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[3.14( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,4,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[5.10( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.d( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,5,3] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[3.e( empty local-lis/les=0/0 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [1,5,0] r=1 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 44 pg[2.13( empty local-lis/les=0/0 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [1,5,3] r=1 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[2.11( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,3,4] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[7.7( v 34'39 lc 34'21 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(2+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[7.d( v 34'39 lc 34'13 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(2+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[7.3( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=44/45 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=34'39 mlcod 0'0 active+degraded m=2 mbc={255={(2+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[4.1a( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[3.5( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[3.10( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,1,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[5.16( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[5.15( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[2.17( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,1,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[2.1a( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [2,4,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[4.1b( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[4.18( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[3.a( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[2.7( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,1,3] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[5.7( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[6.7( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[5.f( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,1,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[3.d( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,1,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[6.e( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[6.2( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[5.18( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,1,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[3.16( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,4] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[3.c( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[2.3( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,3,1] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[3.1c( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[4.c( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[7.5( v 34'39 lc 34'11 (0'0,34'39] local-lis/les=44/45 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active+degraded m=2 mbc={255={(2+1)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[7.f( v 34'39 lc 34'1 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active+degraded m=3 mbc={255={(2+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[6.19( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,3,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[6.8( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,1,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[7.9( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[3.f( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,1,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[4.a( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,0,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[4.13( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,1,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[3.13( empty local-lis/les=44/45 n=0 ec=38/20 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,3,1] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[7.b( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=44) [2,1,3] r=0 lpr=44 pi=[42,44)/1 crt=34'39 mlcod 0'0 active+degraded m=1 mbc={255={(2+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[5.1f( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,3] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[4.d( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [2,1,3] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[5.11( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 45 pg[5.1c( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [2,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[2.2( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,0,4] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[2.16( empty local-lis/les=44/45 n=0 ec=36/19 lis/c=36/36 les/c/f=37/37/0 sis=44) [5,1,0] r=0 lpr=44 pi=[36,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[5.9( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[6.a( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,0,4] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[4.5( empty local-lis/les=44/45 n=0 ec=38/22 lis/c=38/38 les/c/f=39/39/0 sis=44) [5,1,0] r=0 lpr=44 pi=[38,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[5.2( empty local-lis/les=44/45 n=0 ec=40/24 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,0,1] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[6.5( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,1,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[6.1a( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:33 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 45 pg[6.3( empty local-lis/les=44/45 n=0 ec=40/30 lis/c=40/40 les/c/f=41/41/0 sis=44) [5,4,0] r=0 lpr=44 pi=[40,44)/1 crt=0'0 mlcod 0'0 active mbc={}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:35 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 46 pg[7.6( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.220208168s) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1140.271850586s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:35 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 46 pg[7.a( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.219919205s) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1140.271606445s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:35 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 46 pg[7.6( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.220129013s) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1140.271850586s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:35 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 46 pg[7.a( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.219849586s) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1140.271606445s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:35 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 46 pg[7.2( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.216197968s) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1140.268188477s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:35 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 46 pg[7.2( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.216075897s) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1140.268188477s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:35 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 46 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.215479851s) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1140.268188477s@ mbc={}] start_peering_interval up [5,1,3] -> [3,1,5], acting [5,1,3] -> [3,1,5], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:35 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 46 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=46 pruub=15.215414047s) [3,1,5] r=2 lpr=46 pi=[42,46)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1140.268188477s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:36 np0005604215.localdomain sudo[55934]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irqgfjfjljicfcojhdnrvfgkbdfbbmak ; /usr/bin/python3
Feb 01 07:59:36 np0005604215.localdomain sudo[55934]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:59:36 np0005604215.localdomain python3[55936]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:59:36 np0005604215.localdomain sudo[55934]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 48 pg[7.3( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=44/45 n=2 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.399884224s) [3,2,4] r=1 lpr=48 pi=[42,48)/2 crt=34'39 mlcod 0'0 active pruub 1144.015136719s@ m=2 mbc={255={(2+1)=2}}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 48 pg[7.3( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=44/45 n=2 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.399776459s) [3,2,4] r=1 lpr=48 pi=[42,48)/2 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1144.015136719s@ m=2 mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 48 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/46/0 sis=48 pruub=12.399454117s) [3,2,4] r=1 lpr=48 pi=[44,48)/1 crt=34'39 mlcod 0'0 active pruub 1144.015014648s@ mbc={255={}}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 48 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/46/0 sis=48 pruub=12.399329185s) [3,2,4] r=1 lpr=48 pi=[44,48)/1 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1144.015014648s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 48 pg[7.f( v 34'39 lc 34'1 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.401145935s) [3,2,4] r=1 lpr=48 pi=[42,48)/2 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1144.017211914s@ m=3 mbc={255={(2+1)=3}}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 48 pg[7.f( v 34'39 lc 34'1 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.401041031s) [3,2,4] r=1 lpr=48 pi=[42,48)/2 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1144.017211914s@ m=3 mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 48 pg[7.b( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.402244568s) [3,2,4] r=1 lpr=48 pi=[42,48)/2 crt=34'39 mlcod 0'0 active pruub 1144.018554688s@ m=1 mbc={255={(2+1)=1}}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 48 pg[7.b( v 34'39 lc 0'0 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/42 les/c/f=45/43/0 sis=48 pruub=12.402137756s) [3,2,4] r=1 lpr=48 pi=[42,48)/2 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1144.018554688s@ m=1 mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 48 pg[7.b( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=48) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 48 pg[7.b( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=48) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 48 pg[7.7( v 34'39 (0'0,34'39] lb MIN local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=48) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 48 pg[7.7( v 34'39 (0'0,34'39] lb MIN local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=48) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 48 pg[7.3( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=48) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 48 pg[7.3( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=48) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 48 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=48) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active mbc={}] start_peering_interval up [2,1,3] -> [3,2,4], acting [2,1,3] -> [3,2,4], acting_primary 2 -> 3, up_primary 2 -> 3, role -1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 48 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=48) [3,2,4] r=-1 lpr=48 pi=[42,48)/2 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:37 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 7.4 scrub starts
Feb 01 07:59:38 np0005604215.localdomain sudo[55950]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrooomwzevfxyutedexwhxseddrkbovl ; /usr/bin/python3
Feb 01 07:59:38 np0005604215.localdomain sudo[55950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:59:38 np0005604215.localdomain python3[55952]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:59:38 np0005604215.localdomain sudo[55950]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:39 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 7.c deep-scrub starts
Feb 01 07:59:40 np0005604215.localdomain sudo[55966]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zvyktuoxidctedocynbuaehczkpczogv ; /usr/bin/python3
Feb 01 07:59:40 np0005604215.localdomain sudo[55966]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:59:40 np0005604215.localdomain python3[55968]: ansible-file Invoked with path=/var/lib/tripleo-config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:59:40 np0005604215.localdomain sudo[55966]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:40 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 7.8 scrub starts
Feb 01 07:59:41 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.5 scrub starts
Feb 01 07:59:41 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.5 scrub ok
Feb 01 07:59:42 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 4.c scrub starts
Feb 01 07:59:43 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 50 pg[7.4( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.010475159s) [0,5,4] r=1 lpr=50 pi=[42,50)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1148.268310547s@ mbc={}] start_peering_interval up [5,1,3] -> [0,5,4], acting [5,1,3] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:43 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 50 pg[7.4( v 34'39 (0'0,34'39] local-lis/les=42/43 n=2 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.010384560s) [0,5,4] r=1 lpr=50 pi=[42,50)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1148.268310547s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:43 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 50 pg[7.c( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.013626099s) [0,5,4] r=1 lpr=50 pi=[42,50)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1148.271850586s@ TIME_FOR_DEEP mbc={}] start_peering_interval up [5,1,3] -> [0,5,4], acting [5,1,3] -> [0,5,4], acting_primary 5 -> 0, up_primary 5 -> 0, role 0 -> 1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:43 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 50 pg[7.c( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=50 pruub=15.013566017s) [0,5,4] r=1 lpr=50 pi=[42,50)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1148.271850586s@ TIME_FOR_DEEP mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:43 np0005604215.localdomain sudo[56014]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwhsigtihrvdctxssqnryipnjvwzsrtn ; /usr/bin/python3
Feb 01 07:59:43 np0005604215.localdomain sudo[56014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:59:43 np0005604215.localdomain python3[56016]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:59:43 np0005604215.localdomain sudo[56014]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:43 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.1a scrub starts
Feb 01 07:59:44 np0005604215.localdomain sudo[56057]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vzwqzqmajvgytyobphksygsupanrcepd ; /usr/bin/python3
Feb 01 07:59:44 np0005604215.localdomain sudo[56057]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:59:44 np0005604215.localdomain python3[56059]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932783.5295072-91478-238905355061275/source dest=/var/lib/tripleo-config/ceph/ceph.client.openstack.keyring mode=600 _original_basename=ceph.client.openstack.keyring follow=False checksum=814f759dcc97f4b50c85badaa6f3819c2533c70a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:59:44 np0005604215.localdomain sudo[56057]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:44 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.19 deep-scrub starts
Feb 01 07:59:45 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 52 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/49/0 sis=52 pruub=12.211205482s) [4,0,2] r=2 lpr=52 pi=[44,52)/1 crt=34'39 mlcod 0'0 active pruub 1152.015258789s@ mbc={255={}}] start_peering_interval up [2,1,3] -> [4,0,2], acting [2,1,3] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:45 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 52 pg[7.5( v 34'39 (0'0,34'39] local-lis/les=44/45 n=2 ec=42/32 lis/c=44/44 les/c/f=45/48/0 sis=52 pruub=12.213039398s) [4,0,2] r=2 lpr=52 pi=[44,52)/1 crt=34'39 mlcod 0'0 active pruub 1152.017211914s@ mbc={255={}}] start_peering_interval up [2,1,3] -> [4,0,2], acting [2,1,3] -> [4,0,2], acting_primary 2 -> 4, up_primary 2 -> 4, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:45 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 52 pg[7.5( v 34'39 (0'0,34'39] local-lis/les=44/45 n=2 ec=42/32 lis/c=44/44 les/c/f=45/48/0 sis=52 pruub=12.212736130s) [4,0,2] r=2 lpr=52 pi=[44,52)/1 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1152.017211914s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:45 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 52 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/49/0 sis=52 pruub=12.210746765s) [4,0,2] r=2 lpr=52 pi=[44,52)/1 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1152.015258789s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:46 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Feb 01 07:59:47 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 3.f scrub starts
Feb 01 07:59:47 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 3.f scrub ok
Feb 01 07:59:47 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 54 pg[7.6( v 34'39 (0'0,34'39] local-lis/les=46/47 n=2 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=12.748791695s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1150.093139648s@ mbc={}] start_peering_interval up [3,1,5] -> [0,2,4], acting [3,1,5] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:47 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 54 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=46/47 n=1 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=12.748353958s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1150.093139648s@ mbc={}] start_peering_interval up [3,1,5] -> [0,2,4], acting [3,1,5] -> [0,2,4], acting_primary 3 -> 0, up_primary 3 -> 0, role 2 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:47 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 54 pg[7.6( v 34'39 (0'0,34'39] local-lis/les=46/47 n=2 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=12.748511314s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1150.093139648s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:47 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 54 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=46/47 n=1 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54 pruub=12.748250008s) [0,2,4] r=-1 lpr=54 pi=[46,54)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1150.093139648s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:48 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 54 pg[7.e( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=1 lpr=54 pi=[46,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:48 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 54 pg[7.6( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=54) [0,2,4] r=1 lpr=54 pi=[46,54)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:48 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 55 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=48/49 n=1 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55 pruub=13.773110390s) [2,1,3] r=0 lpr=55 pi=[48,55)/1 luod=0'0 crt=34'39 mlcod 0'0 active pruub 1156.685668945s@ mbc={}] start_peering_interval up [3,2,4] -> [2,1,3], acting [3,2,4] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:48 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 55 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=48/49 n=1 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55 pruub=13.772881508s) [2,1,3] r=0 lpr=55 pi=[48,55)/1 luod=0'0 crt=34'39 mlcod 0'0 active pruub 1156.685668945s@ mbc={}] start_peering_interval up [3,2,4] -> [2,1,3], acting [3,2,4] -> [2,1,3], acting_primary 3 -> 2, up_primary 3 -> 2, role 1 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:48 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 55 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=48/49 n=1 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55 pruub=13.773110390s) [2,1,3] r=0 lpr=55 pi=[48,55)/1 crt=34'39 mlcod 0'0 unknown pruub 1156.685668945s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:48 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 55 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=48/49 n=1 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55 pruub=13.772881508s) [2,1,3] r=0 lpr=55 pi=[48,55)/1 crt=34'39 mlcod 0'0 unknown pruub 1156.685668945s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:48 np0005604215.localdomain sudo[56119]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gphtmkwxnutuqatbwmusfoohhfzqurin ; /usr/bin/python3
Feb 01 07:59:48 np0005604215.localdomain sudo[56119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:59:48 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.e deep-scrub starts
Feb 01 07:59:49 np0005604215.localdomain python3[56121]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:59:49 np0005604215.localdomain sudo[56119]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:49 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.e deep-scrub ok
Feb 01 07:59:49 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 3.14 scrub starts
Feb 01 07:59:49 np0005604215.localdomain sudo[56162]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlelkrhkdtbyullnfmveidmuoyemthog ; /usr/bin/python3
Feb 01 07:59:49 np0005604215.localdomain sudo[56162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:59:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 07:59:49 np0005604215.localdomain podman[56165]: 2026-02-01 07:59:49.348539374 +0000 UTC m=+0.089585690 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public)
Feb 01 07:59:49 np0005604215.localdomain python3[56164]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932788.7011445-91478-207462224252359/source dest=/var/lib/tripleo-config/ceph/ceph.client.manila.keyring mode=600 _original_basename=ceph.client.manila.keyring follow=False checksum=9a0c41ba35379304dc7e57883346ea3531963e9b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:59:49 np0005604215.localdomain sudo[56162]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:49 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 3.14 scrub ok
Feb 01 07:59:49 np0005604215.localdomain podman[56165]: 2026-02-01 07:59:49.563941953 +0000 UTC m=+0.304988199 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 07:59:49 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 07:59:49 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 56 pg[7.7( v 34'39 (0'0,34'39] local-lis/les=55/56 n=1 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55) [2,1,3] r=0 lpr=55 pi=[48,55)/1 crt=34'39 mlcod 0'0 active+degraded mbc={255={(2+1)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:49 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 56 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=55/56 n=1 ec=42/32 lis/c=48/48 les/c/f=49/49/0 sis=55) [2,1,3] r=0 lpr=55 pi=[48,55)/1 crt=34'39 mlcod 0'0 active+degraded mbc={255={(2+1)=3}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:49 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.2 scrub starts
Feb 01 07:59:50 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.2 scrub ok
Feb 01 07:59:50 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 57 pg[7.8( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=57 pruub=15.852387428s) [3,2,1] r=-1 lpr=57 pi=[42,57)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1156.272338867s@ mbc={}] start_peering_interval up [5,1,3] -> [3,2,1], acting [5,1,3] -> [3,2,1], acting_primary 5 -> 3, up_primary 5 -> 3, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:50 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 57 pg[7.8( v 34'39 (0'0,34'39] local-lis/les=42/43 n=1 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=57 pruub=15.852313042s) [3,2,1] r=-1 lpr=57 pi=[42,57)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1156.272338867s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:50 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.7 deep-scrub starts
Feb 01 07:59:50 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.7 deep-scrub ok
Feb 01 07:59:51 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.1c scrub starts
Feb 01 07:59:51 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.1c scrub ok
Feb 01 07:59:51 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 57 pg[7.8( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=42/42 les/c/f=43/43/0 sis=57) [3,2,1] r=1 lpr=57 pi=[42,57)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:52 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.1a scrub starts
Feb 01 07:59:52 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.1a scrub ok
Feb 01 07:59:53 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 59 pg[7.9( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/45/0 sis=59 pruub=12.394413948s) [0,4,2] r=2 lpr=59 pi=[44,59)/1 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1160.018554688s@ mbc={}] start_peering_interval up [2,1,3] -> [0,4,2], acting [2,1,3] -> [0,4,2], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:53 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 59 pg[7.9( v 34'39 (0'0,34'39] local-lis/les=44/45 n=1 ec=42/32 lis/c=44/44 les/c/f=45/45/0 sis=59 pruub=12.394232750s) [0,4,2] r=2 lpr=59 pi=[44,59)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1160.018554688s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:54 np0005604215.localdomain sudo[56251]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjychwcernvfiyppptxczipxcchbpfzi ; /usr/bin/python3
Feb 01 07:59:54 np0005604215.localdomain sudo[56251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:59:54 np0005604215.localdomain python3[56253]: ansible-ansible.legacy.stat Invoked with path=/var/lib/tripleo-config/ceph/ceph.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:59:54 np0005604215.localdomain sudo[56251]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:54 np0005604215.localdomain sudo[56294]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enpjtoxvgjlaywxizgszecoqwmxwnydn ; /usr/bin/python3
Feb 01 07:59:54 np0005604215.localdomain sudo[56294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:59:54 np0005604215.localdomain python3[56296]: ansible-ansible.legacy.copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932793.9390342-91478-104316381665708/source dest=/var/lib/tripleo-config/ceph/ceph.conf mode=644 _original_basename=ceph.conf follow=False checksum=c332e57191fea146df898938173f766e25b9bcd9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:59:54 np0005604215.localdomain sudo[56294]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:55 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 61 pg[7.a( v 34'39 (0'0,34'39] local-lis/les=46/47 n=1 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=61 pruub=12.825468063s) [4,0,5] r=2 lpr=61 pi=[46,61)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1158.093627930s@ mbc={}] start_peering_interval up [3,1,5] -> [4,0,5], acting [3,1,5] -> [4,0,5], acting_primary 3 -> 4, up_primary 3 -> 4, role 2 -> 2, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:55 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 61 pg[7.a( v 34'39 (0'0,34'39] local-lis/les=46/47 n=1 ec=42/32 lis/c=46/46 les/c/f=47/47/0 sis=61 pruub=12.825372696s) [4,0,5] r=2 lpr=61 pi=[46,61)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1158.093627930s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:55 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 4.5 scrub starts
Feb 01 07:59:55 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 4.5 scrub ok
Feb 01 07:59:56 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.a scrub starts
Feb 01 07:59:56 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.a scrub ok
Feb 01 07:59:57 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.7 scrub starts
Feb 01 07:59:58 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.7 scrub ok
Feb 01 07:59:58 np0005604215.localdomain sudo[56356]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gegokolkiftliytzzrmqgyhndpbihnvj ; /usr/bin/python3
Feb 01 07:59:58 np0005604215.localdomain sudo[56356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:59:58 np0005604215.localdomain python3[56358]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 07:59:58 np0005604215.localdomain sudo[56356]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:58 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 64 pg[7.c( v 34'39 (0'0,34'39] local-lis/les=50/51 n=1 ec=42/32 lis/c=50/50 les/c/f=51/51/0 sis=64 pruub=9.910997391s) [2,3,4] r=-1 lpr=64 pi=[50,64)/1 luod=0'0 crt=34'39 lcod 0'0 mlcod 0'0 active pruub 1158.308227539s@ TIME_FOR_DEEP mbc={}] start_peering_interval up [0,5,4] -> [2,3,4], acting [0,5,4] -> [2,3,4], acting_primary 0 -> 2, up_primary 0 -> 2, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 07:59:58 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 64 pg[7.c( v 34'39 (0'0,34'39] local-lis/les=50/51 n=1 ec=42/32 lis/c=50/50 les/c/f=51/51/0 sis=64 pruub=9.910633087s) [2,3,4] r=-1 lpr=64 pi=[50,64)/1 crt=34'39 lcod 0'0 mlcod 0'0 unknown NOTIFY pruub 1158.308227539s@ TIME_FOR_DEEP mbc={}] state<Start>: transitioning to Stray
Feb 01 07:59:58 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 64 pg[7.c( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=50/50 les/c/f=51/51/0 sis=64) [2,3,4] r=0 lpr=64 pi=[50,64)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Primary
Feb 01 07:59:58 np0005604215.localdomain sudo[56401]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbmtkhwnjsxcblmypveodsmrukttwtnq ; /usr/bin/python3
Feb 01 07:59:58 np0005604215.localdomain sudo[56401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 07:59:59 np0005604215.localdomain python3[56403]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932798.294558-91839-218751065619437/source _original_basename=tmpq3joktrv follow=False checksum=f17091ee142621a3c8290c8c96b5b52d67b3a864 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 07:59:59 np0005604215.localdomain sudo[56401]: pam_unix(sudo:session): session closed for user root
Feb 01 07:59:59 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.1b scrub starts
Feb 01 07:59:59 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.1b scrub ok
Feb 01 07:59:59 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 65 pg[7.c( v 34'39 lc 34'17 (0'0,34'39] local-lis/les=64/65 n=1 ec=42/32 lis/c=50/50 les/c/f=51/51/0 sis=64) [2,3,4] r=0 lpr=64 pi=[50,64)/1 crt=34'39 lcod 0'0 mlcod 0'0 active+degraded m=1 mbc={255={(1+2)=1}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 07:59:59 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.9 scrub starts
Feb 01 08:00:00 np0005604215.localdomain sudo[56463]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dubgakgwzjoyzjazldfuqjhvojywgdpy ; /usr/bin/python3
Feb 01 08:00:00 np0005604215.localdomain sudo[56463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:00 np0005604215.localdomain python3[56465]: ansible-ansible.legacy.stat Invoked with path=/usr/local/sbin/containers-tmpwatch follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:00:00 np0005604215.localdomain sudo[56463]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:00 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.9 scrub ok
Feb 01 08:00:00 np0005604215.localdomain sudo[56506]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhuzbwpkvonhvglrzxddzitvbxuuquph ; /usr/bin/python3
Feb 01 08:00:00 np0005604215.localdomain sudo[56506]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:00 np0005604215.localdomain python3[56508]: ansible-ansible.legacy.copy Invoked with dest=/usr/local/sbin/containers-tmpwatch group=root mode=493 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932799.8762736-91927-236625377719175/source _original_basename=tmph1rbryab follow=False checksum=84397b037dad9813fed388c4bcdd4871f384cd22 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:00:00 np0005604215.localdomain sudo[56506]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:00 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 66 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=52/53 n=1 ec=42/32 lis/c=52/52 les/c/f=53/53/0 sis=66 pruub=9.914355278s) [2,3,1] r=0 lpr=66 pi=[52,66)/1 luod=0'0 crt=34'39 mlcod 0'0 active pruub 1164.864868164s@ mbc={}] start_peering_interval up [4,0,2] -> [2,3,1], acting [4,0,2] -> [2,3,1], acting_primary 4 -> 2, up_primary 4 -> 2, role 2 -> 0, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 08:00:00 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 66 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=52/53 n=1 ec=42/32 lis/c=52/52 les/c/f=53/53/0 sis=66 pruub=9.914355278s) [2,3,1] r=0 lpr=66 pi=[52,66)/1 crt=34'39 mlcod 0'0 unknown pruub 1164.864868164s@ mbc={}] state<Start>: transitioning to Primary
Feb 01 08:00:00 np0005604215.localdomain sudo[56536]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxccaolnvuoxeriqfzrhhujbbwqzvtdg ; /usr/bin/python3
Feb 01 08:00:00 np0005604215.localdomain sudo[56536]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:01 np0005604215.localdomain python3[56538]: ansible-cron Invoked with job=/usr/local/sbin/containers-tmpwatch name=Remove old logs special_time=daily user=root state=present backup=False minute=* hour=* day=* month=* weekday=* disabled=False env=False cron_file=None insertafter=None insertbefore=None
Feb 01 08:00:01 np0005604215.localdomain crontab[56539]: (root) LIST (root)
Feb 01 08:00:01 np0005604215.localdomain crontab[56540]: (root) REPLACE (root)
Feb 01 08:00:01 np0005604215.localdomain sudo[56536]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:01 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 3.3 deep-scrub starts
Feb 01 08:00:01 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 3.3 deep-scrub ok
Feb 01 08:00:01 np0005604215.localdomain sudo[56554]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncpckumdvoznrtalrmnvptkbtevipjku ; /usr/bin/python3
Feb 01 08:00:01 np0005604215.localdomain sudo[56554]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:01 np0005604215.localdomain python3[56556]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_2 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:00:01 np0005604215.localdomain sudo[56554]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:01 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.7 scrub starts
Feb 01 08:00:01 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 67 pg[7.d( v 34'39 (0'0,34'39] local-lis/les=66/67 n=1 ec=42/32 lis/c=52/52 les/c/f=53/53/0 sis=66) [2,3,1] r=0 lpr=66 pi=[52,66)/1 crt=34'39 mlcod 0'0 active+degraded mbc={255={(1+2)=2}}] state<Started/Primary/Active>: react AllReplicasActivated Activating complete
Feb 01 08:00:01 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.7 scrub ok
Feb 01 08:00:01 np0005604215.localdomain sudo[56604]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dyftkanxysfqyhogfxmdnqibedebzpfq ; /usr/bin/python3
Feb 01 08:00:01 np0005604215.localdomain sudo[56604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:02 np0005604215.localdomain sudo[56604]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:02 np0005604215.localdomain sudo[56622]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzadqgmdvgxwzfdoxyajrcsllwcfsswt ; /usr/bin/python3
Feb 01 08:00:02 np0005604215.localdomain sudo[56622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:02 np0005604215.localdomain sudo[56622]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:02 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.2 scrub starts
Feb 01 08:00:02 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.2 scrub ok
Feb 01 08:00:02 np0005604215.localdomain sudo[56726]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zqexlwyzzzkavpvovbbtxubswzblgknn ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932802.4000807-92012-270336552753347/async_wrapper.py 185955943630 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932802.4000807-92012-270336552753347/AnsiballZ_command.py _
Feb 01 08:00:02 np0005604215.localdomain sudo[56726]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 01 08:00:02 np0005604215.localdomain ansible-async_wrapper.py[56728]: Invoked with 185955943630 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769932802.4000807-92012-270336552753347/AnsiballZ_command.py _
Feb 01 08:00:02 np0005604215.localdomain ansible-async_wrapper.py[56731]: Starting module and watcher
Feb 01 08:00:02 np0005604215.localdomain ansible-async_wrapper.py[56731]: Start watching 56732 (3600)
Feb 01 08:00:02 np0005604215.localdomain ansible-async_wrapper.py[56732]: Start module (56732)
Feb 01 08:00:02 np0005604215.localdomain ansible-async_wrapper.py[56728]: Return async_wrapper task started.
Feb 01 08:00:02 np0005604215.localdomain sudo[56726]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:03 np0005604215.localdomain sudo[56747]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqxovexljwccpoajdldomauafldzjdos ; /usr/bin/python3
Feb 01 08:00:03 np0005604215.localdomain sudo[56747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:03 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 68 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=54/55 n=1 ec=42/32 lis/c=54/54 les/c/f=55/55/0 sis=68 pruub=9.451250076s) [3,1,5] r=-1 lpr=68 pi=[54,68)/1 luod=0'0 crt=34'39 mlcod 0'0 active pruub 1166.962158203s@ mbc={}] start_peering_interval up [0,2,4] -> [3,1,5], acting [0,2,4] -> [3,1,5], acting_primary 0 -> 3, up_primary 0 -> 3, role 1 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 08:00:03 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 68 pg[7.e( v 34'39 (0'0,34'39] local-lis/les=54/55 n=1 ec=42/32 lis/c=54/54 les/c/f=55/55/0 sis=68 pruub=9.451083183s) [3,1,5] r=-1 lpr=68 pi=[54,68)/1 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1166.962158203s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 08:00:03 np0005604215.localdomain python3[56749]: ansible-ansible.legacy.async_status Invoked with jid=185955943630.56728 mode=status _async_dir=/tmp/.ansible_async
Feb 01 08:00:03 np0005604215.localdomain sudo[56747]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:03 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.3 scrub starts
Feb 01 08:00:03 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.3 scrub ok
Feb 01 08:00:04 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.8 scrub starts
Feb 01 08:00:04 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 68 pg[7.e( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=54/54 les/c/f=55/55/0 sis=68) [3,1,5] r=2 lpr=68 pi=[54,68)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 08:00:04 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 2.8 scrub ok
Feb 01 08:00:05 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 70 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=55/56 n=1 ec=42/32 lis/c=55/55 les/c/f=56/56/0 sis=70 pruub=8.112198830s) [0,4,5] r=-1 lpr=70 pi=[55,70)/1 crt=34'39 mlcod 0'0 active pruub 1167.941162109s@ mbc={255={}}] start_peering_interval up [2,1,3] -> [0,4,5], acting [2,1,3] -> [0,4,5], acting_primary 2 -> 0, up_primary 2 -> 0, role 0 -> -1, features acting 4540138322906710015 upacting 4540138322906710015
Feb 01 08:00:05 np0005604215.localdomain ceph-osd[31357]: osd.2 pg_epoch: 70 pg[7.f( v 34'39 (0'0,34'39] local-lis/les=55/56 n=1 ec=42/32 lis/c=55/55 les/c/f=56/56/0 sis=70 pruub=8.112115860s) [0,4,5] r=-1 lpr=70 pi=[55,70)/1 crt=34'39 mlcod 0'0 unknown NOTIFY pruub 1167.941162109s@ mbc={}] state<Start>: transitioning to Stray
Feb 01 08:00:06 np0005604215.localdomain ceph-osd[32318]: osd.5 pg_epoch: 70 pg[7.f( empty local-lis/les=0/0 n=0 ec=42/32 lis/c=55/55 les/c/f=56/56/0 sis=70) [0,4,5] r=2 lpr=70 pi=[55,70)/1 crt=0'0 mlcod 0'0 unknown mbc={}] state<Start>: transitioning to Stray
Feb 01 08:00:06 np0005604215.localdomain puppet-user[56752]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 08:00:06 np0005604215.localdomain puppet-user[56752]:    (file: /etc/puppet/hiera.yaml)
Feb 01 08:00:06 np0005604215.localdomain puppet-user[56752]: Warning: Undefined variable '::deploy_config_name';
Feb 01 08:00:06 np0005604215.localdomain puppet-user[56752]:    (file & line not available)
Feb 01 08:00:06 np0005604215.localdomain puppet-user[56752]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 08:00:06 np0005604215.localdomain puppet-user[56752]:    (file & line not available)
Feb 01 08:00:06 np0005604215.localdomain puppet-user[56752]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 01 08:00:06 np0005604215.localdomain puppet-user[56752]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 01 08:00:06 np0005604215.localdomain puppet-user[56752]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.11 seconds
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]: Notice: Applied catalog in 0.03 seconds
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]: Application:
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:    Initial environment: production
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:    Converged environment: production
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:          Run mode: user
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]: Changes:
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]: Events:
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]: Resources:
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:             Total: 10
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]: Time:
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:          Schedule: 0.00
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:              File: 0.00
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:              Exec: 0.01
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:            Augeas: 0.01
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:    Transaction evaluation: 0.03
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:    Catalog application: 0.03
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:    Config retrieval: 0.14
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:          Last run: 1769932807
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:        Filebucket: 0.00
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:             Total: 0.04
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]: Version:
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:            Config: 1769932806
Feb 01 08:00:07 np0005604215.localdomain puppet-user[56752]:            Puppet: 7.10.0
Feb 01 08:00:07 np0005604215.localdomain ansible-async_wrapper.py[56732]: Module complete (56732)
Feb 01 08:00:07 np0005604215.localdomain ansible-async_wrapper.py[56731]: Done in kid B.
Feb 01 08:00:08 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 3.13 scrub starts
Feb 01 08:00:08 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 3.13 scrub ok
Feb 01 08:00:08 np0005604215.localdomain sudo[56863]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:00:08 np0005604215.localdomain sudo[56863]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:00:08 np0005604215.localdomain sudo[56863]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:08 np0005604215.localdomain sudo[56878]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:00:08 np0005604215.localdomain sudo[56878]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:00:09 np0005604215.localdomain sudo[56878]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:09 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.11 deep-scrub starts
Feb 01 08:00:09 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.11 deep-scrub ok
Feb 01 08:00:10 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.11 scrub starts
Feb 01 08:00:10 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.11 scrub ok
Feb 01 08:00:10 np0005604215.localdomain sudo[56925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:00:10 np0005604215.localdomain sudo[56925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:00:10 np0005604215.localdomain sudo[56925]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:10 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.17 deep-scrub starts
Feb 01 08:00:10 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.17 deep-scrub ok
Feb 01 08:00:11 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.16 scrub starts
Feb 01 08:00:11 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 2.16 scrub ok
Feb 01 08:00:13 np0005604215.localdomain sudo[56953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-phjnmtythfcwlpxfvpfhtlbdwyuwqiev ; /usr/bin/python3
Feb 01 08:00:13 np0005604215.localdomain sudo[56953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:13 np0005604215.localdomain python3[56955]: ansible-ansible.legacy.async_status Invoked with jid=185955943630.56728 mode=status _async_dir=/tmp/.ansible_async
Feb 01 08:00:13 np0005604215.localdomain sudo[56953]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:14 np0005604215.localdomain sudo[56969]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqljdkwoypstfjfancucxaertaarrbdl ; /usr/bin/python3
Feb 01 08:00:14 np0005604215.localdomain sudo[56969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:14 np0005604215.localdomain python3[56971]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 08:00:14 np0005604215.localdomain sudo[56969]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:14 np0005604215.localdomain sudo[56985]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhwmscbpbwaqbhysbgbjlwgvfjbemdhs ; /usr/bin/python3
Feb 01 08:00:14 np0005604215.localdomain sudo[56985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:14 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.2 scrub starts
Feb 01 08:00:14 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.2 scrub ok
Feb 01 08:00:14 np0005604215.localdomain python3[56987]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:00:14 np0005604215.localdomain sudo[56985]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:14 np0005604215.localdomain sudo[57035]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wliazvxfaqgawmlexbrlrogazgbzwgql ; /usr/bin/python3
Feb 01 08:00:15 np0005604215.localdomain sudo[57035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:15 np0005604215.localdomain python3[57037]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:00:15 np0005604215.localdomain sudo[57035]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:15 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 3.16 scrub starts
Feb 01 08:00:15 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 3.16 scrub ok
Feb 01 08:00:15 np0005604215.localdomain sudo[57053]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kqumfvdpcvxgdutrlreccaguktecuaop ; /usr/bin/python3
Feb 01 08:00:15 np0005604215.localdomain sudo[57053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:15 np0005604215.localdomain python3[57055]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpdlejn6da recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 08:00:15 np0005604215.localdomain sudo[57053]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:15 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.f scrub starts
Feb 01 08:00:15 np0005604215.localdomain sudo[57083]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqnlzlsmnvlpcnbqppcqqvtwfstkuecr ; /usr/bin/python3
Feb 01 08:00:15 np0005604215.localdomain sudo[57083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:15 np0005604215.localdomain python3[57085]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:00:15 np0005604215.localdomain sudo[57083]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:15 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.f scrub ok
Feb 01 08:00:15 np0005604215.localdomain sudo[57099]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtsyzxllblzatisyzdtkszzfbxeuqasy ; /usr/bin/python3
Feb 01 08:00:15 np0005604215.localdomain sudo[57099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:16 np0005604215.localdomain sudo[57099]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:16 np0005604215.localdomain sudo[57187]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ifuwhvetbwiimjhfijkvkwvbjdqipimf ; /usr/bin/python3
Feb 01 08:00:16 np0005604215.localdomain sudo[57187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:16 np0005604215.localdomain python3[57189]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 01 08:00:16 np0005604215.localdomain sudo[57187]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:17 np0005604215.localdomain sudo[57206]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kupuismybndjbcqpdhwuatjecaqcmpol ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:17 np0005604215.localdomain sudo[57206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:17 np0005604215.localdomain python3[57208]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:00:17 np0005604215.localdomain sudo[57206]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:17 np0005604215.localdomain sudo[57222]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpfedwxzmrvdroddcjwqkwrsmkhctzcd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:17 np0005604215.localdomain sudo[57222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:17 np0005604215.localdomain sudo[57222]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:18 np0005604215.localdomain sudo[57238]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pzgsvnbfmhjdremnnscoxjbamzynxjxb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:18 np0005604215.localdomain sudo[57238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:18 np0005604215.localdomain python3[57240]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:00:18 np0005604215.localdomain sudo[57238]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:18 np0005604215.localdomain sudo[57288]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxwghgpognihahywuaacodavlqdariza ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:18 np0005604215.localdomain sudo[57288]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:19 np0005604215.localdomain python3[57290]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:00:19 np0005604215.localdomain sudo[57288]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:19 np0005604215.localdomain sudo[57306]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdbkjsfwvfjhpsikiognaivfspncqjcr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:19 np0005604215.localdomain sudo[57306]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:19 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.10 deep-scrub starts
Feb 01 08:00:19 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.10 deep-scrub ok
Feb 01 08:00:19 np0005604215.localdomain python3[57308]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:00:19 np0005604215.localdomain sudo[57306]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:19 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.1c scrub starts
Feb 01 08:00:19 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.1c scrub ok
Feb 01 08:00:19 np0005604215.localdomain sudo[57368]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsisptavjaxqyhppcchozuchegvzacpl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:00:19 np0005604215.localdomain sudo[57368]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:19 np0005604215.localdomain podman[57370]: 2026-02-01 08:00:19.744403745 +0000 UTC m=+0.081338570 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, architecture=x86_64, release=1766032510, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z)
Feb 01 08:00:19 np0005604215.localdomain python3[57371]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:00:19 np0005604215.localdomain sudo[57368]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:19 np0005604215.localdomain sudo[57414]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hjhmrxvknslmpzhnjutejqavxzeacprq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:19 np0005604215.localdomain sudo[57414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:19 np0005604215.localdomain podman[57370]: 2026-02-01 08:00:19.933757155 +0000 UTC m=+0.270692020 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T22:10:14Z, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:00:19 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:00:20 np0005604215.localdomain python3[57416]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:00:20 np0005604215.localdomain sudo[57414]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:20 np0005604215.localdomain sudo[57480]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqumpmaqahqddfuofrjmgdmgxdmbrxhp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:20 np0005604215.localdomain sudo[57480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:20 np0005604215.localdomain python3[57482]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:00:20 np0005604215.localdomain sudo[57480]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:20 np0005604215.localdomain sudo[57498]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ydodztqrjndqwopsguayssfoxppozmlu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:20 np0005604215.localdomain sudo[57498]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:20 np0005604215.localdomain python3[57500]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:00:20 np0005604215.localdomain sudo[57498]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:21 np0005604215.localdomain sudo[57560]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdympwvfgbputxilhbzjrrhoztfrumnx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:21 np0005604215.localdomain sudo[57560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:21 np0005604215.localdomain python3[57562]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:00:21 np0005604215.localdomain sudo[57560]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:21 np0005604215.localdomain sudo[57578]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uyswnknxrfiaqrruxxbhivqpwppyymvv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:21 np0005604215.localdomain sudo[57578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:21 np0005604215.localdomain python3[57580]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:00:21 np0005604215.localdomain sudo[57578]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:21 np0005604215.localdomain sudo[57608]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xvdiyrnfzbspcujrbysktqmramkcwjfu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:21 np0005604215.localdomain sudo[57608]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:22 np0005604215.localdomain python3[57610]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:00:22 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:00:22 np0005604215.localdomain systemd-sysv-generator[57640]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:00:22 np0005604215.localdomain systemd-rc-local-generator[57634]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:00:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:00:22 np0005604215.localdomain sudo[57608]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:22 np0005604215.localdomain sudo[57694]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-toshzbnddkswarcezeiyhwutqnjcnpcn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:22 np0005604215.localdomain sudo[57694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:23 np0005604215.localdomain python3[57696]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:00:23 np0005604215.localdomain sudo[57694]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:23 np0005604215.localdomain sudo[57712]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-libdvqyostznhysxioevwplwaisbekzf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:23 np0005604215.localdomain sudo[57712]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:23 np0005604215.localdomain python3[57714]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:00:23 np0005604215.localdomain sudo[57712]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:23 np0005604215.localdomain sudo[57774]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wmhfowcbycexknmzogvxtkthjtcbgmnf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:23 np0005604215.localdomain sudo[57774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:23 np0005604215.localdomain python3[57776]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:00:23 np0005604215.localdomain sudo[57774]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:23 np0005604215.localdomain sudo[57792]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gctuyrxsapftpquzgzjcmzhuwvtwdhdm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:23 np0005604215.localdomain sudo[57792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:24 np0005604215.localdomain python3[57794]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:00:24 np0005604215.localdomain sudo[57792]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:24 np0005604215.localdomain sudo[57822]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xooncokohzucmrzhskjwucrihjfviuln ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:24 np0005604215.localdomain sudo[57822]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:24 np0005604215.localdomain python3[57824]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:00:24 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:00:24 np0005604215.localdomain systemd-sysv-generator[57856]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:00:24 np0005604215.localdomain systemd-rc-local-generator[57852]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:00:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:00:24 np0005604215.localdomain systemd[1]: Starting Create netns directory...
Feb 01 08:00:24 np0005604215.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 01 08:00:24 np0005604215.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 01 08:00:24 np0005604215.localdomain systemd[1]: Finished Create netns directory.
Feb 01 08:00:24 np0005604215.localdomain sudo[57822]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:25 np0005604215.localdomain sudo[57881]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppnoztfjojzqfcgxgdqybpwrerpneqcu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:25 np0005604215.localdomain sudo[57881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:25 np0005604215.localdomain python3[57883]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 01 08:00:25 np0005604215.localdomain sudo[57881]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:25 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.16 deep-scrub starts
Feb 01 08:00:25 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.16 deep-scrub ok
Feb 01 08:00:25 np0005604215.localdomain sudo[57897]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mydnwvapasiabszpyhnepptpyxpkbvxu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:25 np0005604215.localdomain sudo[57897]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:26 np0005604215.localdomain sudo[57897]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:26 np0005604215.localdomain sudo[57938]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcnouopoogcgfryqmunsjfwzucxlankt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:00:26 np0005604215.localdomain sudo[57938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:27 np0005604215.localdomain python3[57940]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step2 config_dir=/var/lib/tripleo-config/container-startup-config/step_2 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 01 08:00:27 np0005604215.localdomain podman[58003]: 2026-02-01 08:00:27.333396107 +0000 UTC m=+0.080581997 container create 0605aa4760a862e958c0fb6713ff69acd745a7a6a9c31cccc8745e77505c3787 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtqemud_init_logs, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, build-date=2026-01-12T23:31:49Z, vcs-type=git, config_id=tripleo_step2, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:00:27 np0005604215.localdomain systemd[1]: Started libpod-conmon-0605aa4760a862e958c0fb6713ff69acd745a7a6a9c31cccc8745e77505c3787.scope.
Feb 01 08:00:27 np0005604215.localdomain podman[58003]: 2026-02-01 08:00:27.286592045 +0000 UTC m=+0.033777965 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:00:27 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:00:27 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0bc97c01fd5eabbf2d8e0d9991f11a9043512db93b5f6f0454866fe7414277f1/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Feb 01 08:00:27 np0005604215.localdomain podman[58003]: 2026-02-01 08:00:27.417350119 +0000 UTC m=+0.164535989 container init 0605aa4760a862e958c0fb6713ff69acd745a7a6a9c31cccc8745e77505c3787 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtqemud_init_logs, build-date=2026-01-12T23:31:49Z, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step2, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, maintainer=OpenStack TripleO Team)
Feb 01 08:00:27 np0005604215.localdomain podman[58003]: 2026-02-01 08:00:27.433017042 +0000 UTC m=+0.180202932 container start 0605aa4760a862e958c0fb6713ff69acd745a7a6a9c31cccc8745e77505c3787 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, managed_by=tripleo_ansible, config_id=tripleo_step2, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, vcs-type=git, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, version=17.1.13, container_name=nova_virtqemud_init_logs, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:00:27 np0005604215.localdomain python3[57940]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud_init_logs --conmon-pidfile /run/nova_virtqemud_init_logs.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --label config_id=tripleo_step2 --label container_name=nova_virtqemud_init_logs --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud_init_logs.log --network none --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --user root --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /bin/bash -c chown -R tss:tss /var/log/swtpm
Feb 01 08:00:27 np0005604215.localdomain systemd[1]: libpod-0605aa4760a862e958c0fb6713ff69acd745a7a6a9c31cccc8745e77505c3787.scope: Deactivated successfully.
Feb 01 08:00:27 np0005604215.localdomain podman[58024]: 2026-02-01 08:00:27.450446351 +0000 UTC m=+0.144515600 container create 023b656bbb0901ce30f777777a74fde646d7cce4b5244f472a7e788e575157dd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, version=17.1.13, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step2, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute_init_log)
Feb 01 08:00:27 np0005604215.localdomain systemd[1]: Started libpod-conmon-023b656bbb0901ce30f777777a74fde646d7cce4b5244f472a7e788e575157dd.scope.
Feb 01 08:00:27 np0005604215.localdomain podman[58024]: 2026-02-01 08:00:27.39864214 +0000 UTC m=+0.092711419 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 01 08:00:27 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:00:27 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2e81b03279955a60a1adecef9798de6e2f56144145c95c44327ebc53e7747a37/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:00:27 np0005604215.localdomain podman[58043]: 2026-02-01 08:00:27.518686828 +0000 UTC m=+0.065559554 container died 0605aa4760a862e958c0fb6713ff69acd745a7a6a9c31cccc8745e77505c3787 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, container_name=nova_virtqemud_init_logs, tcib_managed=true, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:00:27 np0005604215.localdomain podman[58049]: 2026-02-01 08:00:27.591810159 +0000 UTC m=+0.126890644 container cleanup 0605aa4760a862e958c0fb6713ff69acd745a7a6a9c31cccc8745e77505c3787 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud_init_logs, container_name=nova_virtqemud_init_logs, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step2, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, architecture=x86_64, config_data={'command': ['/bin/bash', '-c', 'chown -R tss:tss /var/log/swtpm'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'none', 'privileged': True, 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'user': 'root', 'volumes': ['/var/log/containers/libvirt/swtpm:/var/log/swtpm:shared,z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-libvirt-container, release=1766032510)
Feb 01 08:00:27 np0005604215.localdomain systemd[1]: libpod-conmon-0605aa4760a862e958c0fb6713ff69acd745a7a6a9c31cccc8745e77505c3787.scope: Deactivated successfully.
Feb 01 08:00:27 np0005604215.localdomain podman[58024]: 2026-02-01 08:00:27.619203511 +0000 UTC m=+0.313272770 container init 023b656bbb0901ce30f777777a74fde646d7cce4b5244f472a7e788e575157dd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, container_name=nova_compute_init_log, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, io.buildah.version=1.41.5, config_id=tripleo_step2, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:00:27 np0005604215.localdomain podman[58024]: 2026-02-01 08:00:27.627796942 +0000 UTC m=+0.321866191 container start 023b656bbb0901ce30f777777a74fde646d7cce4b5244f472a7e788e575157dd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute_init_log, config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, version=17.1.13)
Feb 01 08:00:27 np0005604215.localdomain python3[57940]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute_init_log --conmon-pidfile /run/nova_compute_init_log.pid --detach=True --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --label config_id=tripleo_step2 --label container_name=nova_compute_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute_init_log.log --network none --privileged=False --user root --volume /var/log/containers/nova:/var/log/nova:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /bin/bash -c chown -R nova:nova /var/log/nova
Feb 01 08:00:27 np0005604215.localdomain systemd[1]: libpod-023b656bbb0901ce30f777777a74fde646d7cce4b5244f472a7e788e575157dd.scope: Deactivated successfully.
Feb 01 08:00:27 np0005604215.localdomain podman[58086]: 2026-02-01 08:00:27.704992582 +0000 UTC m=+0.057153211 container died 023b656bbb0901ce30f777777a74fde646d7cce4b5244f472a7e788e575157dd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, config_id=tripleo_step2, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=nova_compute_init_log, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13)
Feb 01 08:00:27 np0005604215.localdomain podman[58092]: 2026-02-01 08:00:27.741333755 +0000 UTC m=+0.076894691 container cleanup 023b656bbb0901ce30f777777a74fde646d7cce4b5244f472a7e788e575157dd (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute_init_log, version=17.1.13, config_id=tripleo_step2, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_compute_init_log, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, config_data={'command': ['/bin/bash', '-c', 'chown -R nova:nova /var/log/nova'], 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'user': 'root', 'volumes': ['/var/log/containers/nova:/var/log/nova:z']}, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public)
Feb 01 08:00:27 np0005604215.localdomain systemd[1]: libpod-conmon-023b656bbb0901ce30f777777a74fde646d7cce4b5244f472a7e788e575157dd.scope: Deactivated successfully.
Feb 01 08:00:28 np0005604215.localdomain podman[58186]: 2026-02-01 08:00:28.123062269 +0000 UTC m=+0.090007774 container create 09968e3f6820d539a0f919b784a87aaed2031785c033ca7e095dfdf925256315 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=create_haproxy_wrapper, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, release=1766032510, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step2, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:00:28 np0005604215.localdomain systemd[1]: Started libpod-conmon-09968e3f6820d539a0f919b784a87aaed2031785c033ca7e095dfdf925256315.scope.
Feb 01 08:00:28 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:00:28 np0005604215.localdomain podman[58186]: 2026-02-01 08:00:28.078220417 +0000 UTC m=+0.045165972 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 01 08:00:28 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b2933f5278c3e34f217deba3df65be56d0deb9a26e06617aee0cea81e2014367/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 08:00:28 np0005604215.localdomain podman[58209]: 2026-02-01 08:00:28.19205868 +0000 UTC m=+0.085044418 container create 3738f4d9f86def7281cb8a2a5f2901e3ed40bf88ad618b18e33db25a08a07b58 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, release=1766032510, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, container_name=create_virtlogd_wrapper, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=)
Feb 01 08:00:28 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.1f scrub starts
Feb 01 08:00:28 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.1f scrub ok
Feb 01 08:00:28 np0005604215.localdomain systemd[1]: Started libpod-conmon-3738f4d9f86def7281cb8a2a5f2901e3ed40bf88ad618b18e33db25a08a07b58.scope.
Feb 01 08:00:28 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:00:28 np0005604215.localdomain podman[58186]: 2026-02-01 08:00:28.244621544 +0000 UTC m=+0.211567049 container init 09968e3f6820d539a0f919b784a87aaed2031785c033ca7e095dfdf925256315 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, io.k8s.description=Red 
Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step2, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, container_name=create_haproxy_wrapper, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z)
Feb 01 08:00:28 np0005604215.localdomain podman[58209]: 2026-02-01 08:00:28.148954833 +0000 UTC m=+0.041940601 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:00:28 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/50254bf8e87a075d183197f5531e6c0f97888346b53b5d118b5ece2506404cbc/merged/var/lib/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Feb 01 08:00:28 np0005604215.localdomain podman[58186]: 2026-02-01 08:00:28.254377161 +0000 UTC m=+0.221322666 container start 09968e3f6820d539a0f919b784a87aaed2031785c033ca7e095dfdf925256315 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, release=1766032510, distribution-scope=public, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=create_haproxy_wrapper, tcib_managed=true, config_id=tripleo_step2, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com)
Feb 01 08:00:28 np0005604215.localdomain podman[58186]: 2026-02-01 08:00:28.254893787 +0000 UTC m=+0.221839332 container attach 09968e3f6820d539a0f919b784a87aaed2031785c033ca7e095dfdf925256315 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step2, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, version=17.1.13, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=create_haproxy_wrapper, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:00:28 np0005604215.localdomain podman[58209]: 2026-02-01 08:00:28.306015006 +0000 UTC m=+0.199000764 container init 3738f4d9f86def7281cb8a2a5f2901e3ed40bf88ad618b18e33db25a08a07b58 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, config_id=tripleo_step2, build-date=2026-01-12T23:31:49Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=create_virtlogd_wrapper, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Feb 01 08:00:28 np0005604215.localdomain podman[58209]: 2026-02-01 08:00:28.314460062 +0000 UTC m=+0.207445830 container start 3738f4d9f86def7281cb8a2a5f2901e3ed40bf88ad618b18e33db25a08a07b58 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, container_name=create_virtlogd_wrapper, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step2, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com)
Feb 01 08:00:28 np0005604215.localdomain podman[58209]: 2026-02-01 08:00:28.314789523 +0000 UTC m=+0.207775281 container attach 3738f4d9f86def7281cb8a2a5f2901e3ed40bf88ad618b18e33db25a08a07b58 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-libvirt, version=17.1.13, container_name=create_virtlogd_wrapper, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step2, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true)
Feb 01 08:00:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0bc97c01fd5eabbf2d8e0d9991f11a9043512db93b5f6f0454866fe7414277f1-merged.mount: Deactivated successfully.
Feb 01 08:00:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0605aa4760a862e958c0fb6713ff69acd745a7a6a9c31cccc8745e77505c3787-userdata-shm.mount: Deactivated successfully.
Feb 01 08:00:29 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.d scrub starts
Feb 01 08:00:29 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.d scrub ok
Feb 01 08:00:29 np0005604215.localdomain ovs-vsctl[58323]: ovs|00001|db_ctl_base|ERR|unix:/var/run/openvswitch/db.sock: database connection failed (No such file or directory)
Feb 01 08:00:30 np0005604215.localdomain systemd[1]: libpod-3738f4d9f86def7281cb8a2a5f2901e3ed40bf88ad618b18e33db25a08a07b58.scope: Deactivated successfully.
Feb 01 08:00:30 np0005604215.localdomain systemd[1]: libpod-3738f4d9f86def7281cb8a2a5f2901e3ed40bf88ad618b18e33db25a08a07b58.scope: Consumed 2.049s CPU time.
Feb 01 08:00:30 np0005604215.localdomain podman[58209]: 2026-02-01 08:00:30.366927595 +0000 UTC m=+2.259913363 container died 3738f4d9f86def7281cb8a2a5f2901e3ed40bf88ad618b18e33db25a08a07b58 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, io.openshift.expose-services=, config_id=tripleo_step2, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=create_virtlogd_wrapper, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:00:30 np0005604215.localdomain systemd[1]: tmp-crun.sfDeCW.mount: Deactivated successfully.
Feb 01 08:00:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3738f4d9f86def7281cb8a2a5f2901e3ed40bf88ad618b18e33db25a08a07b58-userdata-shm.mount: Deactivated successfully.
Feb 01 08:00:30 np0005604215.localdomain podman[58449]: 2026-02-01 08:00:30.477572457 +0000 UTC m=+0.097179179 container cleanup 3738f4d9f86def7281cb8a2a5f2901e3ed40bf88ad618b18e33db25a08a07b58 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=create_virtlogd_wrapper, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']}, container_name=create_virtlogd_wrapper, build-date=2026-01-12T23:31:49Z, architecture=x86_64, config_id=tripleo_step2, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public)
Feb 01 08:00:30 np0005604215.localdomain systemd[1]: libpod-conmon-3738f4d9f86def7281cb8a2a5f2901e3ed40bf88ad618b18e33db25a08a07b58.scope: Deactivated successfully.
Feb 01 08:00:30 np0005604215.localdomain python3[57940]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/create_virtlogd_wrapper.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --label config_id=tripleo_step2 --label container_name=create_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::nova::virtlogd_wrapper'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_virtlogd_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /var/lib/container-config-scripts:/var/lib/container-config-scripts:shared,z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::nova::virtlogd_wrapper
Feb 01 08:00:31 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.1 scrub starts
Feb 01 08:00:31 np0005604215.localdomain systemd[1]: libpod-09968e3f6820d539a0f919b784a87aaed2031785c033ca7e095dfdf925256315.scope: Deactivated successfully.
Feb 01 08:00:31 np0005604215.localdomain systemd[1]: libpod-09968e3f6820d539a0f919b784a87aaed2031785c033ca7e095dfdf925256315.scope: Consumed 2.115s CPU time.
Feb 01 08:00:31 np0005604215.localdomain podman[58186]: 2026-02-01 08:00:31.309054544 +0000 UTC m=+3.276000029 container died 09968e3f6820d539a0f919b784a87aaed2031785c033ca7e095dfdf925256315 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=create_haproxy_wrapper, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step2, vendor=Red Hat, Inc., config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 01 08:00:31 np0005604215.localdomain podman[58490]: 2026-02-01 08:00:31.380274116 +0000 UTC m=+0.060801215 container cleanup 09968e3f6820d539a0f919b784a87aaed2031785c033ca7e095dfdf925256315 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=create_haproxy_wrapper, config_id=tripleo_step2, batch=17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, vcs-type=git, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=create_haproxy_wrapper, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']}, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible)
Feb 01 08:00:31 np0005604215.localdomain systemd[1]: libpod-conmon-09968e3f6820d539a0f919b784a87aaed2031785c033ca7e095dfdf925256315.scope: Deactivated successfully.
Feb 01 08:00:31 np0005604215.localdomain python3[57940]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name create_haproxy_wrapper --conmon-pidfile /run/create_haproxy_wrapper.pid --detach=False --label config_id=tripleo_step2 --label container_name=create_haproxy_wrapper --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'file', 'include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers'], 'detach': False, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'start_order': 1, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/create_haproxy_wrapper.log --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume 
/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 file include ::tripleo::profile::base::neutron::ovn_metadata_agent_wrappers
Feb 01 08:00:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-50254bf8e87a075d183197f5531e6c0f97888346b53b5d118b5ece2506404cbc-merged.mount: Deactivated successfully.
Feb 01 08:00:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b2933f5278c3e34f217deba3df65be56d0deb9a26e06617aee0cea81e2014367-merged.mount: Deactivated successfully.
Feb 01 08:00:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09968e3f6820d539a0f919b784a87aaed2031785c033ca7e095dfdf925256315-userdata-shm.mount: Deactivated successfully.
Feb 01 08:00:31 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.1 scrub ok
Feb 01 08:00:31 np0005604215.localdomain sudo[57938]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:31 np0005604215.localdomain sudo[58541]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wsctezbbddxlvffnqazdktrbjupyfnuk ; /usr/bin/python3
Feb 01 08:00:31 np0005604215.localdomain sudo[58541]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:31 np0005604215.localdomain python3[58543]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks2.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:00:31 np0005604215.localdomain sudo[58541]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:32 np0005604215.localdomain sudo[58589]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgvvutzfavrpypohrrkfqtiumigdnysi ; /usr/bin/python3
Feb 01 08:00:32 np0005604215.localdomain sudo[58589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:32 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.c deep-scrub starts
Feb 01 08:00:32 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.c deep-scrub ok
Feb 01 08:00:32 np0005604215.localdomain sudo[58589]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:32 np0005604215.localdomain sudo[58632]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbslszrdryguvlbnmjijixdyfycjkbno ; /usr/bin/python3
Feb 01 08:00:32 np0005604215.localdomain sudo[58632]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:33 np0005604215.localdomain sudo[58632]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:33 np0005604215.localdomain sudo[58662]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svhbxwjssjzbbrlqeyqaiohqklmhsikx ; /usr/bin/python3
Feb 01 08:00:33 np0005604215.localdomain sudo[58662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:33 np0005604215.localdomain python3[58664]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks2.json short_hostname=np0005604215 step=2 update_config_hash_only=False
Feb 01 08:00:33 np0005604215.localdomain sudo[58662]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:33 np0005604215.localdomain sudo[58678]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-titfmzxfjzirkugwjhcvwapzufmvzwel ; /usr/bin/python3
Feb 01 08:00:33 np0005604215.localdomain sudo[58678]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:34 np0005604215.localdomain python3[58680]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:00:34 np0005604215.localdomain sudo[58678]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:34 np0005604215.localdomain sudo[58694]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oovqdccoetbvyomvalcshcbgajwwhcwo ; /usr/bin/python3
Feb 01 08:00:34 np0005604215.localdomain sudo[58694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:00:34 np0005604215.localdomain python3[58696]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_2 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 01 08:00:34 np0005604215.localdomain sudo[58694]: pam_unix(sudo:session): session closed for user root
Feb 01 08:00:34 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.5 scrub starts
Feb 01 08:00:34 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.5 scrub ok
Feb 01 08:00:35 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.10 scrub starts
Feb 01 08:00:35 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.10 scrub ok
Feb 01 08:00:37 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.a scrub starts
Feb 01 08:00:37 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 3.a scrub ok
Feb 01 08:00:38 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.18 scrub starts
Feb 01 08:00:38 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 5.18 scrub ok
Feb 01 08:00:38 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.15 scrub starts
Feb 01 08:00:38 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 5.15 scrub ok
Feb 01 08:00:41 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 6.15 scrub starts
Feb 01 08:00:41 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 6.15 scrub ok
Feb 01 08:00:43 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.13 scrub starts
Feb 01 08:00:43 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.13 scrub ok
Feb 01 08:00:44 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 4.c scrub starts
Feb 01 08:00:44 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 4.c scrub ok
Feb 01 08:00:45 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.1a scrub starts
Feb 01 08:00:45 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.1a scrub ok
Feb 01 08:00:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:00:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 4860 writes, 21K keys, 4860 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4860 writes, 515 syncs, 9.44 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1464 writes, 5264 keys, 1464 commit groups, 1.0 writes per commit group, ingest: 2.32 MB, 0.00 MB/s
                                                          Interval WAL: 1464 writes, 314 syncs, 4.66 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 1.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 5.3e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 01 08:00:46 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.19 scrub starts
Feb 01 08:00:46 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.19 scrub ok
Feb 01 08:00:47 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.3 scrub starts
Feb 01 08:00:47 np0005604215.localdomain ceph-osd[32318]: log_channel(cluster) log [DBG] : 6.3 scrub ok
Feb 01 08:00:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:00:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Cumulative writes: 4708 writes, 21K keys, 4708 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4708 writes, 468 syncs, 10.06 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 1461 writes, 5079 keys, 1461 commit groups, 1.0 writes per commit group, ingest: 2.28 MB, 0.00 MB/s
                                                          Interval WAL: 1461 writes, 329 syncs, 4.44 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 1200.1 total, 600.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 8 last_secs: 4.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 01 08:00:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:00:50 np0005604215.localdomain systemd[1]: tmp-crun.1rA6p2.mount: Deactivated successfully.
Feb 01 08:00:50 np0005604215.localdomain podman[58697]: 2026-02-01 08:00:50.888753804 +0000 UTC m=+0.096683414 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, config_id=tripleo_step1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com)
Feb 01 08:00:51 np0005604215.localdomain podman[58697]: 2026-02-01 08:00:51.085931629 +0000 UTC m=+0.293861219 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, tcib_managed=true, 
io.buildah.version=1.41.5, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 01 08:00:51 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:00:52 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 6.8 scrub starts
Feb 01 08:00:52 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 6.8 scrub ok
Feb 01 08:00:55 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 6.d scrub starts
Feb 01 08:00:55 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 6.d scrub ok
Feb 01 08:00:57 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.d scrub starts
Feb 01 08:00:57 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.d scrub ok
Feb 01 08:01:01 np0005604215.localdomain anacron[18977]: Job `cron.weekly' started
Feb 01 08:01:01 np0005604215.localdomain anacron[18977]: Job `cron.weekly' terminated
Feb 01 08:01:01 np0005604215.localdomain CROND[58729]: (root) CMD (run-parts /etc/cron.hourly)
Feb 01 08:01:01 np0005604215.localdomain run-parts[58732]: (/etc/cron.hourly) starting 0anacron
Feb 01 08:01:01 np0005604215.localdomain run-parts[58738]: (/etc/cron.hourly) finished 0anacron
Feb 01 08:01:01 np0005604215.localdomain CROND[58728]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 01 08:01:10 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.a scrub starts
Feb 01 08:01:10 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.a scrub ok
Feb 01 08:01:10 np0005604215.localdomain sudo[58739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:01:10 np0005604215.localdomain sudo[58739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:01:10 np0005604215.localdomain sudo[58739]: pam_unix(sudo:session): session closed for user root
Feb 01 08:01:10 np0005604215.localdomain sudo[58754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 08:01:10 np0005604215.localdomain sudo[58754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:01:11 np0005604215.localdomain sudo[58754]: pam_unix(sudo:session): session closed for user root
Feb 01 08:01:11 np0005604215.localdomain sudo[58789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:01:11 np0005604215.localdomain sudo[58789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:01:11 np0005604215.localdomain sudo[58789]: pam_unix(sudo:session): session closed for user root
Feb 01 08:01:11 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.e scrub starts
Feb 01 08:01:11 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.e scrub ok
Feb 01 08:01:11 np0005604215.localdomain sudo[58804]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:01:11 np0005604215.localdomain sudo[58804]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:01:12 np0005604215.localdomain sudo[58804]: pam_unix(sudo:session): session closed for user root
Feb 01 08:01:12 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.1a scrub starts
Feb 01 08:01:12 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.1a scrub ok
Feb 01 08:01:12 np0005604215.localdomain sudo[58850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:01:12 np0005604215.localdomain sudo[58850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:01:12 np0005604215.localdomain sudo[58850]: pam_unix(sudo:session): session closed for user root
Feb 01 08:01:18 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.18 scrub starts
Feb 01 08:01:18 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.18 scrub ok
Feb 01 08:01:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:01:21 np0005604215.localdomain systemd[1]: tmp-crun.EYk5DB.mount: Deactivated successfully.
Feb 01 08:01:21 np0005604215.localdomain podman[58865]: 2026-02-01 08:01:21.879579406 +0000 UTC m=+0.093761417 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:01:22 np0005604215.localdomain podman[58865]: 2026-02-01 08:01:22.139684137 +0000 UTC m=+0.353866128 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd)
Feb 01 08:01:22 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:01:26 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.1b scrub starts
Feb 01 08:01:26 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 4.1b scrub ok
Feb 01 08:01:27 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 7.1 scrub starts
Feb 01 08:01:27 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 7.1 scrub ok
Feb 01 08:01:28 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 7.7 deep-scrub starts
Feb 01 08:01:28 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 7.7 deep-scrub ok
Feb 01 08:01:29 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 7.c scrub starts
Feb 01 08:01:29 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 7.c scrub ok
Feb 01 08:01:31 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 7.d scrub starts
Feb 01 08:01:31 np0005604215.localdomain ceph-osd[31357]: log_channel(cluster) log [DBG] : 7.d scrub ok
Feb 01 08:01:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:01:52 np0005604215.localdomain podman[58894]: 2026-02-01 08:01:52.870597049 +0000 UTC m=+0.081965511 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step1)
Feb 01 08:01:53 np0005604215.localdomain podman[58894]: 2026-02-01 08:01:53.095573902 +0000 UTC m=+0.306942364 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, container_name=metrics_qdr)
Feb 01 08:01:53 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:02:12 np0005604215.localdomain sudo[58923]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:02:12 np0005604215.localdomain sudo[58923]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:02:12 np0005604215.localdomain sudo[58923]: pam_unix(sudo:session): session closed for user root
Feb 01 08:02:12 np0005604215.localdomain sudo[58938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:02:12 np0005604215.localdomain sudo[58938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:02:13 np0005604215.localdomain sudo[58938]: pam_unix(sudo:session): session closed for user root
Feb 01 08:02:14 np0005604215.localdomain sudo[58985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:02:14 np0005604215.localdomain sudo[58985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:02:14 np0005604215.localdomain sudo[58985]: pam_unix(sudo:session): session closed for user root
Feb 01 08:02:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:02:23 np0005604215.localdomain podman[59000]: 2026-02-01 08:02:23.870014934 +0000 UTC m=+0.084156629 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., container_name=metrics_qdr, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 01 08:02:24 np0005604215.localdomain podman[59000]: 2026-02-01 08:02:24.082716677 +0000 UTC m=+0.296858412 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, version=17.1.13, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:02:24 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:02:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:02:54 np0005604215.localdomain podman[59029]: 2026-02-01 08:02:54.860752267 +0000 UTC m=+0.076853462 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, container_name=metrics_qdr, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:02:55 np0005604215.localdomain podman[59029]: 2026-02-01 08:02:55.049872289 +0000 UTC m=+0.265973464 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, version=17.1.13, 
managed_by=tripleo_ansible, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:02:55 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:03:14 np0005604215.localdomain sudo[59059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:03:14 np0005604215.localdomain sudo[59059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:03:14 np0005604215.localdomain sudo[59059]: pam_unix(sudo:session): session closed for user root
Feb 01 08:03:14 np0005604215.localdomain sudo[59074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 08:03:14 np0005604215.localdomain sudo[59074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:03:15 np0005604215.localdomain podman[59161]: 2026-02-01 08:03:15.162516554 +0000 UTC m=+0.098901557 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, architecture=x86_64, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:03:15 np0005604215.localdomain podman[59161]: 2026-02-01 08:03:15.26527274 +0000 UTC m=+0.201657713 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, description=Red Hat Ceph Storage 7, version=7, architecture=x86_64, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 08:03:15 np0005604215.localdomain sudo[59074]: pam_unix(sudo:session): session closed for user root
Feb 01 08:03:15 np0005604215.localdomain sudo[59226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:03:15 np0005604215.localdomain sudo[59226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:03:15 np0005604215.localdomain sudo[59226]: pam_unix(sudo:session): session closed for user root
Feb 01 08:03:15 np0005604215.localdomain sudo[59241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:03:15 np0005604215.localdomain sudo[59241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:03:16 np0005604215.localdomain sudo[59241]: pam_unix(sudo:session): session closed for user root
Feb 01 08:03:16 np0005604215.localdomain sudo[59289]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:03:16 np0005604215.localdomain sudo[59289]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:03:16 np0005604215.localdomain sudo[59289]: pam_unix(sudo:session): session closed for user root
Feb 01 08:03:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:03:25 np0005604215.localdomain podman[59304]: 2026-02-01 08:03:25.879467088 +0000 UTC m=+0.088438551 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 01 08:03:26 np0005604215.localdomain podman[59304]: 2026-02-01 08:03:26.080670245 +0000 UTC m=+0.289641768 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Feb 01 08:03:26 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:03:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:03:56 np0005604215.localdomain systemd[1]: tmp-crun.gGfqhc.mount: Deactivated successfully.
Feb 01 08:03:56 np0005604215.localdomain podman[59333]: 2026-02-01 08:03:56.878365047 +0000 UTC m=+0.095668925 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:03:57 np0005604215.localdomain podman[59333]: 2026-02-01 08:03:57.068719296 +0000 UTC m=+0.286023164 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:03:57 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:04:17 np0005604215.localdomain sudo[59363]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:04:17 np0005604215.localdomain sudo[59363]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:04:17 np0005604215.localdomain sudo[59363]: pam_unix(sudo:session): session closed for user root
Feb 01 08:04:17 np0005604215.localdomain sudo[59378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:04:17 np0005604215.localdomain sudo[59378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:04:17 np0005604215.localdomain sudo[59378]: pam_unix(sudo:session): session closed for user root
Feb 01 08:04:18 np0005604215.localdomain sudo[59424]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:04:18 np0005604215.localdomain sudo[59424]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:04:18 np0005604215.localdomain sudo[59424]: pam_unix(sudo:session): session closed for user root
Feb 01 08:04:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:04:27 np0005604215.localdomain systemd[1]: tmp-crun.PVjT6d.mount: Deactivated successfully.
Feb 01 08:04:27 np0005604215.localdomain podman[59439]: 2026-02-01 08:04:27.884859142 +0000 UTC m=+0.099711132 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true)
Feb 01 08:04:28 np0005604215.localdomain podman[59439]: 2026-02-01 08:04:28.080888848 +0000 UTC m=+0.295740868 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, release=1766032510, container_name=metrics_qdr, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:04:28 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:04:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:04:58 np0005604215.localdomain podman[59468]: 2026-02-01 08:04:58.857882328 +0000 UTC m=+0.073990770 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1766032510, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, architecture=x86_64, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:04:59 np0005604215.localdomain podman[59468]: 2026-02-01 08:04:59.071772981 +0000 UTC m=+0.287881383 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:04:59 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:05:10 np0005604215.localdomain sudo[59544]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pybyhfwiuwtmqybsdfhcabvcrnmbnodq ; /usr/bin/python3
Feb 01 08:05:10 np0005604215.localdomain sudo[59544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:10 np0005604215.localdomain python3[59546]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:05:10 np0005604215.localdomain sudo[59544]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:10 np0005604215.localdomain sudo[59589]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqrgprwyxmaaebbjbjypwhqrjltnybff ; /usr/bin/python3
Feb 01 08:05:10 np0005604215.localdomain sudo[59589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:11 np0005604215.localdomain python3[59591]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933110.2941833-98265-189130431710396/source _original_basename=tmpezp7ee7s follow=False checksum=62439dd24dde40c90e7a39f6a1b31cc6061fe59b backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:11 np0005604215.localdomain sudo[59589]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:11 np0005604215.localdomain sudo[59619]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiyhwiqonjqmpbhhsvnzsnspahfpihgm ; /usr/bin/python3
Feb 01 08:05:11 np0005604215.localdomain sudo[59619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:12 np0005604215.localdomain python3[59621]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_3 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:12 np0005604215.localdomain sudo[59619]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:12 np0005604215.localdomain sudo[59669]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykrqpbdobtznlhxhqujhflefuhggeoqh ; /usr/bin/python3
Feb 01 08:05:12 np0005604215.localdomain sudo[59669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:13 np0005604215.localdomain sudo[59669]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:13 np0005604215.localdomain sudo[59687]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mysfijagylwaybzvifjuigkqmhwgixiy ; /usr/bin/python3
Feb 01 08:05:13 np0005604215.localdomain sudo[59687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:13 np0005604215.localdomain sudo[59687]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:13 np0005604215.localdomain sudo[59791]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgjvgtllbyoonbrfxxyausrztdzbzpuq ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933113.4942925-98453-246736775600007/async_wrapper.py 156375756012 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933113.4942925-98453-246736775600007/AnsiballZ_command.py _
Feb 01 08:05:13 np0005604215.localdomain sudo[59791]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 01 08:05:14 np0005604215.localdomain ansible-async_wrapper.py[59793]: Invoked with 156375756012 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933113.4942925-98453-246736775600007/AnsiballZ_command.py _
Feb 01 08:05:14 np0005604215.localdomain ansible-async_wrapper.py[59796]: Starting module and watcher
Feb 01 08:05:14 np0005604215.localdomain ansible-async_wrapper.py[59796]: Start watching 59797 (3600)
Feb 01 08:05:14 np0005604215.localdomain ansible-async_wrapper.py[59797]: Start module (59797)
Feb 01 08:05:14 np0005604215.localdomain ansible-async_wrapper.py[59793]: Return async_wrapper task started.
Feb 01 08:05:14 np0005604215.localdomain sudo[59791]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:14 np0005604215.localdomain sudo[59812]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epfgphqbnwfwylbsuzwwsstcidgrwssc ; /usr/bin/python3
Feb 01 08:05:14 np0005604215.localdomain sudo[59812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:14 np0005604215.localdomain python3[59817]: ansible-ansible.legacy.async_status Invoked with jid=156375756012.59793 mode=status _async_dir=/tmp/.ansible_async
Feb 01 08:05:14 np0005604215.localdomain sudo[59812]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:    (file: /etc/puppet/hiera.yaml)
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Warning: Undefined variable '::deploy_config_name';
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:    (file & line not available)
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:    (file & line not available)
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.12 seconds
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Notice: Applied catalog in 0.04 seconds
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Application:
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:    Initial environment: production
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:    Converged environment: production
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:          Run mode: user
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Changes:
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Events:
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Resources:
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:             Total: 10
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Time:
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:          Schedule: 0.00
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:              File: 0.00
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:              Exec: 0.01
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:            Augeas: 0.01
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:    Transaction evaluation: 0.03
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:    Catalog application: 0.04
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:    Config retrieval: 0.15
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:          Last run: 1769933117
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:        Filebucket: 0.00
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:             Total: 0.04
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]: Version:
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:            Config: 1769933117
Feb 01 08:05:17 np0005604215.localdomain puppet-user[59816]:            Puppet: 7.10.0
Feb 01 08:05:17 np0005604215.localdomain ansible-async_wrapper.py[59797]: Module complete (59797)
Feb 01 08:05:18 np0005604215.localdomain sudo[59928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:05:18 np0005604215.localdomain sudo[59928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:05:18 np0005604215.localdomain sudo[59928]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:18 np0005604215.localdomain sudo[59943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:05:18 np0005604215.localdomain sudo[59943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:05:19 np0005604215.localdomain ansible-async_wrapper.py[59796]: Done in kid B.
Feb 01 08:05:19 np0005604215.localdomain sudo[59943]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:19 np0005604215.localdomain sudo[59991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:05:19 np0005604215.localdomain sudo[59991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:05:19 np0005604215.localdomain sudo[59991]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:24 np0005604215.localdomain sudo[60019]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-calvlqeqvbwgscvtubjummbtyefajjar ; /usr/bin/python3
Feb 01 08:05:24 np0005604215.localdomain sudo[60019]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:24 np0005604215.localdomain python3[60021]: ansible-ansible.legacy.async_status Invoked with jid=156375756012.59793 mode=status _async_dir=/tmp/.ansible_async
Feb 01 08:05:24 np0005604215.localdomain sudo[60019]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:25 np0005604215.localdomain sudo[60035]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mapwubveyuvpaczmjjavpgnhataogpfd ; /usr/bin/python3
Feb 01 08:05:25 np0005604215.localdomain sudo[60035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:25 np0005604215.localdomain python3[60037]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 08:05:25 np0005604215.localdomain sudo[60035]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:25 np0005604215.localdomain sudo[60051]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvfttwihynqscqracelievqiqylsctwz ; /usr/bin/python3
Feb 01 08:05:25 np0005604215.localdomain sudo[60051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:25 np0005604215.localdomain python3[60053]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:25 np0005604215.localdomain sudo[60051]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:26 np0005604215.localdomain sudo[60101]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xoyhamkoxrqjcpyvqaztmykelztzdwcj ; /usr/bin/python3
Feb 01 08:05:26 np0005604215.localdomain sudo[60101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:26 np0005604215.localdomain python3[60103]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:05:26 np0005604215.localdomain sudo[60101]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:26 np0005604215.localdomain sudo[60119]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tixmsvazjwvlwieayasxswiqtwfnddqp ; /usr/bin/python3
Feb 01 08:05:26 np0005604215.localdomain sudo[60119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:26 np0005604215.localdomain python3[60121]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpn8bpg59h recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 08:05:26 np0005604215.localdomain sudo[60119]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:26 np0005604215.localdomain sudo[60149]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqvglkabpjlgntokoiwaahhiiloclfij ; /usr/bin/python3
Feb 01 08:05:26 np0005604215.localdomain sudo[60149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:27 np0005604215.localdomain python3[60151]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:27 np0005604215.localdomain sudo[60149]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:27 np0005604215.localdomain sudo[60165]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqyitorkwbwooilehvpttjwmddfpqgjb ; /usr/bin/python3
Feb 01 08:05:27 np0005604215.localdomain sudo[60165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:27 np0005604215.localdomain sudo[60165]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:28 np0005604215.localdomain sudo[60252]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlgtkoqkeacdrjwxuexpwpbcrnuvzffm ; /usr/bin/python3
Feb 01 08:05:28 np0005604215.localdomain sudo[60252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:28 np0005604215.localdomain python3[60254]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 01 08:05:28 np0005604215.localdomain sudo[60252]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:29 np0005604215.localdomain sudo[60271]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzvjenhptyluhmwhmaxcpqalxcyglgwj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:05:29 np0005604215.localdomain sudo[60271]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:29 np0005604215.localdomain systemd[1]: tmp-crun.Ucx95S.mount: Deactivated successfully.
Feb 01 08:05:29 np0005604215.localdomain podman[60273]: 2026-02-01 08:05:29.434417223 +0000 UTC m=+0.093873451 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, config_id=tripleo_step1, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:05:29 np0005604215.localdomain python3[60274]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:29 np0005604215.localdomain sudo[60271]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:29 np0005604215.localdomain podman[60273]: 2026-02-01 08:05:29.624674452 +0000 UTC m=+0.284130680 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 01 08:05:29 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:05:29 np0005604215.localdomain sudo[60316]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfgyzhkdczkjjtmvjskftlbzzdckffla ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:29 np0005604215.localdomain sudo[60316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:29 np0005604215.localdomain sudo[60316]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:30 np0005604215.localdomain sudo[60332]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yyjyctydhefruqwvnyematsnzaxqtizn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:30 np0005604215.localdomain sudo[60332]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:30 np0005604215.localdomain python3[60334]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:30 np0005604215.localdomain sudo[60332]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:30 np0005604215.localdomain sudo[60382]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwshzphpzdxycrfqypazmdrypftrlaja ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:30 np0005604215.localdomain sudo[60382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:31 np0005604215.localdomain python3[60384]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:05:31 np0005604215.localdomain sudo[60382]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:31 np0005604215.localdomain sudo[60400]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzxayqeyogxbfenhhfhfxndcwqjzejkg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:31 np0005604215.localdomain sudo[60400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:31 np0005604215.localdomain python3[60402]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:31 np0005604215.localdomain sudo[60400]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:31 np0005604215.localdomain sudo[60462]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvwlyossocivmbmxuzlagpwfsstqmucw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:31 np0005604215.localdomain sudo[60462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:31 np0005604215.localdomain python3[60464]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:05:31 np0005604215.localdomain sudo[60462]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:31 np0005604215.localdomain sudo[60480]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pegaqfxgeeekdjgptruvvimeuaggsvau ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:32 np0005604215.localdomain sudo[60480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:32 np0005604215.localdomain python3[60482]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:32 np0005604215.localdomain sudo[60480]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:32 np0005604215.localdomain sudo[60542]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-svslljdnnqbovmcqojechpxpwtkzdciu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:32 np0005604215.localdomain sudo[60542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:32 np0005604215.localdomain python3[60544]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:05:32 np0005604215.localdomain sudo[60542]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:32 np0005604215.localdomain sudo[60560]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqemgeupeyiwssxpthmesujgchbarrqu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:32 np0005604215.localdomain sudo[60560]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:32 np0005604215.localdomain python3[60562]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:32 np0005604215.localdomain sudo[60560]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:33 np0005604215.localdomain sudo[60622]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ruvzsklevbjmbrcupqouuaqaqwmcwwdg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:33 np0005604215.localdomain sudo[60622]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:33 np0005604215.localdomain python3[60624]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:05:33 np0005604215.localdomain sudo[60622]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:33 np0005604215.localdomain sudo[60640]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqzedvzqkcrpueshmegdfkyqdbtgrtjs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:33 np0005604215.localdomain sudo[60640]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:33 np0005604215.localdomain python3[60642]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:33 np0005604215.localdomain sudo[60640]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:34 np0005604215.localdomain sudo[60670]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dufmcwnqmyvcluxylrlpyclpoxdnyyyi ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:34 np0005604215.localdomain sudo[60670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:34 np0005604215.localdomain python3[60672]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:05:34 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:05:34 np0005604215.localdomain systemd-sysv-generator[60698]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:05:34 np0005604215.localdomain systemd-rc-local-generator[60693]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:05:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:05:34 np0005604215.localdomain sudo[60670]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:34 np0005604215.localdomain sudo[60756]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvhwnccwcjcijiuunymdsalneeptttgf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:34 np0005604215.localdomain sudo[60756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:35 np0005604215.localdomain python3[60758]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:05:35 np0005604215.localdomain sudo[60756]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:35 np0005604215.localdomain sudo[60774]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdsbxquznzlezwwbecwnqafwbpwmrlwh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:35 np0005604215.localdomain sudo[60774]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:35 np0005604215.localdomain python3[60776]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:35 np0005604215.localdomain sudo[60774]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:35 np0005604215.localdomain sudo[60836]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnfltdgwnpoevpxfhydwibtanaqkezbd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:35 np0005604215.localdomain sudo[60836]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:35 np0005604215.localdomain python3[60838]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:05:35 np0005604215.localdomain sudo[60836]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:36 np0005604215.localdomain sudo[60854]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqfoqtekjpapnltswzqweqponbwekzks ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:36 np0005604215.localdomain sudo[60854]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:36 np0005604215.localdomain python3[60856]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:36 np0005604215.localdomain sudo[60854]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:36 np0005604215.localdomain sudo[60884]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmaolocarhbnoujamutglhlgddkplfck ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:36 np0005604215.localdomain sudo[60884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:36 np0005604215.localdomain python3[60886]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:05:36 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:05:36 np0005604215.localdomain systemd-rc-local-generator[60909]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:05:36 np0005604215.localdomain systemd-sysv-generator[60915]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:05:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:05:37 np0005604215.localdomain systemd[1]: Starting dnf makecache...
Feb 01 08:05:37 np0005604215.localdomain systemd[1]: Starting Create netns directory...
Feb 01 08:05:37 np0005604215.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 01 08:05:37 np0005604215.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 01 08:05:37 np0005604215.localdomain systemd[1]: Finished Create netns directory.
Feb 01 08:05:37 np0005604215.localdomain sudo[60884]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:37 np0005604215.localdomain dnf[60923]: Updating Subscription Management repositories.
Feb 01 08:05:37 np0005604215.localdomain sudo[60941]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozviejriospyhlmrydsfjyssulynsuaf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:37 np0005604215.localdomain sudo[60941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:37 np0005604215.localdomain python3[60943]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 01 08:05:37 np0005604215.localdomain sudo[60941]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:37 np0005604215.localdomain sudo[60957]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-badiajcrhwzbpkkcqicdmsbtmtxzkshx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:37 np0005604215.localdomain sudo[60957]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:38 np0005604215.localdomain sudo[60957]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:38 np0005604215.localdomain dnf[60923]: Metadata cache refreshed recently.
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: Finished dnf makecache.
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: dnf-makecache.service: Consumed 2.064s CPU time.
Feb 01 08:05:39 np0005604215.localdomain sudo[61000]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhwbwselktfbivvxvwigbppbohlmhfwq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:39 np0005604215.localdomain sudo[61000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:39 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step3 config_dir=/var/lib/tripleo-config/container-startup-config/step_3 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 01 08:05:39 np0005604215.localdomain podman[61151]: 2026-02-01 08:05:39.842575645 +0000 UTC m=+0.057912195 container create e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1766032510, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:05:39 np0005604215.localdomain podman[61162]: 2026-02-01 08:05:39.873001588 +0000 UTC m=+0.077223500 container create d52030e77d7c02b05ca2a3f0e1bb43639ffbb98854fc09c2ed39dede099dd2d8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, architecture=x86_64, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: Started libpod-conmon-e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.scope.
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:39 np0005604215.localdomain podman[61186]: 2026-02-01 08:05:39.895137541 +0000 UTC m=+0.080675308 container create 1ab3a2c69b685c0c6816908493618cb01de7af949dad63b760fc7780d0aa1e7e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_statedir_owner, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public, version=17.1.13)
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: Started libpod-conmon-d52030e77d7c02b05ca2a3f0e1bb43639ffbb98854fc09c2ed39dede099dd2d8.scope.
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f493ed320f2136eba98c6f6d73d7580e3273443b9599c34d1438e87453daf45/merged/scripts supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8f493ed320f2136eba98c6f6d73d7580e3273443b9599c34d1438e87453daf45/merged/var/log/collectd supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e3a7790e7cad798695025ef44722873ac2669462e661d130061be9d691861f40/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain podman[61163]: 2026-02-01 08:05:39.90915327 +0000 UTC m=+0.111642668 container create 4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, container_name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:05:39 np0005604215.localdomain podman[61151]: 2026-02-01 08:05:39.809996354 +0000 UTC m=+0.025332924 image pull  registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: Started libpod-conmon-1ab3a2c69b685c0c6816908493618cb01de7af949dad63b760fc7780d0aa1e7e.scope.
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:39 np0005604215.localdomain podman[61162]: 2026-02-01 08:05:39.826390828 +0000 UTC m=+0.030612740 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d315715373fb2ed69473b661022a322c730f5613516f294042e6eac2843e9be/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain podman[61163]: 2026-02-01 08:05:39.828088361 +0000 UTC m=+0.030577749 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d315715373fb2ed69473b661022a322c730f5613516f294042e6eac2843e9be/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d315715373fb2ed69473b661022a322c730f5613516f294042e6eac2843e9be/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: Started libpod-conmon-4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa.scope.
Feb 01 08:05:39 np0005604215.localdomain podman[61186]: 2026-02-01 08:05:39.934011109 +0000 UTC m=+0.119548876 container init 1ab3a2c69b685c0c6816908493618cb01de7af949dad63b760fc7780d0aa1e7e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_statedir_owner, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510)
Feb 01 08:05:39 np0005604215.localdomain podman[61186]: 2026-02-01 08:05:39.839085536 +0000 UTC m=+0.024623353 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd91fbbf62f7f0af7c33a117d6552a6678d20821e3759b8b2c7a56c46d8f5a7c/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd91fbbf62f7f0af7c33a117d6552a6678d20821e3759b8b2c7a56c46d8f5a7c/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd91fbbf62f7f0af7c33a117d6552a6678d20821e3759b8b2c7a56c46d8f5a7c/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd91fbbf62f7f0af7c33a117d6552a6678d20821e3759b8b2c7a56c46d8f5a7c/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd91fbbf62f7f0af7c33a117d6552a6678d20821e3759b8b2c7a56c46d8f5a7c/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd91fbbf62f7f0af7c33a117d6552a6678d20821e3759b8b2c7a56c46d8f5a7c/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain podman[61214]: 2026-02-01 08:05:39.94554943 +0000 UTC m=+0.077352294 container create 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, build-date=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.5, container_name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 01 08:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd91fbbf62f7f0af7c33a117d6552a6678d20821e3759b8b2c7a56c46d8f5a7c/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:39 np0005604215.localdomain podman[61163]: 2026-02-01 08:05:39.949575936 +0000 UTC m=+0.152065324 container init 4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtlogd_wrapper, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true)
Feb 01 08:05:39 np0005604215.localdomain podman[61163]: 2026-02-01 08:05:39.955879793 +0000 UTC m=+0.158369181 container start 4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, container_name=nova_virtlogd_wrapper, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container)
Feb 01 08:05:39 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtlogd_wrapper --cgroupns=host --conmon-pidfile /run/nova_virtlogd_wrapper.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1296029e90a465a2201c8dc6f8be17e7 --label config_id=tripleo_step3 --label container_name=nova_virtlogd_wrapper --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtlogd_wrapper.log --network host --pid host --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro --volume 
/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:39 np0005604215.localdomain podman[61162]: 2026-02-01 08:05:39.964614467 +0000 UTC m=+0.168836369 container init d52030e77d7c02b05ca2a3f0e1bb43639ffbb98854fc09c2ed39dede099dd2d8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, build-date=2026-01-12T23:07:30Z, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1)
Feb 01 08:05:39 np0005604215.localdomain podman[61162]: 2026-02-01 08:05:39.971724069 +0000 UTC m=+0.175945981 container start d52030e77d7c02b05ca2a3f0e1bb43639ffbb98854fc09c2ed39dede099dd2d8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, distribution-scope=public, container_name=ceilometer_init_log, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step3, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:05:39 np0005604215.localdomain podman[61151]: 2026-02-01 08:05:39.974658291 +0000 UTC m=+0.189994861 container init e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, release=1766032510, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_id=tripleo_step3)
Feb 01 08:05:39 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_init_log --conmon-pidfile /run/ceilometer_init_log.pid --detach=True --label config_id=tripleo_step3 --label container_name=ceilometer_init_log --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_init_log.log --network none --user root --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1 /bin/bash -c chown -R ceilometer:ceilometer /var/log/ceilometer
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: libpod-d52030e77d7c02b05ca2a3f0e1bb43639ffbb98854fc09c2ed39dede099dd2d8.scope: Deactivated successfully.
Feb 01 08:05:39 np0005604215.localdomain sudo[61252]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:05:39 np0005604215.localdomain sudo[61266]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:05:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:05:40 np0005604215.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 01 08:05:40 np0005604215.localdomain podman[61186]: 2026-02-01 08:05:40.001726899 +0000 UTC m=+0.187264676 container start 1ab3a2c69b685c0c6816908493618cb01de7af949dad63b760fc7780d0aa1e7e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_statedir_owner, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step3, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, release=1766032510, version=17.1.13, io.openshift.expose-services=, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', 
'/var/lib/container-config-scripts:/container-config-scripts:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 01 08:05:40 np0005604215.localdomain podman[61186]: 2026-02-01 08:05:40.008040477 +0000 UTC m=+0.193578284 container attach 1ab3a2c69b685c0c6816908493618cb01de7af949dad63b760fc7780d0aa1e7e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, container_name=nova_statedir_owner, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:05:40 np0005604215.localdomain podman[61186]: 2026-02-01 08:05:40.012694463 +0000 UTC m=+0.198232260 container died 1ab3a2c69b685c0c6816908493618cb01de7af949dad63b760fc7780d0aa1e7e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step3, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': 
['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']}, container_name=nova_statedir_owner)
Feb 01 08:05:40 np0005604215.localdomain podman[61214]: 2026-02-01 08:05:39.913671331 +0000 UTC m=+0.045474225 image pull  registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: libpod-1ab3a2c69b685c0c6816908493618cb01de7af949dad63b760fc7780d0aa1e7e.scope: Deactivated successfully.
Feb 01 08:05:40 np0005604215.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:05:40 np0005604215.localdomain podman[61151]: 2026-02-01 08:05:40.052214671 +0000 UTC m=+0.267551241 container start e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:05:40 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name collectd --cap-add IPC_LOCK --conmon-pidfile /run/collectd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=d31718fcd17fdeee6489534105191c7a --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=collectd --label managed_by=tripleo_ansible --label config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/collectd.log --memory 512m --network host --pid host --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume 
/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro --volume /var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/collectd:/var/log/collectd:rw,z --volume /var/lib/container-config-scripts:/config-scripts:ro --volume /var/lib/container-user-scripts:/scripts:z --volume /run:/run:rw --volume /sys/fs/cgroup:/sys/fs/cgroup:ro registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1
Feb 01 08:05:40 np0005604215.localdomain podman[61264]: 2026-02-01 08:05:40.09369807 +0000 UTC m=+0.105600399 container died d52030e77d7c02b05ca2a3f0e1bb43639ffbb98854fc09c2ed39dede099dd2d8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_id=tripleo_step3, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_init_log, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:05:40 np0005604215.localdomain podman[61284]: 2026-02-01 08:05:40.072028341 +0000 UTC m=+0.067998150 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step3, distribution-scope=public, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible)
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Started libpod-conmon-11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893.scope.
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Queued start job for default target Main User Target.
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Created slice User Application Slice.
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Started Daily Cleanup of User's Temporary Directories.
Feb 01 08:05:40 np0005604215.localdomain podman[61284]: 2026-02-01 08:05:40.154540445 +0000 UTC m=+0.150510254 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, architecture=x86_64, version=17.1.13, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, container_name=collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Reached target Paths.
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Reached target Timers.
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Starting D-Bus User Message Bus Socket...
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Starting Create User's Volatile Files and Directories...
Feb 01 08:05:40 np0005604215.localdomain podman[61284]: unhealthy
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Listening on D-Bus User Message Bus Socket.
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Reached target Sockets.
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Failed with result 'exit-code'.
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Finished Create User's Volatile Files and Directories.
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Reached target Basic System.
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Reached target Main User Target.
Feb 01 08:05:40 np0005604215.localdomain systemd[61307]: Startup finished in 112ms.
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Started User Manager for UID 0.
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Started Session c1 of User root.
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Started Session c2 of User root.
Feb 01 08:05:40 np0005604215.localdomain sudo[61252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:05:40 np0005604215.localdomain sudo[61266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:05:40 np0005604215.localdomain sudo[61252]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:40 np0005604215.localdomain podman[61297]: 2026-02-01 08:05:40.241147877 +0000 UTC m=+0.219112033 container cleanup 1ab3a2c69b685c0c6816908493618cb01de7af949dad63b760fc7780d0aa1e7e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_statedir_owner, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, container_name=nova_statedir_owner, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step3, tcib_managed=true, config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', 
'/var/lib/container-config-scripts:/container-config-scripts:z']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: session-c1.scope: Deactivated successfully.
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: libpod-conmon-1ab3a2c69b685c0c6816908493618cb01de7af949dad63b760fc7780d0aa1e7e.scope: Deactivated successfully.
Feb 01 08:05:40 np0005604215.localdomain podman[61214]: 2026-02-01 08:05:40.247430725 +0000 UTC m=+0.379233589 container init 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_id=tripleo_step3, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:09Z, container_name=rsyslog, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, io.openshift.expose-services=, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']})
Feb 01 08:05:40 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_statedir_owner --conmon-pidfile /run/nova_statedir_owner.pid --detach=False --env NOVA_STATEDIR_OWNERSHIP_SKIP=triliovault-mounts --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --env __OS_DEBUG=true --label config_id=tripleo_step3 --label container_name=nova_statedir_owner --label managed_by=tripleo_ansible --label config_data={'command': '/container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py', 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': 'triliovault-mounts', 'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690', '__OS_DEBUG': 'true'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'none', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/container-config-scripts:/container-config-scripts:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_statedir_owner.log --network none --privileged=False --security-opt label=disable --user root --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/container-config-scripts:/container-config-scripts:z registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1 /container-config-scripts/pyshim.sh /container-config-scripts/nova_statedir_ownership.py
Feb 01 08:05:40 np0005604215.localdomain sudo[61405]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:05:40 np0005604215.localdomain sudo[61405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:05:40 np0005604215.localdomain sudo[61266]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: session-c2.scope: Deactivated successfully.
Feb 01 08:05:40 np0005604215.localdomain podman[61214]: 2026-02-01 08:05:40.27922007 +0000 UTC m=+0.411022954 container start 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', 
'/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, container_name=rsyslog, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-rsyslog-container)
Feb 01 08:05:40 np0005604215.localdomain podman[61272]: 2026-02-01 08:05:40.281739279 +0000 UTC m=+0.288617480 container cleanup d52030e77d7c02b05ca2a3f0e1bb43639ffbb98854fc09c2ed39dede099dd2d8 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_init_log, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'chown -R ceilometer:ceilometer /var/log/ceilometer'], 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'none', 'start_order': 0, 'user': 'root', 'volumes': ['/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_init_log, io.openshift.expose-services=, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:05:40 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name rsyslog --conmon-pidfile /run/rsyslog.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=52a7bad153b9a3530edb4c6869c1fe7c --label config_id=tripleo_step3 --label container_name=rsyslog --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/rsyslog.log --network host --privileged=True --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:ro --volume /var/log/containers/rsyslog:/var/log/rsyslog:rw,z --volume /var/log:/var/log/host:ro --volume /var/lib/rsyslog.container:/var/lib/rsyslog:rw,z registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: libpod-conmon-d52030e77d7c02b05ca2a3f0e1bb43639ffbb98854fc09c2ed39dede099dd2d8.scope: Deactivated successfully.
Feb 01 08:05:40 np0005604215.localdomain sudo[61405]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: libpod-11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893.scope: Deactivated successfully.
Feb 01 08:05:40 np0005604215.localdomain podman[61462]: 2026-02-01 08:05:40.402097709 +0000 UTC m=+0.031420035 container died 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 rsyslog, 
distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2026-01-12T22:10:09Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:05:40 np0005604215.localdomain podman[61462]: 2026-02-01 08:05:40.423301122 +0000 UTC m=+0.052623428 container cleanup 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-type=git, container_name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, vendor=Red Hat, Inc., config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, build-date=2026-01-12T22:10:09Z, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, summary=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-rsyslog, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: libpod-conmon-11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893.scope: Deactivated successfully.
Feb 01 08:05:40 np0005604215.localdomain podman[61501]: 2026-02-01 08:05:40.548021679 +0000 UTC m=+0.057328557 container create 80e1d00e5bec4cd7f19cd10160562eb6b3744e0bb96cf5719096238b43ba4ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git)
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Started libpod-conmon-80e1d00e5bec4cd7f19cd10160562eb6b3744e0bb96cf5719096238b43ba4ee5.scope.
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79513147d587ecdcd7bb2edc01bb3b7dc549ee20844dd0dc1e7a6b286443d3ff/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79513147d587ecdcd7bb2edc01bb3b7dc549ee20844dd0dc1e7a6b286443d3ff/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79513147d587ecdcd7bb2edc01bb3b7dc549ee20844dd0dc1e7a6b286443d3ff/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/79513147d587ecdcd7bb2edc01bb3b7dc549ee20844dd0dc1e7a6b286443d3ff/merged/var/log/swtpm/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain podman[61501]: 2026-02-01 08:05:40.618606649 +0000 UTC m=+0.127913597 container init 80e1d00e5bec4cd7f19cd10160562eb6b3744e0bb96cf5719096238b43ba4ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt)
Feb 01 08:05:40 np0005604215.localdomain podman[61501]: 2026-02-01 08:05:40.529648503 +0000 UTC m=+0.038955371 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:40 np0005604215.localdomain podman[61501]: 2026-02-01 08:05:40.635107446 +0000 UTC m=+0.144414334 container start 80e1d00e5bec4cd7f19cd10160562eb6b3744e0bb96cf5719096238b43ba4ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e3a7790e7cad798695025ef44722873ac2669462e661d130061be9d691861f40-merged.mount: Deactivated successfully.
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d52030e77d7c02b05ca2a3f0e1bb43639ffbb98854fc09c2ed39dede099dd2d8-userdata-shm.mount: Deactivated successfully.
Feb 01 08:05:40 np0005604215.localdomain podman[61578]: 2026-02-01 08:05:40.859017389 +0000 UTC m=+0.082218147 container create a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, container_name=nova_virtsecretd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Started libpod-conmon-a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3.scope.
Feb 01 08:05:40 np0005604215.localdomain podman[61578]: 2026-02-01 08:05:40.80989558 +0000 UTC m=+0.033096388 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe2e37cef3553dd7ed72567236ba15185ae0f96cf280ad9def2a9cdb2b0b4c7/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe2e37cef3553dd7ed72567236ba15185ae0f96cf280ad9def2a9cdb2b0b4c7/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe2e37cef3553dd7ed72567236ba15185ae0f96cf280ad9def2a9cdb2b0b4c7/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe2e37cef3553dd7ed72567236ba15185ae0f96cf280ad9def2a9cdb2b0b4c7/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe2e37cef3553dd7ed72567236ba15185ae0f96cf280ad9def2a9cdb2b0b4c7/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe2e37cef3553dd7ed72567236ba15185ae0f96cf280ad9def2a9cdb2b0b4c7/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abe2e37cef3553dd7ed72567236ba15185ae0f96cf280ad9def2a9cdb2b0b4c7/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:40 np0005604215.localdomain podman[61578]: 2026-02-01 08:05:40.93184562 +0000 UTC m=+0.155046388 container init a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:31:49Z, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtsecretd, com.redhat.component=openstack-nova-libvirt-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:05:40 np0005604215.localdomain podman[61578]: 2026-02-01 08:05:40.945735155 +0000 UTC m=+0.168935893 container start a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, build-date=2026-01-12T23:31:49Z, container_name=nova_virtsecretd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 01 08:05:40 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtsecretd --cgroupns=host --conmon-pidfile /run/nova_virtsecretd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1296029e90a465a2201c8dc6f8be17e7 --label config_id=tripleo_step3 --label container_name=nova_virtsecretd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtsecretd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:40 np0005604215.localdomain sudo[61597]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:05:40 np0005604215.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 01 08:05:40 np0005604215.localdomain systemd[1]: Started Session c3 of User root.
Feb 01 08:05:41 np0005604215.localdomain sudo[61597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:05:41 np0005604215.localdomain sudo[61597]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: session-c3.scope: Deactivated successfully.
Feb 01 08:05:41 np0005604215.localdomain podman[61711]: 2026-02-01 08:05:41.37600003 +0000 UTC m=+0.070527630 container create 883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_virtnodedevd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public)
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: Started libpod-conmon-883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff.scope.
Feb 01 08:05:41 np0005604215.localdomain podman[61727]: 2026-02-01 08:05:41.434862073 +0000 UTC m=+0.085358674 container create 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1766032510, container_name=iscsid, config_id=tripleo_step3, maintainer=OpenStack TripleO Team)
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b9f50aed1094cdf3c8ae90862135d9821bbb7f673296f42b1c4d115dfdd346a/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b9f50aed1094cdf3c8ae90862135d9821bbb7f673296f42b1c4d115dfdd346a/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b9f50aed1094cdf3c8ae90862135d9821bbb7f673296f42b1c4d115dfdd346a/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b9f50aed1094cdf3c8ae90862135d9821bbb7f673296f42b1c4d115dfdd346a/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b9f50aed1094cdf3c8ae90862135d9821bbb7f673296f42b1c4d115dfdd346a/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b9f50aed1094cdf3c8ae90862135d9821bbb7f673296f42b1c4d115dfdd346a/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b9f50aed1094cdf3c8ae90862135d9821bbb7f673296f42b1c4d115dfdd346a/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:41 np0005604215.localdomain podman[61711]: 2026-02-01 08:05:41.346021782 +0000 UTC m=+0.040549382 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:41 np0005604215.localdomain podman[61711]: 2026-02-01 08:05:41.450161433 +0000 UTC m=+0.144689063 container init 883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, tcib_managed=true, container_name=nova_virtnodedevd, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.expose-services=, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:31:49Z)
Feb 01 08:05:41 np0005604215.localdomain podman[61711]: 2026-02-01 08:05:41.459573887 +0000 UTC m=+0.154101517 container start 883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, release=1766032510, config_id=tripleo_step3, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_virtnodedevd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z)
Feb 01 08:05:41 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtnodedevd --cgroupns=host --conmon-pidfile /run/nova_virtnodedevd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1296029e90a465a2201c8dc6f8be17e7 --label config_id=tripleo_step3 --label container_name=nova_virtnodedevd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtnodedevd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: Started libpod-conmon-28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.scope.
Feb 01 08:05:41 np0005604215.localdomain sudo[61750]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:05:41 np0005604215.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:41 np0005604215.localdomain podman[61727]: 2026-02-01 08:05:41.401104486 +0000 UTC m=+0.051601107 image pull  registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: Started Session c4 of User root.
Feb 01 08:05:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/179e7ed4ab403439e752a2c426c6db4ca9807018662c061e320fe01562a6e116/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:41 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/179e7ed4ab403439e752a2c426c6db4ca9807018662c061e320fe01562a6e116/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:41 np0005604215.localdomain sudo[61750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:05:41 np0005604215.localdomain podman[61727]: 2026-02-01 08:05:41.53625314 +0000 UTC m=+0.186749771 container init 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid)
Feb 01 08:05:41 np0005604215.localdomain sudo[61772]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:05:41 np0005604215.localdomain podman[61727]: 2026-02-01 08:05:41.576919793 +0000 UTC m=+0.227416404 container start 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step3, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:05:41 np0005604215.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 01 08:05:41 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name iscsid --conmon-pidfile /run/iscsid.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=848fbaed99314033c0982eb0cffd8af7 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step3 --label container_name=iscsid --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/iscsid.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: Started Session c5 of User root.
Feb 01 08:05:41 np0005604215.localdomain sudo[61772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:05:41 np0005604215.localdomain sudo[61750]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: session-c4.scope: Deactivated successfully.
Feb 01 08:05:41 np0005604215.localdomain sudo[61772]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: session-c5.scope: Deactivated successfully.
Feb 01 08:05:41 np0005604215.localdomain podman[61774]: 2026-02-01 08:05:41.673344313 +0000 UTC m=+0.083477956 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=starting, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, 
version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, release=1766032510, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 01 08:05:41 np0005604215.localdomain kernel: Loading iSCSI transport class v2.0-870.
Feb 01 08:05:41 np0005604215.localdomain podman[61774]: 2026-02-01 08:05:41.726001551 +0000 UTC m=+0.136135254 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, config_id=tripleo_step3, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, 
maintainer=OpenStack TripleO Team, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:05:41 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:05:42 np0005604215.localdomain podman[61893]: 2026-02-01 08:05:42.106122437 +0000 UTC m=+0.103160102 container create 39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, container_name=nova_virtstoraged, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, config_id=tripleo_step3, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 01 08:05:42 np0005604215.localdomain podman[61893]: 2026-02-01 08:05:42.053190029 +0000 UTC m=+0.050227714 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:42 np0005604215.localdomain systemd[1]: Started libpod-conmon-39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5.scope.
Feb 01 08:05:42 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42956910233e56c0615893b331e8357f0bd5264eb11a7b97d46d18517d01f2f9/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42956910233e56c0615893b331e8357f0bd5264eb11a7b97d46d18517d01f2f9/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42956910233e56c0615893b331e8357f0bd5264eb11a7b97d46d18517d01f2f9/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42956910233e56c0615893b331e8357f0bd5264eb11a7b97d46d18517d01f2f9/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42956910233e56c0615893b331e8357f0bd5264eb11a7b97d46d18517d01f2f9/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42956910233e56c0615893b331e8357f0bd5264eb11a7b97d46d18517d01f2f9/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/42956910233e56c0615893b331e8357f0bd5264eb11a7b97d46d18517d01f2f9/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain podman[61893]: 2026-02-01 08:05:42.196731474 +0000 UTC m=+0.193769149 container init 39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, container_name=nova_virtstoraged, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt)
Feb 01 08:05:42 np0005604215.localdomain podman[61893]: 2026-02-01 08:05:42.210807325 +0000 UTC m=+0.207845000 container start 39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, build-date=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, vcs-type=git, url=https://www.redhat.com, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_id=tripleo_step3, container_name=nova_virtstoraged, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-libvirt, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:05:42 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtstoraged --cgroupns=host --conmon-pidfile /run/nova_virtstoraged.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1296029e90a465a2201c8dc6f8be17e7 --label config_id=tripleo_step3 --label container_name=nova_virtstoraged --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtstoraged.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:42 np0005604215.localdomain sudo[61912]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:05:42 np0005604215.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 01 08:05:42 np0005604215.localdomain systemd[1]: Started Session c6 of User root.
Feb 01 08:05:42 np0005604215.localdomain sudo[61912]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:05:42 np0005604215.localdomain sudo[61912]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:42 np0005604215.localdomain systemd[1]: session-c6.scope: Deactivated successfully.
Feb 01 08:05:42 np0005604215.localdomain podman[61998]: 2026-02-01 08:05:42.665534247 +0000 UTC m=+0.082975230 container create 526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_virtqemud, architecture=x86_64, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, tcib_managed=true, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git, build-date=2026-01-12T23:31:49Z)
Feb 01 08:05:42 np0005604215.localdomain systemd[1]: Started libpod-conmon-526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70.scope.
Feb 01 08:05:42 np0005604215.localdomain podman[61998]: 2026-02-01 08:05:42.614205649 +0000 UTC m=+0.031646652 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:42 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/457ad0364d778031a8ec3d2148346ff43fcb0296666a44f62af8c388a01d2e64/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/457ad0364d778031a8ec3d2148346ff43fcb0296666a44f62af8c388a01d2e64/merged/var/log/swtpm supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/457ad0364d778031a8ec3d2148346ff43fcb0296666a44f62af8c388a01d2e64/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/457ad0364d778031a8ec3d2148346ff43fcb0296666a44f62af8c388a01d2e64/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/457ad0364d778031a8ec3d2148346ff43fcb0296666a44f62af8c388a01d2e64/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/457ad0364d778031a8ec3d2148346ff43fcb0296666a44f62af8c388a01d2e64/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/457ad0364d778031a8ec3d2148346ff43fcb0296666a44f62af8c388a01d2e64/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/457ad0364d778031a8ec3d2148346ff43fcb0296666a44f62af8c388a01d2e64/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:42 np0005604215.localdomain podman[61998]: 2026-02-01 08:05:42.736557671 +0000 UTC m=+0.153998654 container init 526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, release=1766032510, container_name=nova_virtqemud, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:05:42 np0005604215.localdomain podman[61998]: 2026-02-01 08:05:42.745485491 +0000 UTC m=+0.162926474 container start 526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=nova_virtqemud, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 01 08:05:42 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtqemud --cgroupns=host --conmon-pidfile /run/nova_virtqemud.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1296029e90a465a2201c8dc6f8be17e7 --label config_id=tripleo_step3 --label container_name=nova_virtqemud --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtqemud.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume 
/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro --volume /var/log/containers/libvirt/swtpm:/var/log/swtpm:z registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:42 np0005604215.localdomain sudo[62017]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:05:42 np0005604215.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 01 08:05:42 np0005604215.localdomain systemd[1]: Started Session c7 of User root.
Feb 01 08:05:42 np0005604215.localdomain sudo[62017]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:05:42 np0005604215.localdomain systemd[1]: tmp-crun.B8fDCu.mount: Deactivated successfully.
Feb 01 08:05:42 np0005604215.localdomain sudo[62017]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:42 np0005604215.localdomain systemd[1]: session-c7.scope: Deactivated successfully.
Feb 01 08:05:43 np0005604215.localdomain podman[62104]: 2026-02-01 08:05:43.243745636 +0000 UTC m=+0.098635920 container create 3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_id=tripleo_step3, container_name=nova_virtproxyd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, architecture=x86_64)
Feb 01 08:05:43 np0005604215.localdomain podman[62104]: 2026-02-01 08:05:43.189133436 +0000 UTC m=+0.044023800 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:43 np0005604215.localdomain systemd[1]: Started libpod-conmon-3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac.scope.
Feb 01 08:05:43 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:05:43 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/671a12c1b149c45f560a497746a5c06b1baf4bea205bfa54dc10c3d286f5bbce/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:43 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/671a12c1b149c45f560a497746a5c06b1baf4bea205bfa54dc10c3d286f5bbce/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:43 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/671a12c1b149c45f560a497746a5c06b1baf4bea205bfa54dc10c3d286f5bbce/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:43 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/671a12c1b149c45f560a497746a5c06b1baf4bea205bfa54dc10c3d286f5bbce/merged/var/cache/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:43 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/671a12c1b149c45f560a497746a5c06b1baf4bea205bfa54dc10c3d286f5bbce/merged/var/log/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:43 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/671a12c1b149c45f560a497746a5c06b1baf4bea205bfa54dc10c3d286f5bbce/merged/var/lib/vhost_sockets supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:43 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/671a12c1b149c45f560a497746a5c06b1baf4bea205bfa54dc10c3d286f5bbce/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 08:05:43 np0005604215.localdomain podman[62104]: 2026-02-01 08:05:43.318956822 +0000 UTC m=+0.173847136 container init 3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, container_name=nova_virtproxyd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git)
Feb 01 08:05:43 np0005604215.localdomain podman[62104]: 2026-02-01 08:05:43.330217294 +0000 UTC m=+0.185107608 container start 3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtproxyd, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510)
Feb 01 08:05:43 np0005604215.localdomain python3[61002]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_virtproxyd --cgroupns=host --conmon-pidfile /run/nova_virtproxyd.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1296029e90a465a2201c8dc6f8be17e7 --label config_id=tripleo_step3 --label container_name=nova_virtproxyd --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_virtproxyd.log --network host --pid host --pids-limit 65536 --privileged=True --security-opt label=level:s0 --security-opt label=type:spc_t --security-opt label=filetype:container_file_t --ulimit nofile=131072 --ulimit nproc=126960 --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/libvirt:/var/log/libvirt:shared,z --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /run:/run --volume /sys/fs/cgroup:/sys/fs/cgroup --volume /sys/fs/selinux:/sys/fs/selinux --volume /etc/selinux/config:/etc/selinux/config:ro --volume /etc/libvirt:/etc/libvirt:shared --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/cache/libvirt:/var/cache/libvirt:shared --volume /var/lib/vhost_sockets:/var/lib/vhost_sockets --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro 
registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:05:43 np0005604215.localdomain sudo[62123]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:05:43 np0005604215.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 01 08:05:43 np0005604215.localdomain systemd[1]: Started Session c8 of User root.
Feb 01 08:05:43 np0005604215.localdomain sudo[62123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:05:43 np0005604215.localdomain sudo[62123]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:43 np0005604215.localdomain systemd[1]: session-c8.scope: Deactivated successfully.
Feb 01 08:05:43 np0005604215.localdomain sudo[61000]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:43 np0005604215.localdomain sudo[62181]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjxyhgbtdzeawbiwpbblltvcsyfqrjnx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:43 np0005604215.localdomain sudo[62181]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:43 np0005604215.localdomain python3[62183]: ansible-file Invoked with path=/etc/systemd/system/tripleo_collectd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:43 np0005604215.localdomain sudo[62181]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:44 np0005604215.localdomain sudo[62197]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ermppbzdokbkpdcloivlmsbwlldedxhm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:44 np0005604215.localdomain sudo[62197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:44 np0005604215.localdomain python3[62199]: ansible-file Invoked with path=/etc/systemd/system/tripleo_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:44 np0005604215.localdomain sudo[62197]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:44 np0005604215.localdomain sudo[62213]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wyrpunmvwmwaevelaiftloaywpkuybwk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:44 np0005604215.localdomain sudo[62213]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:44 np0005604215.localdomain python3[62215]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:44 np0005604215.localdomain sudo[62213]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:44 np0005604215.localdomain sudo[62229]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etcgvhlmlzqqoofsqsqazmdykvwoyrxh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:44 np0005604215.localdomain sudo[62229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:44 np0005604215.localdomain python3[62231]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:44 np0005604215.localdomain sudo[62229]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:44 np0005604215.localdomain sudo[62245]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnjczvaatkczrcipclqrafbbrzxahgyd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:44 np0005604215.localdomain sudo[62245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:44 np0005604215.localdomain python3[62247]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:44 np0005604215.localdomain sudo[62245]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:45 np0005604215.localdomain sudo[62261]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmhxybrllbpawlcrbqmtntksjuhzvrcp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:45 np0005604215.localdomain sudo[62261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:45 np0005604215.localdomain python3[62263]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:45 np0005604215.localdomain sudo[62261]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:45 np0005604215.localdomain sudo[62277]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywifszfuixagsdeiyhhhuxdtjvxpogeq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:45 np0005604215.localdomain sudo[62277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:45 np0005604215.localdomain python3[62279]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:45 np0005604215.localdomain sudo[62277]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:45 np0005604215.localdomain sudo[62293]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xiacdettujiaespkhpgxbakvqmrbgfie ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:45 np0005604215.localdomain sudo[62293]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:45 np0005604215.localdomain python3[62295]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:45 np0005604215.localdomain sudo[62293]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:45 np0005604215.localdomain sudo[62309]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apiknkhwvmybrdyitjdlntxcwwkkdhom ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:45 np0005604215.localdomain sudo[62309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:46 np0005604215.localdomain python3[62311]: ansible-file Invoked with path=/etc/systemd/system/tripleo_rsyslog.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:46 np0005604215.localdomain sudo[62309]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:46 np0005604215.localdomain sudo[62326]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cguouvghwpybavrkwuqeujmhcxvztszs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:46 np0005604215.localdomain sudo[62326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:46 np0005604215.localdomain python3[62328]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_collectd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:46 np0005604215.localdomain sudo[62326]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:46 np0005604215.localdomain sudo[62342]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocmhbgdnmtegoawtrfhucsnnsdddycun ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:46 np0005604215.localdomain sudo[62342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:46 np0005604215.localdomain python3[62344]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_iscsid_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:46 np0005604215.localdomain sudo[62342]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:46 np0005604215.localdomain sudo[62358]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhvrumlrolxduzddhuqwbnfzjnvnfxco ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:46 np0005604215.localdomain sudo[62358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:46 np0005604215.localdomain python3[62360]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:46 np0005604215.localdomain sudo[62358]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:46 np0005604215.localdomain sudo[62374]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgjyfnbxzdwjvmkavgcyykayzirrmcer ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:46 np0005604215.localdomain sudo[62374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:47 np0005604215.localdomain python3[62376]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:47 np0005604215.localdomain sudo[62374]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:47 np0005604215.localdomain sudo[62390]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftzxbwyclivhmrqtglvaojtvvpjnoxzh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:47 np0005604215.localdomain sudo[62390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:47 np0005604215.localdomain python3[62392]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:47 np0005604215.localdomain sudo[62390]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:47 np0005604215.localdomain sudo[62406]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fufprbokvfnimvdahgjmvjvkwuhjmlmf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:47 np0005604215.localdomain sudo[62406]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:47 np0005604215.localdomain python3[62408]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:47 np0005604215.localdomain sudo[62406]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:47 np0005604215.localdomain sudo[62422]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xemzoxatixevstiybvonwalvekdtatwc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:47 np0005604215.localdomain sudo[62422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:47 np0005604215.localdomain python3[62424]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:47 np0005604215.localdomain sudo[62422]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:47 np0005604215.localdomain sudo[62438]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ggllbnyoznwyfvwgnaehzknxgitdaqlb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:47 np0005604215.localdomain sudo[62438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:48 np0005604215.localdomain python3[62440]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:48 np0005604215.localdomain sudo[62438]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:48 np0005604215.localdomain sudo[62454]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wtmzjbpjkkrgvwybuvemsvqkbytedapc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:48 np0005604215.localdomain sudo[62454]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:48 np0005604215.localdomain python3[62456]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_rsyslog_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:05:48 np0005604215.localdomain sudo[62454]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:48 np0005604215.localdomain sudo[62515]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyrnzrdcisgnbvdirrxabkvbsdxeohxn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:48 np0005604215.localdomain sudo[62515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:48 np0005604215.localdomain python3[62517]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.4357355-99659-3428140139111/source dest=/etc/systemd/system/tripleo_collectd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:49 np0005604215.localdomain sudo[62515]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:49 np0005604215.localdomain sudo[62544]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glrkeayevkpcuhdembdjliqbuqciwcop ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:49 np0005604215.localdomain sudo[62544]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:49 np0005604215.localdomain python3[62546]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.4357355-99659-3428140139111/source dest=/etc/systemd/system/tripleo_iscsid.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:49 np0005604215.localdomain sudo[62544]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:49 np0005604215.localdomain sudo[62573]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trttsgxonsbzvemnoaaczcfhlsrvyhmt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:49 np0005604215.localdomain sudo[62573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:50 np0005604215.localdomain python3[62575]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.4357355-99659-3428140139111/source dest=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:50 np0005604215.localdomain sudo[62573]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:50 np0005604215.localdomain sudo[62602]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryorbbjckujbyurnvwflzflozbafdmyv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:50 np0005604215.localdomain sudo[62602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:50 np0005604215.localdomain python3[62604]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.4357355-99659-3428140139111/source dest=/etc/systemd/system/tripleo_nova_virtnodedevd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:50 np0005604215.localdomain sudo[62602]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:50 np0005604215.localdomain sudo[62631]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uofnoindykeblsjlrslekcithjajpdod ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:50 np0005604215.localdomain sudo[62631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:51 np0005604215.localdomain python3[62633]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.4357355-99659-3428140139111/source dest=/etc/systemd/system/tripleo_nova_virtproxyd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:51 np0005604215.localdomain sudo[62631]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:51 np0005604215.localdomain sudo[62660]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sahnhwsbacuohanldcdshlmefhtppznl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:51 np0005604215.localdomain sudo[62660]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:51 np0005604215.localdomain python3[62662]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.4357355-99659-3428140139111/source dest=/etc/systemd/system/tripleo_nova_virtqemud.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:51 np0005604215.localdomain sudo[62660]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:52 np0005604215.localdomain sudo[62689]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-duovcjcqujvbphfskdosfyxzviuofoxh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:52 np0005604215.localdomain sudo[62689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:53 np0005604215.localdomain python3[62691]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.4357355-99659-3428140139111/source dest=/etc/systemd/system/tripleo_nova_virtsecretd.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:53 np0005604215.localdomain sudo[62689]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:53 np0005604215.localdomain sudo[62718]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhxuutsddglrthpursiyvfblvhbkziul ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:53 np0005604215.localdomain sudo[62718]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:53 np0005604215.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Activating special unit Exit the Session...
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Stopped target Main User Target.
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Stopped target Basic System.
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Stopped target Paths.
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Stopped target Sockets.
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Stopped target Timers.
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Closed D-Bus User Message Bus Socket.
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Stopped Create User's Volatile Files and Directories.
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Removed slice User Application Slice.
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Reached target Shutdown.
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Finished Exit the Session.
Feb 01 08:05:53 np0005604215.localdomain systemd[61307]: Reached target Exit the Session.
Feb 01 08:05:53 np0005604215.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 01 08:05:53 np0005604215.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 01 08:05:53 np0005604215.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 01 08:05:53 np0005604215.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 01 08:05:53 np0005604215.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 01 08:05:53 np0005604215.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 01 08:05:53 np0005604215.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 01 08:05:53 np0005604215.localdomain python3[62720]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.4357355-99659-3428140139111/source dest=/etc/systemd/system/tripleo_nova_virtstoraged.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:53 np0005604215.localdomain sudo[62718]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:53 np0005604215.localdomain sudo[62751]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owhsqnamclonhnubnwgoeyfdlrrmcvlv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:53 np0005604215.localdomain sudo[62751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:54 np0005604215.localdomain python3[62753]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933148.4357355-99659-3428140139111/source dest=/etc/systemd/system/tripleo_rsyslog.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:05:54 np0005604215.localdomain sudo[62751]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:54 np0005604215.localdomain sudo[62767]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scfdahkelkhtphelzbxkupepkpyjkltz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:54 np0005604215.localdomain sudo[62767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:54 np0005604215.localdomain python3[62769]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 08:05:54 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:05:54 np0005604215.localdomain systemd-rc-local-generator[62794]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:05:54 np0005604215.localdomain systemd-sysv-generator[62799]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:05:54 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:05:54 np0005604215.localdomain sudo[62767]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:55 np0005604215.localdomain sudo[62819]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkblqspuxkuuccclszhctwkwyacdqagx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:55 np0005604215.localdomain sudo[62819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:55 np0005604215.localdomain python3[62821]: ansible-systemd Invoked with state=restarted name=tripleo_collectd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:05:55 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:05:55 np0005604215.localdomain systemd-rc-local-generator[62851]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:05:55 np0005604215.localdomain systemd-sysv-generator[62854]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:05:55 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:05:55 np0005604215.localdomain systemd[1]: Starting collectd container...
Feb 01 08:05:55 np0005604215.localdomain systemd[1]: Started collectd container.
Feb 01 08:05:55 np0005604215.localdomain sudo[62819]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:56 np0005604215.localdomain sudo[62886]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmtoelgdsjmcezpmlhgwkrlobkhugnwp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:56 np0005604215.localdomain sudo[62886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:56 np0005604215.localdomain python3[62888]: ansible-systemd Invoked with state=restarted name=tripleo_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:05:56 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:05:56 np0005604215.localdomain systemd-sysv-generator[62921]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:05:56 np0005604215.localdomain systemd-rc-local-generator[62916]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:05:56 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:05:56 np0005604215.localdomain systemd[1]: Starting iscsid container...
Feb 01 08:05:56 np0005604215.localdomain systemd[1]: Started iscsid container.
Feb 01 08:05:56 np0005604215.localdomain sudo[62886]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:57 np0005604215.localdomain sudo[62953]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqurlylqghkgnmmacaiqnhoxxcdqiwrc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:57 np0005604215.localdomain sudo[62953]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:57 np0005604215.localdomain python3[62955]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtlogd_wrapper.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:05:58 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:05:58 np0005604215.localdomain systemd-sysv-generator[62986]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:05:58 np0005604215.localdomain systemd-rc-local-generator[62982]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:05:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:05:58 np0005604215.localdomain systemd[1]: Starting nova_virtlogd_wrapper container...
Feb 01 08:05:58 np0005604215.localdomain systemd[1]: Started nova_virtlogd_wrapper container.
Feb 01 08:05:58 np0005604215.localdomain sudo[62953]: pam_unix(sudo:session): session closed for user root
Feb 01 08:05:59 np0005604215.localdomain sudo[63021]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlxaygwuicvzteyhlxgummprvhbkjbbm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:05:59 np0005604215.localdomain sudo[63021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:05:59 np0005604215.localdomain python3[63023]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtnodedevd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:05:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:05:59 np0005604215.localdomain podman[63025]: 2026-02-01 08:05:59.857383858 +0000 UTC m=+0.074421232 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, architecture=x86_64)
Feb 01 08:06:00 np0005604215.localdomain podman[63025]: 2026-02-01 08:06:00.038528411 +0000 UTC m=+0.255565865 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git)
Feb 01 08:06:00 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:06:00 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:06:00 np0005604215.localdomain systemd-rc-local-generator[63086]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:06:00 np0005604215.localdomain systemd-sysv-generator[63089]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:06:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:06:00 np0005604215.localdomain systemd[1]: Starting nova_virtnodedevd container...
Feb 01 08:06:00 np0005604215.localdomain tripleo-start-podman-container[63092]: Creating additional drop-in dependency for "nova_virtnodedevd" (883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff)
Feb 01 08:06:00 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:06:00 np0005604215.localdomain systemd-sysv-generator[63153]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:06:00 np0005604215.localdomain systemd-rc-local-generator[63148]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:06:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:06:01 np0005604215.localdomain systemd[1]: Started nova_virtnodedevd container.
Feb 01 08:06:01 np0005604215.localdomain sudo[63021]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:01 np0005604215.localdomain sudo[63173]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkntgpqdgebnhusvvjbngtljpwjgmefp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:06:01 np0005604215.localdomain sudo[63173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:06:01 np0005604215.localdomain python3[63175]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtproxyd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:06:01 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:06:01 np0005604215.localdomain systemd-sysv-generator[63207]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:06:01 np0005604215.localdomain systemd-rc-local-generator[63202]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:06:02 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:06:02 np0005604215.localdomain systemd[1]: Starting nova_virtproxyd container...
Feb 01 08:06:02 np0005604215.localdomain tripleo-start-podman-container[63215]: Creating additional drop-in dependency for "nova_virtproxyd" (3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac)
Feb 01 08:06:02 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:06:02 np0005604215.localdomain systemd-sysv-generator[63277]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:06:02 np0005604215.localdomain systemd-rc-local-generator[63271]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:06:02 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:06:02 np0005604215.localdomain systemd[1]: Started nova_virtproxyd container.
Feb 01 08:06:02 np0005604215.localdomain sudo[63173]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:03 np0005604215.localdomain sudo[63297]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stqpeqinhwcrpzaefvrmqimdxbktpgxk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:06:03 np0005604215.localdomain sudo[63297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:06:03 np0005604215.localdomain python3[63299]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtqemud.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:06:03 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:06:03 np0005604215.localdomain systemd-rc-local-generator[63329]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:06:03 np0005604215.localdomain systemd-sysv-generator[63332]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:06:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:06:03 np0005604215.localdomain systemd[1]: Starting nova_virtqemud container...
Feb 01 08:06:03 np0005604215.localdomain tripleo-start-podman-container[63339]: Creating additional drop-in dependency for "nova_virtqemud" (526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70)
Feb 01 08:06:03 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:06:03 np0005604215.localdomain systemd-rc-local-generator[63395]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:06:03 np0005604215.localdomain systemd-sysv-generator[63399]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:06:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:06:04 np0005604215.localdomain systemd[1]: Started nova_virtqemud container.
Feb 01 08:06:04 np0005604215.localdomain sudo[63297]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:04 np0005604215.localdomain sudo[63422]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iicxpgpwqntiaiqjtaautxrervqdmuzg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:06:04 np0005604215.localdomain sudo[63422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:06:04 np0005604215.localdomain python3[63424]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtsecretd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:06:04 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:06:04 np0005604215.localdomain systemd-sysv-generator[63454]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:06:04 np0005604215.localdomain systemd-rc-local-generator[63451]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:06:04 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:06:04 np0005604215.localdomain systemd[1]: Starting nova_virtsecretd container...
Feb 01 08:06:05 np0005604215.localdomain tripleo-start-podman-container[63464]: Creating additional drop-in dependency for "nova_virtsecretd" (a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3)
Feb 01 08:06:05 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:06:05 np0005604215.localdomain systemd-sysv-generator[63528]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:06:05 np0005604215.localdomain systemd-rc-local-generator[63523]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:06:05 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:06:05 np0005604215.localdomain systemd[1]: Started nova_virtsecretd container.
Feb 01 08:06:05 np0005604215.localdomain sudo[63422]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:05 np0005604215.localdomain sudo[63548]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxwoknxbcwxggczlshjdegsgryemxwdw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:06:05 np0005604215.localdomain sudo[63548]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:06:05 np0005604215.localdomain python3[63550]: ansible-systemd Invoked with state=restarted name=tripleo_nova_virtstoraged.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:06:06 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:06:06 np0005604215.localdomain systemd-sysv-generator[63578]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:06:06 np0005604215.localdomain systemd-rc-local-generator[63573]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:06:06 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:06:06 np0005604215.localdomain systemd[1]: Starting nova_virtstoraged container...
Feb 01 08:06:06 np0005604215.localdomain tripleo-start-podman-container[63590]: Creating additional drop-in dependency for "nova_virtstoraged" (39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5)
Feb 01 08:06:06 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:06:06 np0005604215.localdomain systemd-rc-local-generator[63645]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:06:06 np0005604215.localdomain systemd-sysv-generator[63649]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:06:06 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:06:06 np0005604215.localdomain systemd[1]: Started nova_virtstoraged container.
Feb 01 08:06:06 np0005604215.localdomain sudo[63548]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:07 np0005604215.localdomain sudo[63672]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prsjieaogjslmholxafrkwjesswuescf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:06:07 np0005604215.localdomain sudo[63672]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:06:07 np0005604215.localdomain python3[63674]: ansible-systemd Invoked with state=restarted name=tripleo_rsyslog.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:06:07 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:06:07 np0005604215.localdomain systemd-rc-local-generator[63700]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:06:07 np0005604215.localdomain systemd-sysv-generator[63706]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:06:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:06:07 np0005604215.localdomain systemd[1]: Starting rsyslog container...
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:06:08 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:06:08 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:06:08 np0005604215.localdomain podman[63714]: 2026-02-01 08:06:08.036129448 +0000 UTC m=+0.119578417 container init 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.openshift.expose-services=, tcib_managed=true, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, version=17.1.13, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z)
Feb 01 08:06:08 np0005604215.localdomain podman[63714]: 2026-02-01 08:06:08.046365669 +0000 UTC m=+0.129814648 container start 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:09Z, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, build-date=2026-01-12T22:10:09Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13)
Feb 01 08:06:08 np0005604215.localdomain podman[63714]: rsyslog
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: Started rsyslog container.
Feb 01 08:06:08 np0005604215.localdomain sudo[63733]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:06:08 np0005604215.localdomain sudo[63733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:06:08 np0005604215.localdomain sudo[63672]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:08 np0005604215.localdomain sudo[63733]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: libpod-11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893.scope: Deactivated successfully.
Feb 01 08:06:08 np0005604215.localdomain podman[63747]: 2026-02-01 08:06:08.207859546 +0000 UTC m=+0.050464451 container died 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.component=openstack-rsyslog-container, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=rsyslog)
Feb 01 08:06:08 np0005604215.localdomain podman[63747]: 2026-02-01 08:06:08.23032886 +0000 UTC m=+0.072933695 container cleanup 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, container_name=rsyslog, name=rhosp-rhel9/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step3, batch=17.1_20260112.1)
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:06:08 np0005604215.localdomain podman[63763]: 2026-02-01 08:06:08.315150217 +0000 UTC m=+0.059902818 container cleanup 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog, build-date=2026-01-12T22:10:09Z, tcib_managed=true, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible)
Feb 01 08:06:08 np0005604215.localdomain podman[63763]: rsyslog
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 01 08:06:08 np0005604215.localdomain sudo[63788]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-endxbstjgebzxzaqebrvhovrapcalzil ; /usr/bin/python3
Feb 01 08:06:08 np0005604215.localdomain sudo[63788]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:06:08 np0005604215.localdomain python3[63790]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks3.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 1.
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: Stopped rsyslog container.
Feb 01 08:06:08 np0005604215.localdomain sudo[63788]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: Starting rsyslog container...
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:06:08 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:06:08 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:06:08 np0005604215.localdomain podman[63791]: 2026-02-01 08:06:08.625046832 +0000 UTC m=+0.090947319 container init 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, container_name=rsyslog, config_id=tripleo_step3, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container)
Feb 01 08:06:08 np0005604215.localdomain podman[63791]: 2026-02-01 08:06:08.633336052 +0000 UTC m=+0.099236529 container start 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_id=tripleo_step3, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, org.opencontainers.image.created=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510)
Feb 01 08:06:08 np0005604215.localdomain podman[63791]: rsyslog
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: Started rsyslog container.
Feb 01 08:06:08 np0005604215.localdomain sudo[63811]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:06:08 np0005604215.localdomain sudo[63811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:06:08 np0005604215.localdomain sudo[63811]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: libpod-11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893.scope: Deactivated successfully.
Feb 01 08:06:08 np0005604215.localdomain podman[63814]: 2026-02-01 08:06:08.784086183 +0000 UTC m=+0.049561503 container died 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=rsyslog, com.redhat.component=openstack-rsyslog-container, tcib_managed=true, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, name=rhosp-rhel9/openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_id=tripleo_step3, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, io.openshift.expose-services=)
Feb 01 08:06:08 np0005604215.localdomain podman[63814]: 2026-02-01 08:06:08.808852649 +0000 UTC m=+0.074327929 container cleanup 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, name=rhosp-rhel9/openstack-rsyslog, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, architecture=x86_64, build-date=2026-01-12T22:10:09Z, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, version=17.1.13, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:06:08 np0005604215.localdomain podman[63841]: 2026-02-01 08:06:08.893634554 +0000 UTC m=+0.058222675 container cleanup 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:09Z, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, distribution-scope=public, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=rsyslog, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, tcib_managed=true, build-date=2026-01-12T22:10:09Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-rsyslog, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:06:08 np0005604215.localdomain podman[63841]: rsyslog
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0-merged.mount: Deactivated successfully.
Feb 01 08:06:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893-userdata-shm.mount: Deactivated successfully.
Feb 01 08:06:08 np0005604215.localdomain sudo[63883]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ceizfhkkxjspperduufykcxlnunwyrga ; /usr/bin/python3
Feb 01 08:06:08 np0005604215.localdomain sudo[63883]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 2.
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: Stopped rsyslog container.
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: Starting rsyslog container...
Feb 01 08:06:09 np0005604215.localdomain sudo[63883]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:06:09 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:06:09 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:06:09 np0005604215.localdomain podman[63886]: 2026-02-01 08:06:09.159310025 +0000 UTC m=+0.116575833 container init 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, container_name=rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:09Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, name=rhosp-rhel9/openstack-rsyslog, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, distribution-scope=public, vcs-type=git)
Feb 01 08:06:09 np0005604215.localdomain podman[63886]: 2026-02-01 08:06:09.165518119 +0000 UTC m=+0.122783957 container start 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 rsyslog, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, container_name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:09Z, 
com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, build-date=2026-01-12T22:10:09Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog)
Feb 01 08:06:09 np0005604215.localdomain podman[63886]: rsyslog
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: Started rsyslog container.
Feb 01 08:06:09 np0005604215.localdomain sudo[63905]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:06:09 np0005604215.localdomain sudo[63905]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:06:09 np0005604215.localdomain sudo[63905]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: libpod-11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893.scope: Deactivated successfully.
Feb 01 08:06:09 np0005604215.localdomain podman[63929]: 2026-02-01 08:06:09.298197135 +0000 UTC m=+0.032626403 container died 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, org.opencontainers.image.created=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=rsyslog, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 rsyslog, tcib_managed=true, config_id=tripleo_step3, 
build-date=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, com.redhat.component=openstack-rsyslog-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-rsyslog, architecture=x86_64, release=1766032510)
Feb 01 08:06:09 np0005604215.localdomain podman[63929]: 2026-02-01 08:06:09.318778939 +0000 UTC m=+0.053208177 container cleanup 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-rsyslog, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack 
Platform 17.1 rsyslog, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:06:09 np0005604215.localdomain sudo[63970]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stndrqhrpasnasmyeuplbswxtubydzzi ; /usr/bin/python3
Feb 01 08:06:09 np0005604215.localdomain sudo[63970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:06:09 np0005604215.localdomain podman[63948]: 2026-02-01 08:06:09.405674451 +0000 UTC m=+0.058722961 container cleanup 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:09Z, architecture=x86_64, io.buildah.version=1.41.5, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.component=openstack-rsyslog-container, description=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, tcib_managed=true, build-date=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=rsyslog, version=17.1.13)
Feb 01 08:06:09 np0005604215.localdomain podman[63948]: rsyslog
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 01 08:06:09 np0005604215.localdomain sudo[63970]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 3.
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: Stopped rsyslog container.
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: Starting rsyslog container...
Feb 01 08:06:09 np0005604215.localdomain sudo[64003]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwvzlflosxnksekskbhgtcjwcdwvusxb ; /usr/bin/python3
Feb 01 08:06:09 np0005604215.localdomain sudo[64003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:06:09 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:06:09 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:06:09 np0005604215.localdomain podman[64004]: 2026-02-01 08:06:09.898859167 +0000 UTC m=+0.107208139 container init 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.created=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, com.redhat.component=openstack-rsyslog-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T22:10:09Z, distribution-scope=public, release=1766032510, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, container_name=rsyslog, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:06:09 np0005604215.localdomain podman[64004]: 2026-02-01 08:06:09.910057277 +0000 UTC m=+0.118406239 container start 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:09Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, url=https://www.redhat.com, container_name=rsyslog, architecture=x86_64, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-rsyslog, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, distribution-scope=public)
Feb 01 08:06:09 np0005604215.localdomain podman[64004]: rsyslog
Feb 01 08:06:09 np0005604215.localdomain systemd[1]: Started rsyslog container.
Feb 01 08:06:09 np0005604215.localdomain sudo[64026]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:06:09 np0005604215.localdomain sudo[64026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:06:09 np0005604215.localdomain python3[64015]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks3.json short_hostname=np0005604215 step=3 update_config_hash_only=False
Feb 01 08:06:09 np0005604215.localdomain sudo[64026]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:09 np0005604215.localdomain sudo[64003]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: libpod-11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893.scope: Deactivated successfully.
Feb 01 08:06:10 np0005604215.localdomain podman[64029]: 2026-02-01 08:06:10.041012309 +0000 UTC m=+0.030010931 container died 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, com.redhat.component=openstack-rsyslog-container, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:09Z)
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: tmp-crun.Y1zWfX.mount: Deactivated successfully.
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893-userdata-shm.mount: Deactivated successfully.
Feb 01 08:06:10 np0005604215.localdomain podman[64029]: 2026-02-01 08:06:10.086410771 +0000 UTC m=+0.075409413 container cleanup 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=rsyslog, build-date=2026-01-12T22:10:09Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', 
'/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-rsyslog-container, version=17.1.13, url=https://www.redhat.com, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:09Z)
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:06:10 np0005604215.localdomain podman[64043]: 2026-02-01 08:06:10.174473128 +0000 UTC m=+0.056120308 container cleanup 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, com.redhat.component=openstack-rsyslog-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T22:10:09Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, vcs-type=git, container_name=rsyslog, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog, 
org.opencontainers.image.created=2026-01-12T22:10:09Z, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Feb 01 08:06:10 np0005604215.localdomain podman[64043]: rsyslog
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:06:10 np0005604215.localdomain podman[64055]: 2026-02-01 08:06:10.284914178 +0000 UTC m=+0.083496256 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=starting, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, container_name=collectd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public)
Feb 01 08:06:10 np0005604215.localdomain podman[64055]: 2026-02-01 08:06:10.300695201 +0000 UTC m=+0.099277259 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd)
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 4.
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: Stopped rsyslog container.
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: Starting rsyslog container...
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:06:10 np0005604215.localdomain sudo[64099]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buxtihtihzwddffhsqsiskunpsvrvgvv ; /usr/bin/python3
Feb 01 08:06:10 np0005604215.localdomain sudo[64099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:06:10 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/lib/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:06:10 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0/merged/var/log/rsyslog supports timestamps until 2038 (0x7fffffff)
Feb 01 08:06:10 np0005604215.localdomain podman[64075]: 2026-02-01 08:06:10.451383071 +0000 UTC m=+0.113429943 container init 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, container_name=rsyslog, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 rsyslog, description=Red Hat OpenStack Platform 17.1 rsyslog, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:09Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-rsyslog, tcib_managed=true, batch=17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']})
Feb 01 08:06:10 np0005604215.localdomain podman[64075]: 2026-02-01 08:06:10.460630951 +0000 UTC m=+0.122677823 container start 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=rsyslog, io.openshift.expose-services=, build-date=2026-01-12T22:10:09Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 rsyslog, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-rsyslog, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:09Z, summary=Red Hat OpenStack Platform 17.1 rsyslog, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']})
Feb 01 08:06:10 np0005604215.localdomain podman[64075]: rsyslog
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: Started rsyslog container.
Feb 01 08:06:10 np0005604215.localdomain sudo[64108]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:06:10 np0005604215.localdomain sudo[64108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:06:10 np0005604215.localdomain sudo[64108]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: libpod-11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893.scope: Deactivated successfully.
Feb 01 08:06:10 np0005604215.localdomain python3[64104]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:06:10 np0005604215.localdomain sudo[64099]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:10 np0005604215.localdomain podman[64111]: 2026-02-01 08:06:10.641461994 +0000 UTC m=+0.073907895 container died 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, summary=Red Hat OpenStack Platform 17.1 rsyslog, build-date=2026-01-12T22:10:09Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:09Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-rsyslog-container, container_name=rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, 
io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, name=rhosp-rhel9/openstack-rsyslog, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, version=17.1.13, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510)
Feb 01 08:06:10 np0005604215.localdomain podman[64111]: 2026-02-01 08:06:10.66111117 +0000 UTC m=+0.093557041 container cleanup 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, description=Red Hat OpenStack Platform 17.1 rsyslog, vendor=Red Hat, Inc., 
release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, com.redhat.component=openstack-rsyslog-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 rsyslog, architecture=x86_64, build-date=2026-01-12T22:10:09Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=rsyslog, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, tcib_managed=true, name=rhosp-rhel9/openstack-rsyslog, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git)
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:06:10 np0005604215.localdomain podman[64124]: 2026-02-01 08:06:10.752789941 +0000 UTC m=+0.058185323 container cleanup 11539428b9cd8a9a78f770d67692d57fdd3335d9c7969cf7a5a0f92decd1e893 (image=registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1, name=rsyslog, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-rsyslog, architecture=x86_64, distribution-scope=public, tcib_managed=true, container_name=rsyslog, version=17.1.13, build-date=2026-01-12T22:10:09Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 rsyslog, name=rhosp-rhel9/openstack-rsyslog, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-rsyslog-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 rsyslog, release=1766032510, summary=Red Hat OpenStack Platform 17.1 rsyslog, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 rsyslog, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '52a7bad153b9a3530edb4c6869c1fe7c'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-rsyslog:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/rsyslog.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/rsyslog:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:ro', '/var/log/containers/rsyslog:/var/log/rsyslog:rw,z', '/var/log:/var/log/host:ro', '/var/lib/rsyslog.container:/var/lib/rsyslog:rw,z']}, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:09Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:06:10 np0005604215.localdomain podman[64124]: rsyslog
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 01 08:06:10 np0005604215.localdomain sudo[64150]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkqwnnqpgbwosqouiwbotipfrixptoxh ; /usr/bin/python3
Feb 01 08:06:10 np0005604215.localdomain sudo[64150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0-merged.mount: Deactivated successfully.
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Scheduled restart job, restart counter is at 5.
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: Stopped rsyslog container.
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Start request repeated too quickly.
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: tripleo_rsyslog.service: Failed with result 'exit-code'.
Feb 01 08:06:10 np0005604215.localdomain systemd[1]: Failed to start rsyslog container.
Feb 01 08:06:11 np0005604215.localdomain python3[64152]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_3 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 01 08:06:11 np0005604215.localdomain sudo[64150]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:06:11 np0005604215.localdomain podman[64153]: 2026-02-01 08:06:11.858605774 +0000 UTC m=+0.074771893 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, release=1766032510, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 01 08:06:11 np0005604215.localdomain podman[64153]: 2026-02-01 08:06:11.872698996 +0000 UTC m=+0.088865135 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, version=17.1.13, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, build-date=2026-01-12T22:34:43Z)
Feb 01 08:06:11 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:06:20 np0005604215.localdomain sudo[64172]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:06:20 np0005604215.localdomain sudo[64172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:06:20 np0005604215.localdomain sudo[64172]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:20 np0005604215.localdomain sudo[64187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:06:20 np0005604215.localdomain sudo[64187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:06:20 np0005604215.localdomain sudo[64187]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:21 np0005604215.localdomain sudo[64234]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:06:21 np0005604215.localdomain sudo[64234]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:06:21 np0005604215.localdomain sudo[64234]: pam_unix(sudo:session): session closed for user root
Feb 01 08:06:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:06:30 np0005604215.localdomain systemd[1]: tmp-crun.pEXJrP.mount: Deactivated successfully.
Feb 01 08:06:30 np0005604215.localdomain podman[64249]: 2026-02-01 08:06:30.863353912 +0000 UTC m=+0.082665951 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, vcs-type=git, config_id=tripleo_step1)
Feb 01 08:06:31 np0005604215.localdomain podman[64249]: 2026-02-01 08:06:31.078911443 +0000 UTC m=+0.298223452 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 01 08:06:31 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:06:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:06:40 np0005604215.localdomain podman[64278]: 2026-02-01 08:06:40.860006395 +0000 UTC m=+0.077687314 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, container_name=collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 01 08:06:40 np0005604215.localdomain podman[64278]: 2026-02-01 08:06:40.868735658 +0000 UTC m=+0.086416557 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat 
OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1)
Feb 01 08:06:40 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:06:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:06:42 np0005604215.localdomain systemd[1]: tmp-crun.DzxCa5.mount: Deactivated successfully.
Feb 01 08:06:42 np0005604215.localdomain podman[64299]: 2026-02-01 08:06:42.865481304 +0000 UTC m=+0.080933426 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, build-date=2026-01-12T22:34:43Z)
Feb 01 08:06:42 np0005604215.localdomain podman[64299]: 2026-02-01 08:06:42.879627247 +0000 UTC m=+0.095079349 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, io.openshift.expose-services=, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 
17.1_20260112.1, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, managed_by=tripleo_ansible)
Feb 01 08:06:42 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:07:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:07:01 np0005604215.localdomain podman[64318]: 2026-02-01 08:07:01.861983566 +0000 UTC m=+0.080896105 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., config_id=tripleo_step1, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, 
batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:07:02 np0005604215.localdomain podman[64318]: 2026-02-01 08:07:02.063818667 +0000 UTC m=+0.282731256 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step1, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:07:02 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:07:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:07:11 np0005604215.localdomain systemd[1]: tmp-crun.1bCTto.mount: Deactivated successfully.
Feb 01 08:07:11 np0005604215.localdomain podman[64347]: 2026-02-01 08:07:11.875653884 +0000 UTC m=+0.088516844 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, container_name=collectd, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container)
Feb 01 08:07:11 np0005604215.localdomain podman[64347]: 2026-02-01 08:07:11.91479675 +0000 UTC m=+0.127659730 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd)
Feb 01 08:07:11 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:07:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:07:13 np0005604215.localdomain podman[64369]: 2026-02-01 08:07:13.866482744 +0000 UTC m=+0.082688200 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, url=https://www.redhat.com, 
version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:07:13 np0005604215.localdomain podman[64369]: 2026-02-01 08:07:13.87497218 +0000 UTC m=+0.091177596 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, release=1766032510, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:07:13 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:07:21 np0005604215.localdomain sudo[64388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:07:21 np0005604215.localdomain sudo[64388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:07:21 np0005604215.localdomain sudo[64388]: pam_unix(sudo:session): session closed for user root
Feb 01 08:07:21 np0005604215.localdomain sudo[64403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:07:21 np0005604215.localdomain sudo[64403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:07:22 np0005604215.localdomain sudo[64403]: pam_unix(sudo:session): session closed for user root
Feb 01 08:07:22 np0005604215.localdomain sudo[64449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:07:22 np0005604215.localdomain sudo[64449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:07:23 np0005604215.localdomain sudo[64449]: pam_unix(sudo:session): session closed for user root
Feb 01 08:07:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:07:32 np0005604215.localdomain podman[64464]: 2026-02-01 08:07:32.866379412 +0000 UTC m=+0.080836013 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, batch=17.1_20260112.1)
Feb 01 08:07:33 np0005604215.localdomain podman[64464]: 2026-02-01 08:07:33.05664721 +0000 UTC m=+0.271103771 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, container_name=metrics_qdr, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:07:33 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:07:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:07:42 np0005604215.localdomain systemd[1]: tmp-crun.qfVPv1.mount: Deactivated successfully.
Feb 01 08:07:42 np0005604215.localdomain podman[64493]: 2026-02-01 08:07:42.866915188 +0000 UTC m=+0.086910432 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, release=1766032510, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:07:42 np0005604215.localdomain podman[64493]: 2026-02-01 08:07:42.876246181 +0000 UTC m=+0.096241475 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, release=1766032510, config_id=tripleo_step3, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:07:42 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:07:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:07:44 np0005604215.localdomain systemd[1]: tmp-crun.IVH0VS.mount: Deactivated successfully.
Feb 01 08:07:44 np0005604215.localdomain podman[64513]: 2026-02-01 08:07:44.866764722 +0000 UTC m=+0.082749943 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, config_id=tripleo_step3, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, tcib_managed=true)
Feb 01 08:07:44 np0005604215.localdomain podman[64513]: 2026-02-01 08:07:44.90470392 +0000 UTC m=+0.120689151 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T22:34:43Z)
Feb 01 08:07:44 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:08:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:08:03 np0005604215.localdomain systemd[1]: tmp-crun.0xpmL8.mount: Deactivated successfully.
Feb 01 08:08:03 np0005604215.localdomain podman[64531]: 2026-02-01 08:08:03.862859599 +0000 UTC m=+0.079399827 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step1, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd)
Feb 01 08:08:04 np0005604215.localdomain podman[64531]: 2026-02-01 08:08:04.057759054 +0000 UTC m=+0.274299282 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, config_id=tripleo_step1, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 01 08:08:04 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:08:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:08:13 np0005604215.localdomain podman[64560]: 2026-02-01 08:08:13.862257929 +0000 UTC m=+0.078435617 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, tcib_managed=true, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:08:13 np0005604215.localdomain podman[64560]: 2026-02-01 08:08:13.87091151 +0000 UTC m=+0.087089268 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-type=git, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13)
Feb 01 08:08:13 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:08:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:08:15 np0005604215.localdomain systemd[1]: tmp-crun.WXcbD0.mount: Deactivated successfully.
Feb 01 08:08:15 np0005604215.localdomain podman[64580]: 2026-02-01 08:08:15.865405326 +0000 UTC m=+0.081221694 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, 
io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 01 08:08:15 np0005604215.localdomain podman[64580]: 2026-02-01 08:08:15.874656536 +0000 UTC m=+0.090472924 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, config_id=tripleo_step3, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, version=17.1.13)
Feb 01 08:08:15 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:08:23 np0005604215.localdomain sudo[64597]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:08:23 np0005604215.localdomain sudo[64597]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:08:23 np0005604215.localdomain sudo[64597]: pam_unix(sudo:session): session closed for user root
Feb 01 08:08:23 np0005604215.localdomain sudo[64612]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:08:23 np0005604215.localdomain sudo[64612]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:08:23 np0005604215.localdomain sudo[64612]: pam_unix(sudo:session): session closed for user root
Feb 01 08:08:24 np0005604215.localdomain sudo[64658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:08:24 np0005604215.localdomain sudo[64658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:08:24 np0005604215.localdomain sudo[64658]: pam_unix(sudo:session): session closed for user root
Feb 01 08:08:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:08:34 np0005604215.localdomain systemd[1]: tmp-crun.PA9IEb.mount: Deactivated successfully.
Feb 01 08:08:34 np0005604215.localdomain podman[64673]: 2026-02-01 08:08:34.873332545 +0000 UTC m=+0.089787354 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:08:35 np0005604215.localdomain podman[64673]: 2026-02-01 08:08:35.042557764 +0000 UTC m=+0.259012483 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:08:35 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:08:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:08:44 np0005604215.localdomain podman[64702]: 2026-02-01 08:08:44.871586159 +0000 UTC m=+0.083716063 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team)
Feb 01 08:08:44 np0005604215.localdomain podman[64702]: 2026-02-01 08:08:44.888681544 +0000 UTC m=+0.100811488 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=collectd)
Feb 01 08:08:44 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:08:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:08:46 np0005604215.localdomain podman[64722]: 2026-02-01 08:08:46.863880295 +0000 UTC m=+0.082815125 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, build-date=2026-01-12T22:34:43Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5)
Feb 01 08:08:46 np0005604215.localdomain podman[64722]: 2026-02-01 08:08:46.902632819 +0000 UTC m=+0.121567629 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., container_name=iscsid, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:08:46 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:09:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:09:05 np0005604215.localdomain podman[64741]: 2026-02-01 08:09:05.879252446 +0000 UTC m=+0.091454255 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd)
Feb 01 08:09:06 np0005604215.localdomain podman[64741]: 2026-02-01 08:09:06.070164545 +0000 UTC m=+0.282366334 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 01 08:09:06 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:09:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:09:15 np0005604215.localdomain systemd[1]: tmp-crun.ItcxUi.mount: Deactivated successfully.
Feb 01 08:09:15 np0005604215.localdomain podman[64770]: 2026-02-01 08:09:15.873236926 +0000 UTC m=+0.086416897 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, batch=17.1_20260112.1, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container)
Feb 01 08:09:15 np0005604215.localdomain podman[64770]: 2026-02-01 08:09:15.881619619 +0000 UTC m=+0.094799590 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5)
Feb 01 08:09:15 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:09:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:09:17 np0005604215.localdomain podman[64790]: 2026-02-01 08:09:17.864937535 +0000 UTC m=+0.080778091 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, url=https://www.redhat.com)
Feb 01 08:09:17 np0005604215.localdomain podman[64790]: 2026-02-01 08:09:17.895367098 +0000 UTC m=+0.111207624 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, url=https://www.redhat.com, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13)
Feb 01 08:09:17 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:09:24 np0005604215.localdomain sudo[64809]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:09:24 np0005604215.localdomain sudo[64809]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:09:24 np0005604215.localdomain sudo[64809]: pam_unix(sudo:session): session closed for user root
Feb 01 08:09:24 np0005604215.localdomain sudo[64824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:09:24 np0005604215.localdomain sudo[64824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:09:25 np0005604215.localdomain sudo[64824]: pam_unix(sudo:session): session closed for user root
Feb 01 08:09:30 np0005604215.localdomain sudo[64872]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:09:30 np0005604215.localdomain sudo[64872]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:09:30 np0005604215.localdomain sudo[64872]: pam_unix(sudo:session): session closed for user root
Feb 01 08:09:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:09:36 np0005604215.localdomain podman[64887]: 2026-02-01 08:09:36.872843847 +0000 UTC m=+0.085889577 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510)
Feb 01 08:09:37 np0005604215.localdomain podman[64887]: 2026-02-01 08:09:37.054493147 +0000 UTC m=+0.267538847 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public)
Feb 01 08:09:37 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:09:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:09:46 np0005604215.localdomain systemd[1]: tmp-crun.NvkLYr.mount: Deactivated successfully.
Feb 01 08:09:46 np0005604215.localdomain podman[64917]: 2026-02-01 08:09:46.881559696 +0000 UTC m=+0.093473960 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:09:46 np0005604215.localdomain podman[64917]: 2026-02-01 08:09:46.917803845 +0000 UTC m=+0.129718159 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, batch=17.1_20260112.1)
Feb 01 08:09:46 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:09:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:09:48 np0005604215.localdomain systemd[1]: tmp-crun.Gese5p.mount: Deactivated successfully.
Feb 01 08:09:48 np0005604215.localdomain podman[64938]: 2026-02-01 08:09:48.871925662 +0000 UTC m=+0.084116802 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container)
Feb 01 08:09:48 np0005604215.localdomain podman[64938]: 2026-02-01 08:09:48.886760476 +0000 UTC m=+0.098951586 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.13, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z)
Feb 01 08:09:48 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:10:04 np0005604215.localdomain sudo[65004]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrxqeuensedqtedaebzyjipouobcwocy ; /usr/bin/python3
Feb 01 08:10:04 np0005604215.localdomain sudo[65004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:04 np0005604215.localdomain python3[65006]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:04 np0005604215.localdomain sudo[65004]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:04 np0005604215.localdomain sudo[65049]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqqnfoyyifguhqfyhzpgixdkrhhmzqsv ; /usr/bin/python3
Feb 01 08:10:04 np0005604215.localdomain sudo[65049]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:04 np0005604215.localdomain python3[65051]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933403.9138386-106831-103831838187883/source _original_basename=tmpoxx7rek_ follow=False checksum=ee48fb03297eb703b1954c8852d0f67fab51dac1 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:04 np0005604215.localdomain sudo[65049]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:05 np0005604215.localdomain sudo[65111]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dvrwwagkheszyrgkmdazrtlljhoweudy ; /usr/bin/python3
Feb 01 08:10:05 np0005604215.localdomain sudo[65111]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:05 np0005604215.localdomain python3[65113]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/recover_tripleo_nova_virtqemud.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:05 np0005604215.localdomain sudo[65111]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:05 np0005604215.localdomain sudo[65154]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ddidlswndivlgfakjyibehzcbzdakclw ; /usr/bin/python3
Feb 01 08:10:05 np0005604215.localdomain sudo[65154]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:05 np0005604215.localdomain python3[65156]: ansible-ansible.legacy.copy Invoked with dest=/usr/libexec/recover_tripleo_nova_virtqemud.sh mode=0755 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933405.2245994-106907-141639186815934/source _original_basename=tmpdxsbcle5 follow=False checksum=922b8aa8342176110bffc2e39abdccc2b39e53a9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:05 np0005604215.localdomain sudo[65154]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:06 np0005604215.localdomain sudo[65216]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcjsphwenvtpmjcjdyojrpnytosrzwqk ; /usr/bin/python3
Feb 01 08:10:06 np0005604215.localdomain sudo[65216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:06 np0005604215.localdomain python3[65218]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:06 np0005604215.localdomain sudo[65216]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:06 np0005604215.localdomain sudo[65259]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-joiawsunljuitpcwkxxrnecaiahcomzk ; /usr/bin/python3
Feb 01 08:10:06 np0005604215.localdomain sudo[65259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:06 np0005604215.localdomain python3[65261]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.service mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933406.1655345-106959-280975021446585/source _original_basename=tmphfovkidy follow=False checksum=92f73544b703afc85885fa63ab07bdf8f8671554 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:06 np0005604215.localdomain sudo[65259]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:07 np0005604215.localdomain sudo[65321]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-poglbjfaazjfzlojpaernetdosjudpal ; /usr/bin/python3
Feb 01 08:10:07 np0005604215.localdomain sudo[65321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:10:07 np0005604215.localdomain systemd[1]: tmp-crun.28GsRV.mount: Deactivated successfully.
Feb 01 08:10:07 np0005604215.localdomain podman[65324]: 2026-02-01 08:10:07.46891553 +0000 UTC m=+0.124163462 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=)
Feb 01 08:10:07 np0005604215.localdomain python3[65323]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:07 np0005604215.localdomain sudo[65321]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:07 np0005604215.localdomain podman[65324]: 2026-02-01 08:10:07.63058098 +0000 UTC m=+0.285828912 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com)
Feb 01 08:10:07 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:10:07 np0005604215.localdomain sudo[65393]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glztpzhoyhofrcoigawuyfcoshhpslor ; /usr/bin/python3
Feb 01 08:10:07 np0005604215.localdomain sudo[65393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:07 np0005604215.localdomain python3[65395]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_virtqemud_recover.timer mode=0644 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933407.1606317-107019-216220168701269/source _original_basename=tmpr5cxhs_j follow=False checksum=c6e5f76a53c0d6ccaf46c4b48d813dc2891ad8e9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:07 np0005604215.localdomain sudo[65393]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:08 np0005604215.localdomain sudo[65423]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xzoiwqmyqesayskoigkamdrdiacxrpua ; /usr/bin/python3
Feb 01 08:10:08 np0005604215.localdomain sudo[65423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:08 np0005604215.localdomain python3[65425]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.service daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 01 08:10:08 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:08 np0005604215.localdomain systemd-sysv-generator[65452]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:08 np0005604215.localdomain systemd-rc-local-generator[65449]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:08 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:08 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:08 np0005604215.localdomain systemd-sysv-generator[65492]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:08 np0005604215.localdomain systemd-rc-local-generator[65485]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:08 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:08 np0005604215.localdomain sudo[65423]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:09 np0005604215.localdomain sudo[65513]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fncvnnphulztbmgmjuxznistofcrdcvd ; /usr/bin/python3
Feb 01 08:10:09 np0005604215.localdomain sudo[65513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:09 np0005604215.localdomain python3[65515]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_virtqemud_recover.timer state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:10:09 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:09 np0005604215.localdomain systemd-rc-local-generator[65536]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:09 np0005604215.localdomain systemd-sysv-generator[65542]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:09 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:09 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:09 np0005604215.localdomain systemd-rc-local-generator[65581]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:09 np0005604215.localdomain systemd-sysv-generator[65586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:09 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:10 np0005604215.localdomain systemd[1]: Started Check and recover tripleo_nova_virtqemud every 10m.
Feb 01 08:10:10 np0005604215.localdomain sudo[65513]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:10 np0005604215.localdomain sudo[65605]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sivxruwsnnbrzucgxwdpqfvflnbgwfki ; /usr/bin/python3
Feb 01 08:10:10 np0005604215.localdomain sudo[65605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:10 np0005604215.localdomain python3[65607]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl enable --now tripleo_nova_virtqemud_recover.timer _uses_shell=True stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 08:10:10 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:10 np0005604215.localdomain systemd-rc-local-generator[65630]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:10 np0005604215.localdomain systemd-sysv-generator[65634]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:10 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:10 np0005604215.localdomain sudo[65605]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:11 np0005604215.localdomain sudo[65689]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdmmnnaesiaflqhlbcmucnfvdgffklzt ; /usr/bin/python3
Feb 01 08:10:11 np0005604215.localdomain sudo[65689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:11 np0005604215.localdomain python3[65691]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:11 np0005604215.localdomain sudo[65689]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:11 np0005604215.localdomain sudo[65732]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ekradsygyosfkckbrubhxnqutbbajsez ; /usr/bin/python3
Feb 01 08:10:11 np0005604215.localdomain sudo[65732]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:11 np0005604215.localdomain python3[65734]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/tripleo_nova_libvirt.target group=root mode=0644 owner=root src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933410.87873-107161-190547566216215/source _original_basename=tmpi7so8qs_ follow=False checksum=c064b4a8e7d3d1d7c62d1f80a09e350659996afd backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:11 np0005604215.localdomain sudo[65732]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:11 np0005604215.localdomain sudo[65762]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjeobojaojwxewyxphczfqhlqtorxypq ; /usr/bin/python3
Feb 01 08:10:11 np0005604215.localdomain sudo[65762]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:12 np0005604215.localdomain python3[65764]: ansible-systemd Invoked with daemon_reload=True enabled=True name=tripleo_nova_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:10:12 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:12 np0005604215.localdomain systemd-rc-local-generator[65788]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:12 np0005604215.localdomain systemd-sysv-generator[65793]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:12 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:12 np0005604215.localdomain systemd[1]: Reached target tripleo_nova_libvirt.target.
Feb 01 08:10:12 np0005604215.localdomain sudo[65762]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:12 np0005604215.localdomain sudo[65816]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cvaqzxfzehjyttqhnxufotaryejjclzg ; /usr/bin/python3
Feb 01 08:10:12 np0005604215.localdomain sudo[65816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:12 np0005604215.localdomain python3[65818]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_4 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:10:12 np0005604215.localdomain sudo[65816]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:13 np0005604215.localdomain sudo[65866]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xspbmoerogprxwhwfcvywpchvnxaisxc ; /usr/bin/python3
Feb 01 08:10:13 np0005604215.localdomain sudo[65866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:13 np0005604215.localdomain sudo[65866]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:13 np0005604215.localdomain sudo[65884]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njweqicmhfcfxsgklbermnwdcdcfkzyp ; /usr/bin/python3
Feb 01 08:10:13 np0005604215.localdomain sudo[65884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:13 np0005604215.localdomain sudo[65884]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:14 np0005604215.localdomain sudo[65988]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upeykditfickkqhrkdlanxcsryqtosak ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933413.8575816-107271-33031830807480/async_wrapper.py 568478358014 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933413.8575816-107271-33031830807480/AnsiballZ_command.py _
Feb 01 08:10:14 np0005604215.localdomain sudo[65988]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 01 08:10:14 np0005604215.localdomain ansible-async_wrapper.py[65990]: Invoked with 568478358014 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933413.8575816-107271-33031830807480/AnsiballZ_command.py _
Feb 01 08:10:14 np0005604215.localdomain ansible-async_wrapper.py[65993]: Starting module and watcher
Feb 01 08:10:14 np0005604215.localdomain ansible-async_wrapper.py[65993]: Start watching 65994 (3600)
Feb 01 08:10:14 np0005604215.localdomain ansible-async_wrapper.py[65994]: Start module (65994)
Feb 01 08:10:14 np0005604215.localdomain ansible-async_wrapper.py[65990]: Return async_wrapper task started.
Feb 01 08:10:14 np0005604215.localdomain sudo[65988]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:14 np0005604215.localdomain sudo[66011]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwirwfmxfdzoulukyyqkupoyqojofhln ; /usr/bin/python3
Feb 01 08:10:14 np0005604215.localdomain sudo[66011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:14 np0005604215.localdomain python3[66014]: ansible-ansible.legacy.async_status Invoked with jid=568478358014.65990 mode=status _async_dir=/tmp/.ansible_async
Feb 01 08:10:14 np0005604215.localdomain sudo[66011]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:10:17 np0005604215.localdomain podman[66069]: 2026-02-01 08:10:17.151923362 +0000 UTC m=+0.104891947 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, tcib_managed=true)
Feb 01 08:10:17 np0005604215.localdomain podman[66069]: 2026-02-01 08:10:17.167607913 +0000 UTC m=+0.120576528 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Feb 01 08:10:17 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:10:17 np0005604215.localdomain puppet-user[66012]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 08:10:17 np0005604215.localdomain puppet-user[66012]:    (file: /etc/puppet/hiera.yaml)
Feb 01 08:10:17 np0005604215.localdomain puppet-user[66012]: Warning: Undefined variable '::deploy_config_name';
Feb 01 08:10:17 np0005604215.localdomain puppet-user[66012]:    (file & line not available)
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:    (file & line not available)
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 01 08:10:18 np0005604215.localdomain puppet-user[66012]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.23 seconds
Feb 01 08:10:19 np0005604215.localdomain ansible-async_wrapper.py[65993]: 65994 still running (3600)
Feb 01 08:10:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:10:19 np0005604215.localdomain podman[66152]: 2026-02-01 08:10:19.884123643 +0000 UTC m=+0.088072538 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13)
Feb 01 08:10:19 np0005604215.localdomain podman[66152]: 2026-02-01 08:10:19.89562095 +0000 UTC m=+0.099569845 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64)
Feb 01 08:10:19 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:10:24 np0005604215.localdomain ansible-async_wrapper.py[65993]: 65994 still running (3595)
Feb 01 08:10:24 np0005604215.localdomain sudo[66242]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oibjlowmzxqcdyrajdnllnuhkmendvvy ; /usr/bin/python3
Feb 01 08:10:24 np0005604215.localdomain sudo[66242]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:25 np0005604215.localdomain python3[66245]: ansible-ansible.legacy.async_status Invoked with jid=568478358014.65990 mode=status _async_dir=/tmp/.ansible_async
Feb 01 08:10:25 np0005604215.localdomain sudo[66242]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:26 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 08:10:26 np0005604215.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 01 08:10:26 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:27 np0005604215.localdomain systemd-rc-local-generator[66315]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:27 np0005604215.localdomain systemd-sysv-generator[66319]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:27 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:27 np0005604215.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 08:10:27 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 08:10:27 np0005604215.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 01 08:10:27 np0005604215.localdomain systemd[1]: run-rd1bbb4b69ea4423a9f8628b2a1d8cc41.service: Deactivated successfully.
Feb 01 08:10:28 np0005604215.localdomain puppet-user[66012]: Notice: /Stage[main]/Snmp/Package[snmpd]/ensure: created
Feb 01 08:10:28 np0005604215.localdomain puppet-user[66012]: Notice: /Stage[main]/Snmp/File[snmpd.conf]/content: content changed '{sha256}2b743f970e80e2150759bfc66f2d8d0fbd8b31624f79e2991248d1a5ac57494e' to '{sha256}330d40e6e8b4d501b2c8ea095074c170cae10764538f74f185579138e5e168c8'
Feb 01 08:10:28 np0005604215.localdomain puppet-user[66012]: Notice: /Stage[main]/Snmp/File[snmpd.sysconfig]/content: content changed '{sha256}b63afb2dee7419b6834471f88581d981c8ae5c8b27b9d329ba67a02f3ddd8221' to '{sha256}3917ee8bbc680ad50d77186ad4a1d2705c2025c32fc32f823abbda7f2328dfbd'
Feb 01 08:10:28 np0005604215.localdomain puppet-user[66012]: Notice: /Stage[main]/Snmp/File[snmptrapd.conf]/content: content changed '{sha256}2e1ca894d609ef337b6243909bf5623c87fd5df98ecbd00c7d4c12cf12f03c4e' to '{sha256}3ecf18da1ba84ea3932607f2b903ee6a038b6f9ac4e1e371e48f3ef61c5052ea'
Feb 01 08:10:28 np0005604215.localdomain puppet-user[66012]: Notice: /Stage[main]/Snmp/File[snmptrapd.sysconfig]/content: content changed '{sha256}86ee5797ad10cb1ea0f631e9dfa6ae278ecf4f4d16f4c80f831cdde45601b23c' to '{sha256}2244553364afcca151958f8e2003e4c182f5e2ecfbe55405cec73fd818581e97'
Feb 01 08:10:28 np0005604215.localdomain puppet-user[66012]: Notice: /Stage[main]/Snmp/Service[snmptrapd]: Triggered 'refresh' from 2 events
Feb 01 08:10:29 np0005604215.localdomain ansible-async_wrapper.py[65993]: 65994 still running (3590)
Feb 01 08:10:30 np0005604215.localdomain sudo[67604]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:10:30 np0005604215.localdomain sudo[67604]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:10:30 np0005604215.localdomain sudo[67604]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:30 np0005604215.localdomain sudo[67619]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:10:30 np0005604215.localdomain sudo[67619]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:10:31 np0005604215.localdomain sudo[67619]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:31 np0005604215.localdomain sudo[67667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:10:31 np0005604215.localdomain sudo[67667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:10:31 np0005604215.localdomain sudo[67667]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:31 np0005604215.localdomain sudo[67682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 08:10:31 np0005604215.localdomain sudo[67682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:10:31 np0005604215.localdomain sudo[67682]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:33 np0005604215.localdomain puppet-user[66012]: Notice: /Stage[main]/Tripleo::Profile::Base::Snmp/Snmp::Snmpv3_user[ro_snmp_user]/Exec[create-snmpv3-user-ro_snmp_user]/returns: executed successfully
Feb 01 08:10:33 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:33 np0005604215.localdomain systemd-sysv-generator[67744]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:33 np0005604215.localdomain systemd-rc-local-generator[67740]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:33 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:34 np0005604215.localdomain systemd[1]: Starting Simple Network Management Protocol (SNMP) Daemon....
Feb 01 08:10:34 np0005604215.localdomain snmpd[67757]: Can't find directory of RPM packages
Feb 01 08:10:34 np0005604215.localdomain snmpd[67757]: Duplicate IPv4 address detected, some interfaces may not be visible in IP-MIB
Feb 01 08:10:34 np0005604215.localdomain systemd[1]: Started Simple Network Management Protocol (SNMP) Daemon..
Feb 01 08:10:34 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:34 np0005604215.localdomain systemd-sysv-generator[67783]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:34 np0005604215.localdomain systemd-rc-local-generator[67779]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:34 np0005604215.localdomain ansible-async_wrapper.py[65993]: 65994 still running (3585)
Feb 01 08:10:34 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:34 np0005604215.localdomain systemd-sysv-generator[67822]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:34 np0005604215.localdomain systemd-rc-local-generator[67818]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]: Notice: /Stage[main]/Snmp/Service[snmpd]/ensure: ensure changed 'stopped' to 'running'
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]: Notice: Applied catalog in 16.64 seconds
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]: Application:
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:    Initial environment: production
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:    Converged environment: production
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:          Run mode: user
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]: Changes:
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:             Total: 8
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]: Events:
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:           Success: 8
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:             Total: 8
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]: Resources:
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:         Restarted: 1
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:           Changed: 8
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:       Out of sync: 8
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:             Total: 19
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]: Time:
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:        Filebucket: 0.00
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:          Schedule: 0.00
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:            Augeas: 0.01
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:              File: 0.08
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:    Config retrieval: 0.29
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:           Service: 1.22
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:           Package: 10.09
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:    Transaction evaluation: 16.63
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:    Catalog application: 16.64
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:          Last run: 1769933434
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:              Exec: 5.05
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:             Total: 16.64
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]: Version:
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:            Config: 1769933417
Feb 01 08:10:34 np0005604215.localdomain puppet-user[66012]:            Puppet: 7.10.0
Feb 01 08:10:34 np0005604215.localdomain ansible-async_wrapper.py[65994]: Module complete (65994)
Feb 01 08:10:35 np0005604215.localdomain sudo[67844]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtyzfkkngpjuupaiupmynxhjeoqdjgtt ; /usr/bin/python3
Feb 01 08:10:35 np0005604215.localdomain sudo[67844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:35 np0005604215.localdomain python3[67846]: ansible-ansible.legacy.async_status Invoked with jid=568478358014.65990 mode=status _async_dir=/tmp/.ansible_async
Feb 01 08:10:35 np0005604215.localdomain sudo[67844]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:35 np0005604215.localdomain sudo[67847]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:10:35 np0005604215.localdomain sudo[67847]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:10:35 np0005604215.localdomain sudo[67847]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:35 np0005604215.localdomain sudo[67875]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gnuilishquzcsxtooscxfxddjvmhscog ; /usr/bin/python3
Feb 01 08:10:35 np0005604215.localdomain sudo[67875]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:36 np0005604215.localdomain python3[67877]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 08:10:36 np0005604215.localdomain sudo[67875]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:36 np0005604215.localdomain sudo[67891]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eodemdgmlnfoooxqvsplgihjlucmatbo ; /usr/bin/python3
Feb 01 08:10:36 np0005604215.localdomain sudo[67891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:36 np0005604215.localdomain python3[67893]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:10:36 np0005604215.localdomain sudo[67891]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:36 np0005604215.localdomain sudo[67941]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfbicbgmnoegtpeswlqabxwtufqucjdf ; /usr/bin/python3
Feb 01 08:10:36 np0005604215.localdomain sudo[67941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:36 np0005604215.localdomain python3[67943]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:36 np0005604215.localdomain sudo[67941]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:36 np0005604215.localdomain sudo[67959]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwfasbehvptilgvqftmcqmajomuznytp ; /usr/bin/python3
Feb 01 08:10:36 np0005604215.localdomain sudo[67959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:37 np0005604215.localdomain python3[67961]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmpzkg0n_dc recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 08:10:37 np0005604215.localdomain sudo[67959]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:37 np0005604215.localdomain sudo[67989]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pslamixwuoqgzlhaadrnymkdnmpavtzp ; /usr/bin/python3
Feb 01 08:10:37 np0005604215.localdomain sudo[67989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:37 np0005604215.localdomain python3[67991]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:37 np0005604215.localdomain sudo[67989]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:37 np0005604215.localdomain sudo[68005]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tolijtskarmcufdwhtsinxjnxrtwowaq ; /usr/bin/python3
Feb 01 08:10:37 np0005604215.localdomain sudo[68005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:10:37 np0005604215.localdomain podman[68007]: 2026-02-01 08:10:37.924509592 +0000 UTC m=+0.146090173 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr)
Feb 01 08:10:38 np0005604215.localdomain podman[68007]: 2026-02-01 08:10:38.124765336 +0000 UTC m=+0.346345927 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, 
io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr)
Feb 01 08:10:38 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:10:38 np0005604215.localdomain sudo[68005]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:38 np0005604215.localdomain sudo[68121]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bkacbilfmbjsublhjaiixgsqiabdfihs ; /usr/bin/python3
Feb 01 08:10:38 np0005604215.localdomain sudo[68121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:38 np0005604215.localdomain python3[68123]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 01 08:10:38 np0005604215.localdomain sudo[68121]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:39 np0005604215.localdomain sudo[68140]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scaqnwsyphgkqrjidcugzfgnnckyzplu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:39 np0005604215.localdomain sudo[68140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:39 np0005604215.localdomain python3[68142]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:39 np0005604215.localdomain sudo[68140]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:39 np0005604215.localdomain ansible-async_wrapper.py[65993]: Done in kid B.
Feb 01 08:10:39 np0005604215.localdomain sudo[68156]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfuuktwbdvtyzkvtllqzcjslwtxohoeu ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:39 np0005604215.localdomain sudo[68156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:39 np0005604215.localdomain sudo[68156]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:40 np0005604215.localdomain sudo[68172]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cizonvglwqvuqsbjkpmjhvaxiamdzssd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:40 np0005604215.localdomain sudo[68172]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:40 np0005604215.localdomain python3[68174]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:10:40 np0005604215.localdomain sudo[68172]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:40 np0005604215.localdomain sudo[68222]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrbbjbwirdvuozvirkqzamjlxkrngjpk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:40 np0005604215.localdomain sudo[68222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:40 np0005604215.localdomain python3[68224]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:40 np0005604215.localdomain sudo[68222]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:40 np0005604215.localdomain sudo[68240]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rprldhhxxbadacnhjrgpxzytrwwxadsx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:40 np0005604215.localdomain sudo[68240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:41 np0005604215.localdomain python3[68242]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:41 np0005604215.localdomain sudo[68240]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:41 np0005604215.localdomain sudo[68302]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-egidhfdnacivjhlbcuaaeatpskxviqjx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:41 np0005604215.localdomain sudo[68302]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:41 np0005604215.localdomain python3[68304]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:41 np0005604215.localdomain sudo[68302]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:41 np0005604215.localdomain sudo[68320]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftvterdrgafmnocmzcvgdgblbroeacsd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:41 np0005604215.localdomain sudo[68320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:41 np0005604215.localdomain python3[68322]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:41 np0005604215.localdomain sudo[68320]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:42 np0005604215.localdomain sudo[68382]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byqhzxtdhxnlzpcavstqsqvkgamxrzyg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:42 np0005604215.localdomain sudo[68382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:42 np0005604215.localdomain python3[68384]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:42 np0005604215.localdomain sudo[68382]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:42 np0005604215.localdomain sudo[68400]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uoqqibgmiukmpltxccqmlwduovbqpvqh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:42 np0005604215.localdomain sudo[68400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:42 np0005604215.localdomain python3[68402]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:42 np0005604215.localdomain sudo[68400]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:42 np0005604215.localdomain sudo[68462]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iodvbdrifblqjluwvzbrwncsrjnfcenx ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:42 np0005604215.localdomain sudo[68462]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:43 np0005604215.localdomain python3[68464]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:43 np0005604215.localdomain sudo[68462]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:43 np0005604215.localdomain sudo[68480]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hubezewbrrhpirgxoxomuyyeqampnthp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:43 np0005604215.localdomain sudo[68480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:43 np0005604215.localdomain python3[68482]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:43 np0005604215.localdomain sudo[68480]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:43 np0005604215.localdomain sudo[68510]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lydfeahekpllkcrzlfxrhjknrcgdachs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:43 np0005604215.localdomain sudo[68510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:43 np0005604215.localdomain python3[68512]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:10:43 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:43 np0005604215.localdomain systemd-rc-local-generator[68535]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:43 np0005604215.localdomain systemd-sysv-generator[68538]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:45 np0005604215.localdomain sudo[68510]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:45 np0005604215.localdomain sudo[68596]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iabkklwazvvudbjgzfrlficbkktayumd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:45 np0005604215.localdomain sudo[68596]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:10:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 4946 writes, 22K keys, 4946 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4946 writes, 558 syncs, 8.86 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 86 writes, 124 keys, 86 commit groups, 1.0 writes per commit group, ingest: 0.03 MB, 0.00 MB/s
                                                          Interval WAL: 86 writes, 43 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 08:10:45 np0005604215.localdomain python3[68598]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:45 np0005604215.localdomain sudo[68596]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:45 np0005604215.localdomain sudo[68614]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmzsjounkhzfqkdslkdyoifdpjjbsbeh ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:45 np0005604215.localdomain sudo[68614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:45 np0005604215.localdomain python3[68616]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:46 np0005604215.localdomain sudo[68614]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:46 np0005604215.localdomain sudo[68676]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcjvrpmkmwtcdcreagdkwrizlaxtlhhk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:46 np0005604215.localdomain sudo[68676]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:46 np0005604215.localdomain python3[68678]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:10:46 np0005604215.localdomain sudo[68676]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:46 np0005604215.localdomain sudo[68694]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpvvglahmtovtotstnatzdqawkyfqsem ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:46 np0005604215.localdomain sudo[68694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:46 np0005604215.localdomain python3[68696]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:46 np0005604215.localdomain sudo[68694]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:47 np0005604215.localdomain sudo[68724]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzsrlygvbrsjfoouzktahmlrqprtbtzw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:47 np0005604215.localdomain sudo[68724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:47 np0005604215.localdomain python3[68726]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:10:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:10:47 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:10:47 np0005604215.localdomain systemd-rc-local-generator[68766]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:10:47 np0005604215.localdomain systemd-sysv-generator[68769]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:10:47 np0005604215.localdomain podman[68728]: 2026-02-01 08:10:47.412545539 +0000 UTC m=+0.092503469 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:10:47 np0005604215.localdomain podman[68728]: 2026-02-01 08:10:47.445906976 +0000 UTC m=+0.125864906 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, architecture=x86_64, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, 
vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:10:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:10:47 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:10:47 np0005604215.localdomain systemd[1]: Starting Create netns directory...
Feb 01 08:10:47 np0005604215.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 01 08:10:47 np0005604215.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 01 08:10:47 np0005604215.localdomain systemd[1]: Finished Create netns directory.
Feb 01 08:10:47 np0005604215.localdomain sudo[68724]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:48 np0005604215.localdomain sudo[68801]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yftiwkqawwsfesyhfmuoqrklovytwfaf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:48 np0005604215.localdomain sudo[68801]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:48 np0005604215.localdomain python3[68803]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 01 08:10:48 np0005604215.localdomain sudo[68801]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:48 np0005604215.localdomain sudo[68817]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnticgqdgyatflfxazcxuxdtrwhmckeg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:48 np0005604215.localdomain sudo[68817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:48 np0005604215.localdomain sudo[68817]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:49 np0005604215.localdomain sudo[68860]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwrherlyonazgbpfixjsurvbsujagfps ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:49 np0005604215.localdomain sudo[68860]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:50 np0005604215.localdomain python3[68862]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step4 config_dir=/var/lib/tripleo-config/container-startup-config/step_4 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: tmp-crun.Dhsyqv.mount: Deactivated successfully.
Feb 01 08:10:50 np0005604215.localdomain podman[68893]: 2026-02-01 08:10:50.205989069 +0000 UTC m=+0.086999733 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat 
OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, distribution-scope=public, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container)
Feb 01 08:10:50 np0005604215.localdomain podman[68893]: 2026-02-01 08:10:50.24669183 +0000 UTC m=+0.127702524 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, architecture=x86_64, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:10:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:10:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 1800.1 total, 600.0 interval
                                                          Cumulative writes: 4734 writes, 21K keys, 4734 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4734 writes, 481 syncs, 9.84 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 26 writes, 51 keys, 26 commit groups, 1.0 writes per commit group, ingest: 0.01 MB, 0.00 MB/s
                                                          Interval WAL: 26 writes, 13 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 08:10:50 np0005604215.localdomain podman[69028]: 2026-02-01 08:10:50.392881705 +0000 UTC m=+0.102284671 container create 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, version=17.1.13, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5)
Feb 01 08:10:50 np0005604215.localdomain podman[69040]: 2026-02-01 08:10:50.416483451 +0000 UTC m=+0.109844674 container create 2b56af8b4399e7db68243b7ae3ed38a6383683d877af84d3c6702a7134a03d7a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.5, container_name=nova_libvirt_init_secret, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step4, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started libpod-conmon-07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.scope.
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:10:50 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d2793eb0d727691e97e5e2f52ec5e9822efebe0b6bf32e0fb26a5897fd53d53c/merged/var/log/containers supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started libpod-conmon-2b56af8b4399e7db68243b7ae3ed38a6383683d877af84d3c6702a7134a03d7a.scope.
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:10:50 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/901f926467172f87fed8e093a0c623b4edfdf674c0cbe61bc939afde2d57f8c6/merged/etc/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:50 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/901f926467172f87fed8e093a0c623b4edfdf674c0cbe61bc939afde2d57f8c6/merged/etc/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:50 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/901f926467172f87fed8e093a0c623b4edfdf674c0cbe61bc939afde2d57f8c6/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:50 np0005604215.localdomain podman[69040]: 2026-02-01 08:10:50.453914278 +0000 UTC m=+0.147275501 container init 2b56af8b4399e7db68243b7ae3ed38a6383683d877af84d3c6702a7134a03d7a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, architecture=x86_64, container_name=nova_libvirt_init_secret, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5)
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:10:50 np0005604215.localdomain podman[69028]: 2026-02-01 08:10:50.460003612 +0000 UTC m=+0.169406588 container init 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:10:50 np0005604215.localdomain podman[69028]: 2026-02-01 08:10:50.363203406 +0000 UTC m=+0.072606392 image pull  registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 01 08:10:50 np0005604215.localdomain podman[69040]: 2026-02-01 08:10:50.465226599 +0000 UTC m=+0.158587822 container start 2b56af8b4399e7db68243b7ae3ed38a6383683d877af84d3c6702a7134a03d7a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_libvirt_init_secret, name=rhosp-rhel9/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', 
'/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:10:50 np0005604215.localdomain podman[69040]: 2026-02-01 08:10:50.465407475 +0000 UTC m=+0.158768708 container attach 2b56af8b4399e7db68243b7ae3ed38a6383683d877af84d3c6702a7134a03d7a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, io.buildah.version=1.41.5, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=nova_libvirt_init_secret, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:10:50 np0005604215.localdomain podman[69040]: 2026-02-01 08:10:50.372346949 +0000 UTC m=+0.065708172 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:10:50 np0005604215.localdomain podman[69072]: 2026-02-01 08:10:50.475187018 +0000 UTC m=+0.107719726 container create 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, url=https://www.redhat.com, 
architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, version=17.1.13, tcib_managed=true, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:10:50 np0005604215.localdomain podman[69028]: 2026-02-01 08:10:50.478662149 +0000 UTC m=+0.188065125 container start 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vcs-type=git, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team)
Feb 01 08:10:50 np0005604215.localdomain sudo[69127]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:10:50 np0005604215.localdomain sudo[69127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:10:50 np0005604215.localdomain python3[68862]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name logrotate_crond --conmon-pidfile /run/logrotate_crond.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=53ed83bb0cae779ff95edb2002262c6f --healthcheck-command /usr/share/openstack-tripleo-common/healthcheck/cron --label config_id=tripleo_step4 --label container_name=logrotate_crond --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/logrotate_crond.log --network none --pid host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro --volume /var/log/containers:/var/log/containers:z registry.redhat.io/rhosp-rhel9/openstack-cron:17.1
Feb 01 08:10:50 np0005604215.localdomain podman[69019]: 2026-02-01 08:10:50.386604225 +0000 UTC m=+0.091774876 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 01 08:10:50 np0005604215.localdomain sudo[69127]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:50 np0005604215.localdomain crond[69125]: (CRON) STARTUP (1.5.7)
Feb 01 08:10:50 np0005604215.localdomain crond[69125]: (CRON) INFO (RANDOM_DELAY will be scaled with factor 53% if used.)
Feb 01 08:10:50 np0005604215.localdomain crond[69125]: (CRON) INFO (running with inotify support)
Feb 01 08:10:50 np0005604215.localdomain podman[69072]: 2026-02-01 08:10:50.446554093 +0000 UTC m=+0.079086831 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started libpod-conmon-79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.scope.
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: libpod-2b56af8b4399e7db68243b7ae3ed38a6383683d877af84d3c6702a7134a03d7a.scope: Deactivated successfully.
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:10:50 np0005604215.localdomain podman[69040]: 2026-02-01 08:10:50.566099875 +0000 UTC m=+0.259461108 container died 2b56af8b4399e7db68243b7ae3ed38a6383683d877af84d3c6702a7134a03d7a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-libvirt, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:10:50 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19867aa9ce07feb42ab4d071eed0ec581b8be5de4a737b08d8913c4970e7b3a5/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:10:50 np0005604215.localdomain podman[69072]: 2026-02-01 08:10:50.590351561 +0000 UTC m=+0.222884269 container init 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, version=17.1.13, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1)
Feb 01 08:10:50 np0005604215.localdomain podman[69096]: 2026-02-01 08:10:50.492735989 +0000 UTC m=+0.086138786 image pull  registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Feb 01 08:10:50 np0005604215.localdomain sudo[69199]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:10:50 np0005604215.localdomain sudo[69199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:10:50 np0005604215.localdomain podman[69072]: 2026-02-01 08:10:50.616965992 +0000 UTC m=+0.249498700 container start 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:10:50 np0005604215.localdomain python3[68862]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_ipmi --conmon-pidfile /run/ceilometer_agent_ipmi.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=63e53a2f3cd2422147592f2c2c6c2f61 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_ipmi --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_ipmi.log --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1
Feb 01 08:10:50 np0005604215.localdomain podman[69096]: 2026-02-01 08:10:50.626787037 +0000 UTC m=+0.220189804 container create 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:10:50 np0005604215.localdomain sudo[69199]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started libpod-conmon-35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.scope.
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:10:50 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e8a1138cbb1c83236f4de65652beadb5bc0b1f3b8c525083bd1db3fda89ebbe0/merged/var/log/ceilometer supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:50 np0005604215.localdomain podman[69201]: 2026-02-01 08:10:50.696490506 +0000 UTC m=+0.074139233 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:10:50 np0005604215.localdomain podman[69201]: 2026-02-01 08:10:50.706143254 +0000 UTC m=+0.083791981 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:10:50 np0005604215.localdomain podman[69201]: unhealthy
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Failed with result 'exit-code'.
Feb 01 08:10:50 np0005604215.localdomain podman[69019]: 2026-02-01 08:10:50.72822047 +0000 UTC m=+0.433391111 container create a0f2464871378e941563090e4710cf646859b636bf3611034b45ded0159cbf88 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, version=17.1.13, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, 
com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, container_name=configure_cms_options, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started libpod-conmon-a0f2464871378e941563090e4710cf646859b636bf3611034b45ded0159cbf88.scope.
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:10:50 np0005604215.localdomain podman[69133]: 2026-02-01 08:10:50.800390799 +0000 UTC m=+0.318272750 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=starting, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1)
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:10:50 np0005604215.localdomain podman[69096]: 2026-02-01 08:10:50.805616696 +0000 UTC m=+0.399019483 container init 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible)
Feb 01 08:10:50 np0005604215.localdomain podman[69133]: 2026-02-01 08:10:50.807394692 +0000 UTC m=+0.325276733 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:10:50 np0005604215.localdomain sudo[69292]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:10:50 np0005604215.localdomain sudo[69292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:10:50 np0005604215.localdomain podman[69019]: 2026-02-01 08:10:50.860650876 +0000 UTC m=+0.565821537 container init a0f2464871378e941563090e4710cf646859b636bf3611034b45ded0159cbf88 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=configure_cms_options, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:10:50 np0005604215.localdomain sudo[69292]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:50 np0005604215.localdomain podman[69183]: 2026-02-01 08:10:50.886952667 +0000 UTC m=+0.307754663 container cleanup 2b56af8b4399e7db68243b7ae3ed38a6383683d877af84d3c6702a7134a03d7a (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_libvirt_init_secret, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, vcs-type=git, container_name=nova_libvirt_init_secret, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:10:50 np0005604215.localdomain systemd[1]: libpod-conmon-2b56af8b4399e7db68243b7ae3ed38a6383683d877af84d3c6702a7134a03d7a.scope: Deactivated successfully.
Feb 01 08:10:50 np0005604215.localdomain podman[69019]: 2026-02-01 08:10:50.922531356 +0000 UTC m=+0.627701997 container start a0f2464871378e941563090e4710cf646859b636bf3611034b45ded0159cbf88 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=configure_cms_options, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:10:50 np0005604215.localdomain podman[69019]: 2026-02-01 08:10:50.922765023 +0000 UTC m=+0.627935734 container attach a0f2464871378e941563090e4710cf646859b636bf3611034b45ded0159cbf88 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=configure_cms_options, vendor=Red Hat, Inc., config_id=tripleo_step4, 
io.buildah.version=1.41.5, version=17.1.13, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:10:50 np0005604215.localdomain podman[69269]: 2026-02-01 08:10:50.828627772 +0000 UTC m=+0.038480692 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 01 08:10:50 np0005604215.localdomain podman[69269]: 2026-02-01 08:10:50.974726844 +0000 UTC m=+0.184579744 container create 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13)
Feb 01 08:10:50 np0005604215.localdomain ovs-vsctl[69326]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . external_ids ovn-cms-options
Feb 01 08:10:51 np0005604215.localdomain podman[69096]: 2026-02-01 08:10:51.002919306 +0000 UTC m=+0.596322103 container start 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, 
konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, release=1766032510, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 01 08:10:51 np0005604215.localdomain python3[68862]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=63e53a2f3cd2422147592f2c2c6c2f61 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ceilometer_agent_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ceilometer_agent_compute.log --network host --privileged=False --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume 
/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/log/containers/ceilometer:/var/log/ceilometer:z registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1
Feb 01 08:10:51 np0005604215.localdomain systemd[1]: Started libpod-conmon-080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.scope.
Feb 01 08:10:51 np0005604215.localdomain systemd[1]: libpod-a0f2464871378e941563090e4710cf646859b636bf3611034b45ded0159cbf88.scope: Deactivated successfully.
Feb 01 08:10:51 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:10:51 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fb1968646de61e5d6c5b7938dce54da276edc06f0bc75651b588722ba09cba1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:51 np0005604215.localdomain podman[69297]: 2026-02-01 08:10:51.041421807 +0000 UTC m=+0.193929663 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64)
Feb 01 08:10:51 np0005604215.localdomain podman[69297]: 2026-02-01 08:10:51.053347239 +0000 UTC m=+0.205855075 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, tcib_managed=true, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13)
Feb 01 08:10:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:10:51 np0005604215.localdomain podman[69297]: unhealthy
Feb 01 08:10:51 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:10:51 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Failed with result 'exit-code'.
Feb 01 08:10:51 np0005604215.localdomain podman[69269]: 2026-02-01 08:10:51.063025559 +0000 UTC m=+0.272878489 container init 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, batch=17.1_20260112.1, release=1766032510, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, 
build-date=2026-01-12T23:32:04Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:10:51 np0005604215.localdomain python3[68862]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_libvirt_init_secret --cgroupns=host --conmon-pidfile /run/nova_libvirt_init_secret.pid --detach=False --env LIBVIRT_DEFAULT_URI=qemu:///system --env TRIPLEO_CONFIG_HASH=1296029e90a465a2201c8dc6f8be17e7 --label config_id=tripleo_step4 --label container_name=nova_libvirt_init_secret --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'command': '/nova_libvirt_init_secret.sh ceph:openstack', 'detach': False, 'environment': {'LIBVIRT_DEFAULT_URI': 'qemu:///system', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'privileged': False, 'security_opt': ['label=disable'], 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova', '/etc/libvirt:/etc/libvirt', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro', '/var/lib/tripleo-config/ceph:/etc/ceph:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_libvirt_init_secret.log --network host --privileged=False --security-opt label=disable --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume 
/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt/etc/nova:/etc/nova --volume /etc/libvirt:/etc/libvirt --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /var/lib/container-config-scripts/nova_libvirt_init_secret.sh:/nova_libvirt_init_secret.sh:ro --volume /var/lib/tripleo-config/ceph:/etc/ceph:ro registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1 /nova_libvirt_init_secret.sh ceph:openstack
Feb 01 08:10:51 np0005604215.localdomain sudo[69374]:     root : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:10:51 np0005604215.localdomain sudo[69374]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:10:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:10:51 np0005604215.localdomain podman[69019]: 2026-02-01 08:10:51.106614912 +0000 UTC m=+0.811785543 container died a0f2464871378e941563090e4710cf646859b636bf3611034b45ded0159cbf88 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, release=1766032510, distribution-scope=public, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, container_name=configure_cms_options, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:10:51 np0005604215.localdomain podman[69269]: 2026-02-01 08:10:51.15970926 +0000 UTC m=+0.369562170 container start 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1766032510, container_name=nova_migration_target, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git)
Feb 01 08:10:51 np0005604215.localdomain python3[68862]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_migration_target --conmon-pidfile /run/nova_migration_target.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=1296029e90a465a2201c8dc6f8be17e7 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=nova_migration_target --label managed_by=tripleo_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_migration_target.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro 
--volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /etc/ssh:/host-ssh:ro --volume /run/libvirt:/run/libvirt:shared,z --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 01 08:10:51 np0005604215.localdomain sudo[69374]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:51 np0005604215.localdomain sshd[69422]: Server listening on 0.0.0.0 port 2022.
Feb 01 08:10:51 np0005604215.localdomain sshd[69422]: Server listening on :: port 2022.
Feb 01 08:10:51 np0005604215.localdomain podman[69393]: 2026-02-01 08:10:51.179478012 +0000 UTC m=+0.067368285 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=starting, vcs-type=git, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=nova_migration_target, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:10:51 np0005604215.localdomain podman[69328]: 2026-02-01 08:10:51.268477259 +0000 UTC m=+0.250221163 container cleanup a0f2464871378e941563090e4710cf646859b636bf3611034b45ded0159cbf88 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=configure_cms_options, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, managed_by=tripleo_ansible, container_name=configure_cms_options, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . 
external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:10:51 np0005604215.localdomain systemd[1]: libpod-conmon-a0f2464871378e941563090e4710cf646859b636bf3611034b45ded0159cbf88.scope: Deactivated successfully.
Feb 01 08:10:51 np0005604215.localdomain python3[68862]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name configure_cms_options --conmon-pidfile /run/configure_cms_options.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --label config_id=tripleo_step4 --label container_name=configure_cms_options --label managed_by=tripleo_ansible --label config_data={'command': ['/bin/bash', '-c', 'CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/configure_cms_options.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1 /bin/bash -c CMS_OPTS=$(hiera ovn::controller::ovn_cms_options -c /etc/puppet/hiera.yaml);  if [ X"$CMS_OPTS" !=  X ]; then ovs-vsctl set open . external_ids:ovn-cms-options=$CMS_OPTS;else ovs-vsctl remove open . external_ids ovn-cms-options; fi
Feb 01 08:10:51 np0005604215.localdomain podman[69460]: 2026-02-01 08:10:51.327011201 +0000 UTC m=+0.068489292 container create 8753f2d44977c485d36fd45e7d5b92f1e769f0230bda34ef2ee7941aa029294d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, 
url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, distribution-scope=public, container_name=setup_ovs_manager, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:10:51 np0005604215.localdomain systemd[1]: Started libpod-conmon-8753f2d44977c485d36fd45e7d5b92f1e769f0230bda34ef2ee7941aa029294d.scope.
Feb 01 08:10:51 np0005604215.localdomain podman[69460]: 2026-02-01 08:10:51.284938576 +0000 UTC m=+0.026416657 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 01 08:10:51 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:10:51 np0005604215.localdomain podman[69460]: 2026-02-01 08:10:51.433328591 +0000 UTC m=+0.174806682 container init 8753f2d44977c485d36fd45e7d5b92f1e769f0230bda34ef2ee7941aa029294d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=setup_ovs_manager, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510)
Feb 01 08:10:51 np0005604215.localdomain podman[69460]: 2026-02-01 08:10:51.442246457 +0000 UTC m=+0.183724548 container start 8753f2d44977c485d36fd45e7d5b92f1e769f0230bda34ef2ee7941aa029294d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z']}, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, container_name=setup_ovs_manager, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git)
Feb 01 08:10:51 np0005604215.localdomain podman[69460]: 2026-02-01 08:10:51.442776413 +0000 UTC m=+0.184254504 container attach 8753f2d44977c485d36fd45e7d5b92f1e769f0230bda34ef2ee7941aa029294d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=setup_ovs_manager, distribution-scope=public, release=1766032510, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, batch=17.1_20260112.1)
Feb 01 08:10:51 np0005604215.localdomain podman[69393]: 2026-02-01 08:10:51.542043779 +0000 UTC m=+0.429934112 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible)
Feb 01 08:10:51 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:10:51 np0005604215.localdomain sudo[69515]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/bin/ceilometer-rootwrap /etc/ceilometer/rootwrap.conf privsep-helper --privsep_context ceilometer.privsep.sys_admin_pctxt --privsep_sock_path /tmp/tmpj5j7yuee/privsep.sock
Feb 01 08:10:51 np0005604215.localdomain sudo[69515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 01 08:10:52 np0005604215.localdomain kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Feb 01 08:10:52 np0005604215.localdomain sudo[69515]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:54 np0005604215.localdomain ovs-vsctl[69640]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Feb 01 08:10:54 np0005604215.localdomain systemd[1]: libpod-8753f2d44977c485d36fd45e7d5b92f1e769f0230bda34ef2ee7941aa029294d.scope: Deactivated successfully.
Feb 01 08:10:54 np0005604215.localdomain systemd[1]: libpod-8753f2d44977c485d36fd45e7d5b92f1e769f0230bda34ef2ee7941aa029294d.scope: Consumed 2.850s CPU time.
Feb 01 08:10:54 np0005604215.localdomain podman[69641]: 2026-02-01 08:10:54.396401157 +0000 UTC m=+0.053539574 container died 8753f2d44977c485d36fd45e7d5b92f1e769f0230bda34ef2ee7941aa029294d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, container_name=setup_ovs_manager, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510)
Feb 01 08:10:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8753f2d44977c485d36fd45e7d5b92f1e769f0230bda34ef2ee7941aa029294d-userdata-shm.mount: Deactivated successfully.
Feb 01 08:10:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-877c65e867b205f11a32fcdb99f229d7cc1aad0815e744014cf57490bce97673-merged.mount: Deactivated successfully.
Feb 01 08:10:54 np0005604215.localdomain podman[69641]: 2026-02-01 08:10:54.439955399 +0000 UTC m=+0.097093816 container cleanup 8753f2d44977c485d36fd45e7d5b92f1e769f0230bda34ef2ee7941aa029294d (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=setup_ovs_manager, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']}, architecture=x86_64, container_name=setup_ovs_manager, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com)
Feb 01 08:10:54 np0005604215.localdomain systemd[1]: libpod-conmon-8753f2d44977c485d36fd45e7d5b92f1e769f0230bda34ef2ee7941aa029294d.scope: Deactivated successfully.
Feb 01 08:10:54 np0005604215.localdomain python3[68862]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name setup_ovs_manager --conmon-pidfile /run/setup_ovs_manager.pid --detach=False --env TRIPLEO_DEPLOY_IDENTIFIER=1769931690 --label config_id=tripleo_step4 --label container_name=setup_ovs_manager --label managed_by=tripleo_ansible --label config_data={'command': ['/container_puppet_apply.sh', '4', 'exec', 'include tripleo::profile::base::neutron::ovn_metadata'], 'detach': False, 'environment': {'TRIPLEO_DEPLOY_IDENTIFIER': '1769931690'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'privileged': True, 'start_order': 0, 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro', '/etc/puppet:/tmp/puppet-etc:ro', '/usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/setup_ovs_manager.log --network host --privileged=True --user root --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume 
/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/container-config-scripts/container_puppet_apply.sh:/container_puppet_apply.sh:ro --volume /etc/puppet:/tmp/puppet-etc:ro --volume /usr/share/openstack-puppet/modules:/usr/share/openstack-puppet/modules:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1 /container_puppet_apply.sh 4 exec include tripleo::profile::base::neutron::ovn_metadata
Feb 01 08:10:54 np0005604215.localdomain podman[69747]: 2026-02-01 08:10:54.88241431 +0000 UTC m=+0.078445620 container create e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public)
Feb 01 08:10:54 np0005604215.localdomain podman[69758]: 2026-02-01 08:10:54.929561358 +0000 UTC m=+0.099009067 container create e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 01 08:10:54 np0005604215.localdomain systemd[1]: Started libpod-conmon-e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.scope.
Feb 01 08:10:54 np0005604215.localdomain podman[69747]: 2026-02-01 08:10:54.841651477 +0000 UTC m=+0.037682827 image pull  registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 01 08:10:54 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:10:54 np0005604215.localdomain systemd[1]: Started libpod-conmon-e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.scope.
Feb 01 08:10:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae6e92d81edd57130eba0dea91809d1be824b840176ebe669287b6264f5d2d37/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae6e92d81edd57130eba0dea91809d1be824b840176ebe669287b6264f5d2d37/merged/var/log/ovn supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae6e92d81edd57130eba0dea91809d1be824b840176ebe669287b6264f5d2d37/merged/var/log/openvswitch supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:54 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:10:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d506918155a93476a6405c9e2c98cb06d7e575d23557b96e2d10a36860f0cb4c/merged/var/log/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d506918155a93476a6405c9e2c98cb06d7e575d23557b96e2d10a36860f0cb4c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d506918155a93476a6405c9e2c98cb06d7e575d23557b96e2d10a36860f0cb4c/merged/etc/neutron/kill_scripts supports timestamps until 2038 (0x7fffffff)
Feb 01 08:10:54 np0005604215.localdomain podman[69758]: 2026-02-01 08:10:54.883711232 +0000 UTC m=+0.053158981 image pull  registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 01 08:10:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:10:54 np0005604215.localdomain podman[69747]: 2026-02-01 08:10:54.993924427 +0000 UTC m=+0.189955717 container init e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2026-01-12T22:36:40Z, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510)
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:10:55 np0005604215.localdomain podman[69758]: 2026-02-01 08:10:55.015353532 +0000 UTC m=+0.184801251 container init e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:10:55 np0005604215.localdomain podman[69747]: 2026-02-01 08:10:55.037895833 +0000 UTC m=+0.233927133 container start e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, distribution-scope=public, release=1766032510)
Feb 01 08:10:55 np0005604215.localdomain sudo[69792]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:10:55 np0005604215.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 01 08:10:55 np0005604215.localdomain sudo[69792]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Feb 01 08:10:55 np0005604215.localdomain python3[68862]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck 6642 --label config_id=tripleo_step4 --label container_name=ovn_controller --label managed_by=tripleo_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_controller.log --network host --privileged=True --user root --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/log/containers/openvswitch:/var/log/openvswitch:z --volume /var/log/containers/openvswitch:/var/log/ovn:z registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 01 08:10:55 np0005604215.localdomain podman[69758]: 2026-02-01 08:10:55.081838568 +0000 UTC m=+0.251286277 container start e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 01 08:10:55 np0005604215.localdomain python3[68862]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env TRIPLEO_CONFIG_HASH=08ca8fb8877681656a098784127ead43 --healthcheck-command /openstack/healthcheck --label config_id=tripleo_step4 --label container_name=ovn_metadata_agent --label managed_by=tripleo_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/ovn_metadata_agent.log --network host --pid host --privileged=True 
--volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/neutron:/var/log/neutron:z --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro --volume /lib/modules:/lib/modules:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /run/netns:/run/netns:shared --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 01 08:10:55 np0005604215.localdomain sudo[69792]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:10:55 np0005604215.localdomain podman[69809]: 2026-02-01 08:10:55.170222315 +0000 UTC m=+0.079101331 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, container_name=ovn_metadata_agent, batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, url=https://www.redhat.com)
Feb 01 08:10:55 np0005604215.localdomain podman[69809]: 2026-02-01 08:10:55.181464365 +0000 UTC m=+0.090343361 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git)
Feb 01 08:10:55 np0005604215.localdomain podman[69809]: unhealthy
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:10:55 np0005604215.localdomain podman[69793]: 2026-02-01 08:10:55.152419135 +0000 UTC m=+0.104265155 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, distribution-scope=public, container_name=ovn_controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:10:55 np0005604215.localdomain podman[69793]: 2026-02-01 08:10:55.239516321 +0000 UTC m=+0.191362331 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, container_name=ovn_controller, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible)
Feb 01 08:10:55 np0005604215.localdomain podman[69793]: unhealthy
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Queued start job for default target Main User Target.
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Created slice User Application Slice.
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Started Daily Cleanup of User's Temporary Directories.
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Reached target Paths.
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Reached target Timers.
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Starting D-Bus User Message Bus Socket...
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Starting Create User's Volatile Files and Directories...
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Finished Create User's Volatile Files and Directories.
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Listening on D-Bus User Message Bus Socket.
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Reached target Sockets.
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Reached target Basic System.
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Reached target Main User Target.
Feb 01 08:10:55 np0005604215.localdomain systemd[69818]: Startup finished in 161ms.
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: Started User Manager for UID 0.
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: Started Session c9 of User root.
Feb 01 08:10:55 np0005604215.localdomain systemd[1]: session-c9.scope: Deactivated successfully.
Feb 01 08:10:55 np0005604215.localdomain kernel: device br-int entered promiscuous mode
Feb 01 08:10:55 np0005604215.localdomain NetworkManager[5972]: <info>  [1769933455.4220] manager: (br-int): new Generic device (/org/freedesktop/NetworkManager/Devices/11)
Feb 01 08:10:55 np0005604215.localdomain sudo[68860]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:55 np0005604215.localdomain systemd-udevd[69897]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 08:10:55 np0005604215.localdomain sudo[69915]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swhqgtbepvnoyapigjasmggtrxrtcqzc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:55 np0005604215.localdomain sudo[69915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:56 np0005604215.localdomain python3[69917]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:56 np0005604215.localdomain sudo[69915]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:56 np0005604215.localdomain sudo[69931]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-csaencwfetzrwdiiprddkvgvjcftpicb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:56 np0005604215.localdomain sudo[69931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:56 np0005604215.localdomain python3[69933]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:56 np0005604215.localdomain sudo[69931]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:56 np0005604215.localdomain sudo[69947]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jgfzuuzraikqvgdohhuursqqsxtmrjyq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:56 np0005604215.localdomain sudo[69947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:56 np0005604215.localdomain kernel: device genev_sys_6081 entered promiscuous mode
Feb 01 08:10:56 np0005604215.localdomain systemd-udevd[69899]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 08:10:56 np0005604215.localdomain NetworkManager[5972]: <info>  [1769933456.4924] device (genev_sys_6081): carrier: link connected
Feb 01 08:10:56 np0005604215.localdomain NetworkManager[5972]: <info>  [1769933456.4931] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/12)
Feb 01 08:10:56 np0005604215.localdomain python3[69949]: ansible-file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:56 np0005604215.localdomain sudo[69947]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:56 np0005604215.localdomain sudo[69961]:  neutron : PWD=/ ; USER=root ; COMMAND=/usr/bin/neutron-rootwrap /etc/neutron/rootwrap.conf privsep-helper --config-file /etc/neutron/neutron.conf --config-file /etc/neutron/plugins/networking-ovn/networking-ovn-metadata-agent.ini --privsep_context neutron.privileged.default --privsep_sock_path /tmp/tmpqnbnbt2d/privsep.sock
Feb 01 08:10:56 np0005604215.localdomain sudo[69961]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42435)
Feb 01 08:10:56 np0005604215.localdomain sudo[69968]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjebsiipktcqcoupekghsadpltzilqmj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:56 np0005604215.localdomain sudo[69968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:56 np0005604215.localdomain python3[69971]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:56 np0005604215.localdomain sudo[69968]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:56 np0005604215.localdomain sudo[69986]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlfnidczrtachzafwydwzlmjgtaqqitd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:56 np0005604215.localdomain sudo[69986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:57 np0005604215.localdomain python3[69988]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:57 np0005604215.localdomain sudo[69986]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:57 np0005604215.localdomain sudo[70002]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqwutryqlweagbsercsxnmruiumolmwp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:57 np0005604215.localdomain sudo[70002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:57 np0005604215.localdomain sudo[69961]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:57 np0005604215.localdomain python3[70004]: ansible-file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:57 np0005604215.localdomain sudo[70002]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:57 np0005604215.localdomain sudo[70020]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aiunfzyzspkxbzjdyjvmwdlauvrkfwlw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:57 np0005604215.localdomain sudo[70020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:57 np0005604215.localdomain python3[70022]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:10:57 np0005604215.localdomain sudo[70020]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:57 np0005604215.localdomain sudo[70038]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gakwlmfagcddsgkuarqfkjohixiehidq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:57 np0005604215.localdomain sudo[70038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:57 np0005604215.localdomain python3[70040]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:10:57 np0005604215.localdomain sudo[70038]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:57 np0005604215.localdomain sudo[70054]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjptizeinzazbbewrbalpbtpgccekept ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:57 np0005604215.localdomain sudo[70054]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:58 np0005604215.localdomain python3[70056]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_logrotate_crond_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:10:58 np0005604215.localdomain sudo[70054]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:58 np0005604215.localdomain sudo[70070]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kciuwlhmpdjalpmpashpetjtafrvkuze ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:58 np0005604215.localdomain sudo[70070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:58 np0005604215.localdomain python3[70072]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_migration_target_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:10:58 np0005604215.localdomain sudo[70070]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:58 np0005604215.localdomain sudo[70086]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-amjjcekzbbyxwroqmdyhiafhdxapwffy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:58 np0005604215.localdomain sudo[70086]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:58 np0005604215.localdomain python3[70088]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_controller_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:10:58 np0005604215.localdomain sudo[70086]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:58 np0005604215.localdomain sudo[70102]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyrooeyebftjbojaeakrjvelthcccwbg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:58 np0005604215.localdomain sudo[70102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:58 np0005604215.localdomain python3[70104]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:10:58 np0005604215.localdomain sudo[70102]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:59 np0005604215.localdomain sudo[70163]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymymvymiqfkahhqwjyoysnxxnpymztyc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:59 np0005604215.localdomain sudo[70163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:59 np0005604215.localdomain python3[70165]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933458.95486-108569-263935627041416/source dest=/etc/systemd/system/tripleo_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:59 np0005604215.localdomain sudo[70163]: pam_unix(sudo:session): session closed for user root
Feb 01 08:10:59 np0005604215.localdomain sudo[70193]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-umjrzgqifeljygolgnwasourjtmashlj ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:10:59 np0005604215.localdomain sudo[70193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:10:59 np0005604215.localdomain python3[70195]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933458.95486-108569-263935627041416/source dest=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:10:59 np0005604215.localdomain sudo[70193]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:00 np0005604215.localdomain sudo[70222]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyhcgnlzensjiraszpqsrnxnkpabrscc ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:11:00 np0005604215.localdomain sudo[70222]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:00 np0005604215.localdomain python3[70224]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933458.95486-108569-263935627041416/source dest=/etc/systemd/system/tripleo_logrotate_crond.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:11:00 np0005604215.localdomain sudo[70222]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:00 np0005604215.localdomain sudo[70251]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eevvjazptvywbijzjmpkudzlvfmbcdum ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:11:00 np0005604215.localdomain sudo[70251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:00 np0005604215.localdomain python3[70253]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933458.95486-108569-263935627041416/source dest=/etc/systemd/system/tripleo_nova_migration_target.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:11:00 np0005604215.localdomain sudo[70251]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:01 np0005604215.localdomain sudo[70281]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qtkirfbrdimjpveficpmxfntvqtgjqxy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:11:01 np0005604215.localdomain sudo[70281]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:01 np0005604215.localdomain python3[70283]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933458.95486-108569-263935627041416/source dest=/etc/systemd/system/tripleo_ovn_controller.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:11:01 np0005604215.localdomain sudo[70281]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:01 np0005604215.localdomain sudo[70310]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-agkoulrlglhfsovkmnsskxdacspttohd ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:11:01 np0005604215.localdomain sudo[70310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:01 np0005604215.localdomain python3[70312]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933458.95486-108569-263935627041416/source dest=/etc/systemd/system/tripleo_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:11:01 np0005604215.localdomain sudo[70310]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:02 np0005604215.localdomain sudo[70326]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdcilmxtjuhpfbrzqueudxwsbkcgyrpt ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:11:02 np0005604215.localdomain sudo[70326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:02 np0005604215.localdomain python3[70328]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 08:11:02 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:11:02 np0005604215.localdomain systemd-sysv-generator[70355]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:11:02 np0005604215.localdomain systemd-rc-local-generator[70352]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:11:02 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:11:02 np0005604215.localdomain sudo[70326]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:03 np0005604215.localdomain sudo[70378]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvpjxjavbfehitdvagyhjnliwlwwqawb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:11:03 np0005604215.localdomain sudo[70378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:03 np0005604215.localdomain python3[70380]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:11:03 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:11:03 np0005604215.localdomain systemd-rc-local-generator[70409]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:11:03 np0005604215.localdomain systemd-sysv-generator[70414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:11:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:11:03 np0005604215.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Feb 01 08:11:03 np0005604215.localdomain tripleo-start-podman-container[70420]: Creating additional drop-in dependency for "ceilometer_agent_compute" (35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9)
Feb 01 08:11:03 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:11:03 np0005604215.localdomain systemd-rc-local-generator[70474]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:11:03 np0005604215.localdomain systemd-sysv-generator[70477]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:11:04 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:11:04 np0005604215.localdomain systemd[1]: Started ceilometer_agent_compute container.
Feb 01 08:11:04 np0005604215.localdomain sudo[70378]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:04 np0005604215.localdomain sudo[70501]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntqabimvptjkczcyqjdgkauhyoqhdvnn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:11:04 np0005604215.localdomain sudo[70501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:04 np0005604215.localdomain python3[70503]: ansible-systemd Invoked with state=restarted name=tripleo_ceilometer_agent_ipmi.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:11:04 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:11:04 np0005604215.localdomain systemd-sysv-generator[70536]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:11:04 np0005604215.localdomain systemd-rc-local-generator[70533]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:11:05 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:11:05 np0005604215.localdomain systemd[1]: Starting ceilometer_agent_ipmi container...
Feb 01 08:11:05 np0005604215.localdomain systemd[1]: Started ceilometer_agent_ipmi container.
Feb 01 08:11:05 np0005604215.localdomain sudo[70501]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:05 np0005604215.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Activating special unit Exit the Session...
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Stopped target Main User Target.
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Stopped target Basic System.
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Stopped target Paths.
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Stopped target Sockets.
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Stopped target Timers.
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Closed D-Bus User Message Bus Socket.
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Stopped Create User's Volatile Files and Directories.
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Removed slice User Application Slice.
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Reached target Shutdown.
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Finished Exit the Session.
Feb 01 08:11:05 np0005604215.localdomain systemd[69818]: Reached target Exit the Session.
Feb 01 08:11:05 np0005604215.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 01 08:11:05 np0005604215.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 01 08:11:05 np0005604215.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 01 08:11:05 np0005604215.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 01 08:11:05 np0005604215.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 01 08:11:05 np0005604215.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 01 08:11:05 np0005604215.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 01 08:11:05 np0005604215.localdomain sudo[70572]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjwjlihbmbdabtgnndncflyosgnkiadw ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:11:05 np0005604215.localdomain sudo[70572]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:06 np0005604215.localdomain python3[70574]: ansible-systemd Invoked with state=restarted name=tripleo_logrotate_crond.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:11:06 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:11:06 np0005604215.localdomain systemd-rc-local-generator[70602]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:11:06 np0005604215.localdomain systemd-sysv-generator[70606]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:11:06 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:11:06 np0005604215.localdomain systemd[1]: Starting logrotate_crond container...
Feb 01 08:11:06 np0005604215.localdomain systemd[1]: Started logrotate_crond container.
Feb 01 08:11:06 np0005604215.localdomain sudo[70572]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:06 np0005604215.localdomain sudo[70639]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qvhpqkpmlhbhmwjpsgwihtdetyfezxic ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:11:06 np0005604215.localdomain sudo[70639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:07 np0005604215.localdomain python3[70641]: ansible-systemd Invoked with state=restarted name=tripleo_nova_migration_target.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:11:07 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:11:07 np0005604215.localdomain systemd-rc-local-generator[70673]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:11:07 np0005604215.localdomain systemd-sysv-generator[70676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:11:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:11:07 np0005604215.localdomain systemd[1]: Starting nova_migration_target container...
Feb 01 08:11:07 np0005604215.localdomain systemd[1]: Started nova_migration_target container.
Feb 01 08:11:07 np0005604215.localdomain sudo[70639]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:08 np0005604215.localdomain sudo[70707]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kfhetipvwhgjausagrelpthxlpfduyds ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:11:08 np0005604215.localdomain sudo[70707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:08 np0005604215.localdomain python3[70709]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:11:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:11:08 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:11:08 np0005604215.localdomain systemd-rc-local-generator[70744]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:11:08 np0005604215.localdomain systemd-sysv-generator[70748]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:11:08 np0005604215.localdomain podman[70711]: 2026-02-01 08:11:08.438266365 +0000 UTC m=+0.104510723 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:11:08 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:11:08 np0005604215.localdomain systemd[1]: Starting ovn_controller container...
Feb 01 08:11:08 np0005604215.localdomain podman[70711]: 2026-02-01 08:11:08.648887731 +0000 UTC m=+0.315132119 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510)
Feb 01 08:11:08 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:11:08 np0005604215.localdomain tripleo-start-podman-container[70776]: Creating additional drop-in dependency for "ovn_controller" (e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257)
Feb 01 08:11:08 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:11:08 np0005604215.localdomain systemd-sysv-generator[70835]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:11:08 np0005604215.localdomain systemd-rc-local-generator[70830]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:11:08 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:11:09 np0005604215.localdomain systemd[1]: Started ovn_controller container.
Feb 01 08:11:09 np0005604215.localdomain sudo[70707]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:09 np0005604215.localdomain sudo[70859]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jlbqnufyhbtezrbsesmnbhjsrdozhobv ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:11:09 np0005604215.localdomain sudo[70859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:09 np0005604215.localdomain python3[70861]: ansible-systemd Invoked with state=restarted name=tripleo_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:11:09 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:11:09 np0005604215.localdomain systemd-sysv-generator[70889]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:11:09 np0005604215.localdomain systemd-rc-local-generator[70886]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:11:10 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:11:10 np0005604215.localdomain systemd[1]: Starting ovn_metadata_agent container...
Feb 01 08:11:10 np0005604215.localdomain systemd[1]: Started ovn_metadata_agent container.
Feb 01 08:11:10 np0005604215.localdomain sudo[70859]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:10 np0005604215.localdomain sudo[70941]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiybmriqzgykcwzwbgdvajolitvcroit ; /usr/bin/python3
Feb 01 08:11:10 np0005604215.localdomain sudo[70941]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:10 np0005604215.localdomain python3[70943]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks4.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:11:10 np0005604215.localdomain sudo[70941]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:11 np0005604215.localdomain sudo[70989]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mfcbocotybaeylbydkltwekycyfvgdrf ; /usr/bin/python3
Feb 01 08:11:11 np0005604215.localdomain sudo[70989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:11 np0005604215.localdomain sudo[70989]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:11 np0005604215.localdomain sudo[71032]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjaxztgbyxmenvtwcpbwtcojpxrovxbf ; /usr/bin/python3
Feb 01 08:11:11 np0005604215.localdomain sudo[71032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:11 np0005604215.localdomain sudo[71032]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:11 np0005604215.localdomain sudo[71062]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkbfozwhcstrszbdqzumzbrkwibduxnn ; /usr/bin/python3
Feb 01 08:11:11 np0005604215.localdomain sudo[71062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:12 np0005604215.localdomain python3[71064]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks4.json short_hostname=np0005604215 step=4 update_config_hash_only=False
Feb 01 08:11:12 np0005604215.localdomain sudo[71062]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:12 np0005604215.localdomain sudo[71078]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aoglioemlegzbhovwtijtaiudbbezynx ; /usr/bin/python3
Feb 01 08:11:12 np0005604215.localdomain sudo[71078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:12 np0005604215.localdomain python3[71080]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:11:12 np0005604215.localdomain sudo[71078]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:12 np0005604215.localdomain sudo[71094]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gswgtrsuxwjodmkndayvwnvigsreadsl ; /usr/bin/python3
Feb 01 08:11:12 np0005604215.localdomain sudo[71094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:11:12 np0005604215.localdomain python3[71096]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_4 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 01 08:11:12 np0005604215.localdomain sudo[71094]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:11:17 np0005604215.localdomain systemd[1]: tmp-crun.16ISi1.mount: Deactivated successfully.
Feb 01 08:11:17 np0005604215.localdomain podman[71098]: 2026-02-01 08:11:17.87310671 +0000 UTC m=+0.088785791 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
container_name=collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:11:17 np0005604215.localdomain podman[71098]: 2026-02-01 08:11:17.93565132 +0000 UTC m=+0.151330351 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git)
Feb 01 08:11:17 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:11:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:11:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:11:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:11:20 np0005604215.localdomain podman[71119]: 2026-02-01 08:11:20.894673967 +0000 UTC m=+0.109035298 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, container_name=iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3)
Feb 01 08:11:20 np0005604215.localdomain systemd[1]: tmp-crun.N2KxfN.mount: Deactivated successfully.
Feb 01 08:11:20 np0005604215.localdomain systemd[1]: tmp-crun.6oV0cG.mount: Deactivated successfully.
Feb 01 08:11:20 np0005604215.localdomain podman[71120]: 2026-02-01 08:11:20.937677512 +0000 UTC m=+0.150599558 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=starting, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team)
Feb 01 08:11:20 np0005604215.localdomain podman[71119]: 2026-02-01 08:11:20.955661667 +0000 UTC m=+0.170023018 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, version=17.1.13, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, com.redhat.component=openstack-iscsid-container)
Feb 01 08:11:20 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:11:21 np0005604215.localdomain podman[71148]: 2026-02-01 08:11:21.038263439 +0000 UTC m=+0.135605709 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:11:21 np0005604215.localdomain podman[71120]: 2026-02-01 08:11:21.044688964 +0000 UTC m=+0.257611010 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
build-date=2026-01-12T23:07:30Z, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:11:21 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:11:21 np0005604215.localdomain podman[71148]: 2026-02-01 08:11:21.074685164 +0000 UTC m=+0.172027454 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:11:21 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:11:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:11:21 np0005604215.localdomain podman[71181]: 2026-02-01 08:11:21.177686548 +0000 UTC m=+0.073470081 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=starting, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:11:21 np0005604215.localdomain podman[71181]: 2026-02-01 08:11:21.213805092 +0000 UTC m=+0.109588655 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, version=17.1.13, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:11:21 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:11:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:11:21 np0005604215.localdomain podman[71208]: 2026-02-01 08:11:21.858908065 +0000 UTC m=+0.077899723 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T23:32:04Z, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Feb 01 08:11:22 np0005604215.localdomain podman[71208]: 2026-02-01 08:11:22.202689949 +0000 UTC m=+0.421681577 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:11:22 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:11:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:11:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:11:25 np0005604215.localdomain podman[71231]: 2026-02-01 08:11:25.871562957 +0000 UTC m=+0.081147457 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=starting, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, config_id=tripleo_step4, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, release=1766032510)
Feb 01 08:11:25 np0005604215.localdomain podman[71230]: 2026-02-01 08:11:25.930254284 +0000 UTC m=+0.142111176 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=starting, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 01 08:11:25 np0005604215.localdomain podman[71231]: 2026-02-01 08:11:25.950518142 +0000 UTC m=+0.160102662 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=ovn_controller, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, 
build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 01 08:11:25 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:11:26 np0005604215.localdomain podman[71230]: 2026-02-01 08:11:26.000900003 +0000 UTC m=+0.212756845 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 01 08:11:26 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:11:34 np0005604215.localdomain snmpd[67757]: empty variable list in _query
Feb 01 08:11:34 np0005604215.localdomain snmpd[67757]: empty variable list in _query
Feb 01 08:11:35 np0005604215.localdomain sudo[71279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:11:35 np0005604215.localdomain sudo[71279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:11:35 np0005604215.localdomain sudo[71279]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:35 np0005604215.localdomain sudo[71294]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 08:11:35 np0005604215.localdomain sudo[71294]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:11:35 np0005604215.localdomain sudo[71294]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:36 np0005604215.localdomain sudo[71331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:11:36 np0005604215.localdomain sudo[71331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:11:36 np0005604215.localdomain sudo[71331]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:36 np0005604215.localdomain sudo[71346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:11:36 np0005604215.localdomain sudo[71346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:11:36 np0005604215.localdomain sudo[71346]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:37 np0005604215.localdomain sudo[71392]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:11:37 np0005604215.localdomain sudo[71392]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:11:37 np0005604215.localdomain sudo[71392]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:37 np0005604215.localdomain sudo[71407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e -- inventory --format=json-pretty --filter-for-batch
Feb 01 08:11:37 np0005604215.localdomain sudo[71407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:11:37 np0005604215.localdomain podman[71462]: 2026-02-01 08:11:37.69056746 +0000 UTC m=+0.087311153 container create 5a15f0085ce89435566d8ba897baad1bb098ea7c11f0f155254e8c0194ab996e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_galileo, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 08:11:37 np0005604215.localdomain systemd[1]: Started libpod-conmon-5a15f0085ce89435566d8ba897baad1bb098ea7c11f0f155254e8c0194ab996e.scope.
Feb 01 08:11:37 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:11:37 np0005604215.localdomain podman[71462]: 2026-02-01 08:11:37.652242494 +0000 UTC m=+0.048986287 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 08:11:37 np0005604215.localdomain podman[71462]: 2026-02-01 08:11:37.758997229 +0000 UTC m=+0.155740922 container init 5a15f0085ce89435566d8ba897baad1bb098ea7c11f0f155254e8c0194ab996e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_galileo, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, version=7, release=1764794109, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True)
Feb 01 08:11:37 np0005604215.localdomain podman[71462]: 2026-02-01 08:11:37.768032698 +0000 UTC m=+0.164776391 container start 5a15f0085ce89435566d8ba897baad1bb098ea7c11f0f155254e8c0194ab996e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_galileo, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, version=7, io.openshift.expose-services=, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True, io.buildah.version=1.41.4, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, name=rhceph)
Feb 01 08:11:37 np0005604215.localdomain podman[71462]: 2026-02-01 08:11:37.768230154 +0000 UTC m=+0.164973877 container attach 5a15f0085ce89435566d8ba897baad1bb098ea7c11f0f155254e8c0194ab996e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_galileo, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, description=Red Hat Ceph Storage 7, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 08:11:37 np0005604215.localdomain vibrant_galileo[71477]: 167 167
Feb 01 08:11:37 np0005604215.localdomain systemd[1]: libpod-5a15f0085ce89435566d8ba897baad1bb098ea7c11f0f155254e8c0194ab996e.scope: Deactivated successfully.
Feb 01 08:11:37 np0005604215.localdomain podman[71462]: 2026-02-01 08:11:37.771469137 +0000 UTC m=+0.168212870 container died 5a15f0085ce89435566d8ba897baad1bb098ea7c11f0f155254e8c0194ab996e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_galileo, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, name=rhceph, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 08:11:37 np0005604215.localdomain podman[71482]: 2026-02-01 08:11:37.852497889 +0000 UTC m=+0.069498574 container remove 5a15f0085ce89435566d8ba897baad1bb098ea7c11f0f155254e8c0194ab996e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vibrant_galileo, io.buildah.version=1.41.4, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, com.redhat.component=rhceph-container, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 08:11:37 np0005604215.localdomain systemd[1]: libpod-conmon-5a15f0085ce89435566d8ba897baad1bb098ea7c11f0f155254e8c0194ab996e.scope: Deactivated successfully.
Feb 01 08:11:38 np0005604215.localdomain podman[71504]: 
Feb 01 08:11:38 np0005604215.localdomain podman[71504]: 2026-02-01 08:11:38.07076283 +0000 UTC m=+0.074366410 container create 9c041cfcdfac3e8e83e127562a1757b4394b8eb619eeedd29c2cf3217b05c629 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_clarke, vcs-type=git, CEPH_POINT_RELEASE=, version=7, architecture=x86_64, RELEASE=main, io.openshift.expose-services=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109)
Feb 01 08:11:38 np0005604215.localdomain systemd[1]: Started libpod-conmon-9c041cfcdfac3e8e83e127562a1757b4394b8eb619eeedd29c2cf3217b05c629.scope.
Feb 01 08:11:38 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:11:38 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e956024c4ea0a50cd3fc62ced1d5cc76cf0f4c15a20dc99218f407d2180e04d/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 08:11:38 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e956024c4ea0a50cd3fc62ced1d5cc76cf0f4c15a20dc99218f407d2180e04d/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 08:11:38 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3e956024c4ea0a50cd3fc62ced1d5cc76cf0f4c15a20dc99218f407d2180e04d/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 08:11:38 np0005604215.localdomain podman[71504]: 2026-02-01 08:11:38.042253148 +0000 UTC m=+0.045856778 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 08:11:38 np0005604215.localdomain podman[71504]: 2026-02-01 08:11:38.142136002 +0000 UTC m=+0.145739582 container init 9c041cfcdfac3e8e83e127562a1757b4394b8eb619eeedd29c2cf3217b05c629 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_clarke, distribution-scope=public, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 08:11:38 np0005604215.localdomain podman[71504]: 2026-02-01 08:11:38.151188082 +0000 UTC m=+0.154791652 container start 9c041cfcdfac3e8e83e127562a1757b4394b8eb619eeedd29c2cf3217b05c629 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_clarke, release=1764794109, ceph=True, vendor=Red Hat, Inc., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.buildah.version=1.41.4, architecture=x86_64)
Feb 01 08:11:38 np0005604215.localdomain podman[71504]: 2026-02-01 08:11:38.15142332 +0000 UTC m=+0.155026900 container attach 9c041cfcdfac3e8e83e127562a1757b4394b8eb619eeedd29c2cf3217b05c629 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_clarke, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, distribution-scope=public, release=1764794109, vendor=Red Hat, Inc., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, GIT_BRANCH=main, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 08:11:38 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-353506b63b93b75ce8538100a86eed5b102bad6aeda38da254d7ef347a8b2b60-merged.mount: Deactivated successfully.
Feb 01 08:11:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:11:38 np0005604215.localdomain podman[71683]: 2026-02-01 08:11:38.798434362 +0000 UTC m=+0.071931811 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:11:39 np0005604215.localdomain podman[71683]: 2026-02-01 08:11:39.000696081 +0000 UTC m=+0.274193550 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container)
Feb 01 08:11:39 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]: [
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:     {
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:         "available": false,
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:         "ceph_device": false,
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:         "lsm_data": {},
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:         "lvs": [],
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:         "path": "/dev/sr0",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:         "rejected_reasons": [
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "Insufficient space (<5GB)",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "Has a FileSystem"
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:         ],
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:         "sys_api": {
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "actuators": null,
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "device_nodes": "sr0",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "human_readable_size": "482.00 KB",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "id_bus": "ata",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "model": "QEMU DVD-ROM",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "nr_requests": "2",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "partitions": {},
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "path": "/dev/sr0",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "removable": "1",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "rev": "2.5+",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "ro": "0",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "rotational": "1",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "sas_address": "",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "sas_device_handle": "",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "scheduler_mode": "mq-deadline",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "sectors": 0,
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "sectorsize": "2048",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "size": 493568.0,
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "support_discard": "0",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "type": "disk",
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:             "vendor": "QEMU"
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:         }
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]:     }
Feb 01 08:11:39 np0005604215.localdomain tender_clarke[71519]: ]
Feb 01 08:11:39 np0005604215.localdomain systemd[1]: libpod-9c041cfcdfac3e8e83e127562a1757b4394b8eb619eeedd29c2cf3217b05c629.scope: Deactivated successfully.
Feb 01 08:11:39 np0005604215.localdomain systemd[1]: libpod-9c041cfcdfac3e8e83e127562a1757b4394b8eb619eeedd29c2cf3217b05c629.scope: Consumed 1.093s CPU time.
Feb 01 08:11:39 np0005604215.localdomain podman[71504]: 2026-02-01 08:11:39.23520846 +0000 UTC m=+1.238812060 container died 9c041cfcdfac3e8e83e127562a1757b4394b8eb619eeedd29c2cf3217b05c629 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_clarke, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, distribution-scope=public, GIT_BRANCH=main, ceph=True, architecture=x86_64, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 08:11:39 np0005604215.localdomain podman[73566]: 2026-02-01 08:11:39.354976001 +0000 UTC m=+0.109562565 container remove 9c041cfcdfac3e8e83e127562a1757b4394b8eb619eeedd29c2cf3217b05c629 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=tender_clarke, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_BRANCH=main, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, GIT_CLEAN=True, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=rhceph-container, release=1764794109, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7)
Feb 01 08:11:39 np0005604215.localdomain systemd[1]: libpod-conmon-9c041cfcdfac3e8e83e127562a1757b4394b8eb619eeedd29c2cf3217b05c629.scope: Deactivated successfully.
Feb 01 08:11:39 np0005604215.localdomain sudo[71407]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:39 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-3e956024c4ea0a50cd3fc62ced1d5cc76cf0f4c15a20dc99218f407d2180e04d-merged.mount: Deactivated successfully.
Feb 01 08:11:39 np0005604215.localdomain sudo[73579]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:11:39 np0005604215.localdomain sudo[73579]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:11:39 np0005604215.localdomain sudo[73579]: pam_unix(sudo:session): session closed for user root
Feb 01 08:11:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:11:48 np0005604215.localdomain systemd[1]: tmp-crun.3jHdLq.mount: Deactivated successfully.
Feb 01 08:11:48 np0005604215.localdomain podman[73594]: 2026-02-01 08:11:48.881129967 +0000 UTC m=+0.096713375 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, config_id=tripleo_step3, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=)
Feb 01 08:11:48 np0005604215.localdomain podman[73594]: 2026-02-01 08:11:48.892019135 +0000 UTC m=+0.107602533 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container)
Feb 01 08:11:48 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:11:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:11:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:11:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:11:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:11:51 np0005604215.localdomain podman[73616]: 2026-02-01 08:11:51.875702709 +0000 UTC m=+0.082957045 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, 
container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5)
Feb 01 08:11:51 np0005604215.localdomain systemd[1]: tmp-crun.5IqZ5U.mount: Deactivated successfully.
Feb 01 08:11:51 np0005604215.localdomain podman[73615]: 2026-02-01 08:11:51.933548718 +0000 UTC m=+0.140753352 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 01 08:11:51 np0005604215.localdomain podman[73615]: 2026-02-01 08:11:51.942799215 +0000 UTC m=+0.150003869 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:34:43Z, distribution-scope=public, container_name=iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 01 08:11:51 np0005604215.localdomain podman[73616]: 2026-02-01 08:11:51.958803756 +0000 UTC m=+0.166058142 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:11:51 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:11:51 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:11:52 np0005604215.localdomain podman[73617]: 2026-02-01 08:11:52.038252118 +0000 UTC m=+0.241582828 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.buildah.version=1.41.5)
Feb 01 08:11:52 np0005604215.localdomain podman[73614]: 2026-02-01 08:11:52.084184846 +0000 UTC m=+0.294217550 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, container_name=logrotate_crond, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:11:52 np0005604215.localdomain podman[73614]: 2026-02-01 08:11:52.089465435 +0000 UTC m=+0.299498189 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:11:52 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:11:52 np0005604215.localdomain podman[73617]: 2026-02-01 08:11:52.108502044 +0000 UTC m=+0.311832694 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:11:52 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:11:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:11:54 np0005604215.localdomain podman[73706]: 2026-02-01 08:11:54.026177075 +0000 UTC m=+0.075300169 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 01 08:11:54 np0005604215.localdomain podman[73706]: 2026-02-01 08:11:54.406701695 +0000 UTC m=+0.455824799 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:11:54 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:11:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:11:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:11:56 np0005604215.localdomain podman[73730]: 2026-02-01 08:11:56.854152578 +0000 UTC m=+0.070382602 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, container_name=ovn_controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.13, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team)
Feb 01 08:11:56 np0005604215.localdomain podman[73730]: 2026-02-01 08:11:56.877609218 +0000 UTC m=+0.093839252 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, 
name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:11:56 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:11:56 np0005604215.localdomain podman[73729]: 2026-02-01 08:11:56.964524168 +0000 UTC m=+0.179152441 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, architecture=x86_64, config_id=tripleo_step4, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:11:57 np0005604215.localdomain podman[73729]: 2026-02-01 08:11:57.042501942 +0000 UTC m=+0.257130165 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., version=17.1.13)
Feb 01 08:11:57 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:12:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:12:09 np0005604215.localdomain podman[73778]: 2026-02-01 08:12:09.866381086 +0000 UTC m=+0.082344075 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 
17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:12:10 np0005604215.localdomain podman[73778]: 2026-02-01 08:12:10.08381454 +0000 UTC m=+0.299777539 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, distribution-scope=public, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team)
Feb 01 08:12:10 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:12:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:12:19 np0005604215.localdomain systemd[1]: tmp-crun.TqhkHn.mount: Deactivated successfully.
Feb 01 08:12:19 np0005604215.localdomain podman[73808]: 2026-02-01 08:12:19.869104632 +0000 UTC m=+0.084482643 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, architecture=x86_64, config_id=tripleo_step3, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, container_name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:12:19 np0005604215.localdomain podman[73808]: 2026-02-01 08:12:19.905604059 +0000 UTC m=+0.120982040 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 01 08:12:19 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:12:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:12:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:12:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:12:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:12:22 np0005604215.localdomain systemd[1]: tmp-crun.MU4ZH8.mount: Deactivated successfully.
Feb 01 08:12:22 np0005604215.localdomain podman[73838]: 2026-02-01 08:12:22.891777783 +0000 UTC m=+0.085445803 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, architecture=x86_64, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 01 08:12:22 np0005604215.localdomain systemd[1]: tmp-crun.QHbXGK.mount: Deactivated successfully.
Feb 01 08:12:22 np0005604215.localdomain podman[73831]: 2026-02-01 08:12:22.935529793 +0000 UTC m=+0.136471906 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:12:22 np0005604215.localdomain podman[73838]: 2026-02-01 08:12:22.939245572 +0000 UTC m=+0.132913552 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public)
Feb 01 08:12:22 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:12:22 np0005604215.localdomain podman[73829]: 2026-02-01 08:12:22.863767188 +0000 UTC m=+0.071211868 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, batch=17.1_20260112.1)
Feb 01 08:12:22 np0005604215.localdomain podman[73830]: 2026-02-01 08:12:22.979929243 +0000 UTC m=+0.179214812 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.expose-services=, container_name=iscsid, com.redhat.component=openstack-iscsid-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z)
Feb 01 08:12:22 np0005604215.localdomain podman[73830]: 2026-02-01 08:12:22.991696389 +0000 UTC m=+0.190982028 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, version=17.1.13, 
maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, architecture=x86_64, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, vendor=Red Hat, Inc.)
Feb 01 08:12:23 np0005604215.localdomain podman[73829]: 2026-02-01 08:12:23.000898473 +0000 UTC m=+0.208343173 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=logrotate_crond, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 01 08:12:23 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:12:23 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:12:23 np0005604215.localdomain podman[73831]: 2026-02-01 08:12:23.042887946 +0000 UTC m=+0.243830079 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:12:23 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:12:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:12:24 np0005604215.localdomain systemd[1]: tmp-crun.UYY8vW.mount: Deactivated successfully.
Feb 01 08:12:24 np0005604215.localdomain podman[73916]: 2026-02-01 08:12:24.859698321 +0000 UTC m=+0.079100690 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.5, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:12:25 np0005604215.localdomain podman[73916]: 2026-02-01 08:12:25.236552923 +0000 UTC m=+0.455955212 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:12:25 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:12:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:12:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:12:27 np0005604215.localdomain podman[73939]: 2026-02-01 08:12:27.860399359 +0000 UTC m=+0.079732121 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ovn_metadata_agent)
Feb 01 08:12:27 np0005604215.localdomain systemd[1]: tmp-crun.gpYMXx.mount: Deactivated successfully.
Feb 01 08:12:27 np0005604215.localdomain podman[73940]: 2026-02-01 08:12:27.912398732 +0000 UTC m=+0.130855265 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible)
Feb 01 08:12:27 np0005604215.localdomain podman[73940]: 2026-02-01 08:12:27.958684703 +0000 UTC m=+0.177141266 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, release=1766032510, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public)
Feb 01 08:12:27 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:12:28 np0005604215.localdomain podman[73939]: 2026-02-01 08:12:28.015575872 +0000 UTC m=+0.234908654 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, release=1766032510, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5)
Feb 01 08:12:28 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:12:40 np0005604215.localdomain sudo[73988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:12:40 np0005604215.localdomain sudo[73988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:12:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:12:40 np0005604215.localdomain sudo[73988]: pam_unix(sudo:session): session closed for user root
Feb 01 08:12:40 np0005604215.localdomain sudo[74009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:12:40 np0005604215.localdomain sudo[74009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:12:40 np0005604215.localdomain systemd[1]: tmp-crun.suJi5q.mount: Deactivated successfully.
Feb 01 08:12:40 np0005604215.localdomain podman[74002]: 2026-02-01 08:12:40.232476813 +0000 UTC m=+0.098599645 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., 
version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Feb 01 08:12:40 np0005604215.localdomain podman[74002]: 2026-02-01 08:12:40.421973903 +0000 UTC m=+0.288096735 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:12:40 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:12:40 np0005604215.localdomain sudo[74009]: pam_unix(sudo:session): session closed for user root
Feb 01 08:12:42 np0005604215.localdomain sudo[74078]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:12:42 np0005604215.localdomain sudo[74078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:12:42 np0005604215.localdomain sudo[74078]: pam_unix(sudo:session): session closed for user root
Feb 01 08:12:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:12:50 np0005604215.localdomain podman[74093]: 2026-02-01 08:12:50.894048021 +0000 UTC m=+0.097420037 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, version=17.1.13, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, release=1766032510, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:12:50 np0005604215.localdomain podman[74093]: 2026-02-01 08:12:50.935708294 +0000 UTC m=+0.139080260 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, batch=17.1_20260112.1, version=17.1.13, 
vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, container_name=collectd, architecture=x86_64, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:12:50 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:12:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:12:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:12:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:12:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:12:53 np0005604215.localdomain systemd[1]: tmp-crun.oqzvO5.mount: Deactivated successfully.
Feb 01 08:12:53 np0005604215.localdomain podman[74115]: 2026-02-01 08:12:53.908904332 +0000 UTC m=+0.112406815 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:12:53 np0005604215.localdomain podman[74113]: 2026-02-01 08:12:53.874143041 +0000 UTC m=+0.087168318 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, name=rhosp-rhel9/openstack-cron, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond)
Feb 01 08:12:53 np0005604215.localdomain podman[74115]: 2026-02-01 08:12:53.933260482 +0000 UTC m=+0.136762945 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 01 08:12:53 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:12:53 np0005604215.localdomain podman[74113]: 2026-02-01 08:12:53.957831818 +0000 UTC m=+0.170857085 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, url=https://www.redhat.com, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:12:53 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:12:54 np0005604215.localdomain podman[74114]: 2026-02-01 08:12:54.034171349 +0000 UTC m=+0.243699145 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T22:34:43Z)
Feb 01 08:12:54 np0005604215.localdomain podman[74114]: 2026-02-01 08:12:54.044596983 +0000 UTC m=+0.254124799 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:12:54 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:12:54 np0005604215.localdomain podman[74116]: 2026-02-01 08:12:54.099960534 +0000 UTC m=+0.302613279 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5)
Feb 01 08:12:54 np0005604215.localdomain podman[74116]: 2026-02-01 08:12:54.130660445 +0000 UTC m=+0.333313180 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, tcib_managed=true, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:12:54 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:12:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:12:56 np0005604215.localdomain podman[74200]: 2026-02-01 08:12:56.017277493 +0000 UTC m=+0.079523144 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5)
Feb 01 08:12:56 np0005604215.localdomain podman[74200]: 2026-02-01 08:12:56.343498756 +0000 UTC m=+0.405744407 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, tcib_managed=true, vcs-type=git, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.buildah.version=1.41.5)
Feb 01 08:12:56 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:12:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:12:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:12:58 np0005604215.localdomain systemd[1]: tmp-crun.QWpGqS.mount: Deactivated successfully.
Feb 01 08:12:58 np0005604215.localdomain podman[74224]: 2026-02-01 08:12:58.862475028 +0000 UTC m=+0.081528329 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510)
Feb 01 08:12:58 np0005604215.localdomain podman[74225]: 2026-02-01 08:12:58.913557252 +0000 UTC m=+0.131429565 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, release=1766032510, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:12:58 np0005604215.localdomain podman[74224]: 2026-02-01 08:12:58.924784251 +0000 UTC m=+0.143837512 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com)
Feb 01 08:12:58 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:12:58 np0005604215.localdomain podman[74225]: 2026-02-01 08:12:58.936833096 +0000 UTC m=+0.154705429 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red 
Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git)
Feb 01 08:12:58 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:13:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:13:10 np0005604215.localdomain podman[74272]: 2026-02-01 08:13:10.863477693 +0000 UTC m=+0.079556876 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=metrics_qdr, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z)
Feb 01 08:13:11 np0005604215.localdomain podman[74272]: 2026-02-01 08:13:11.060672669 +0000 UTC m=+0.276751872 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, config_id=tripleo_step1, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:13:11 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:13:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:13:21 np0005604215.localdomain podman[74300]: 2026-02-01 08:13:21.867462921 +0000 UTC m=+0.082386406 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, vcs-type=git)
Feb 01 08:13:21 np0005604215.localdomain podman[74300]: 2026-02-01 08:13:21.902655517 +0000 UTC m=+0.117579022 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public)
Feb 01 08:13:21 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:13:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:13:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:13:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:13:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:13:24 np0005604215.localdomain systemd[1]: tmp-crun.MKupGY.mount: Deactivated successfully.
Feb 01 08:13:24 np0005604215.localdomain podman[74321]: 2026-02-01 08:13:24.878901493 +0000 UTC m=+0.090127724 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 
cron, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, architecture=x86_64, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:13:24 np0005604215.localdomain podman[74321]: 2026-02-01 08:13:24.891677121 +0000 UTC m=+0.102903382 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, managed_by=tripleo_ansible, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
cron, com.redhat.component=openstack-cron-container, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:13:24 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:13:24 np0005604215.localdomain podman[74322]: 2026-02-01 08:13:24.933371395 +0000 UTC m=+0.141466336 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, release=1766032510, config_id=tripleo_step3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=)
Feb 01 08:13:24 np0005604215.localdomain podman[74322]: 2026-02-01 08:13:24.946113813 +0000 UTC m=+0.154208694 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 01 08:13:24 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:13:25 np0005604215.localdomain podman[74323]: 2026-02-01 08:13:25.000656927 +0000 UTC m=+0.207759916 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com)
Feb 01 08:13:25 np0005604215.localdomain podman[74323]: 2026-02-01 08:13:25.029665754 +0000 UTC m=+0.236768713 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 01 08:13:25 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:13:25 np0005604215.localdomain podman[74324]: 2026-02-01 08:13:25.087203865 +0000 UTC m=+0.288558400 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, tcib_managed=true)
Feb 01 08:13:25 np0005604215.localdomain podman[74324]: 2026-02-01 08:13:25.117797693 +0000 UTC m=+0.319152308 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
architecture=x86_64, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:13:25 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:13:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:13:26 np0005604215.localdomain podman[74410]: 2026-02-01 08:13:26.864961041 +0000 UTC m=+0.081793367 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., 
url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:13:27 np0005604215.localdomain podman[74410]: 2026-02-01 08:13:27.258607221 +0000 UTC m=+0.475439507 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, architecture=x86_64, 
description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team)
Feb 01 08:13:27 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:13:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:13:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:13:29 np0005604215.localdomain systemd[1]: tmp-crun.ZLuQeI.mount: Deactivated successfully.
Feb 01 08:13:29 np0005604215.localdomain podman[74433]: 2026-02-01 08:13:29.878619725 +0000 UTC m=+0.088913914 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git)
Feb 01 08:13:29 np0005604215.localdomain podman[74434]: 2026-02-01 08:13:29.919487983 +0000 UTC m=+0.126307681 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:13:29 np0005604215.localdomain podman[74433]: 2026-02-01 08:13:29.967904251 +0000 UTC m=+0.178198430 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public)
Feb 01 08:13:29 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:13:30 np0005604215.localdomain podman[74434]: 2026-02-01 08:13:30.024323965 +0000 UTC m=+0.231143683 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, architecture=x86_64, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:13:30 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:13:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:13:41 np0005604215.localdomain podman[74479]: 2026-02-01 08:13:41.878410703 +0000 UTC m=+0.091753776 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container)
Feb 01 08:13:42 np0005604215.localdomain podman[74479]: 2026-02-01 08:13:42.078734989 +0000 UTC m=+0.292078102 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:13:42 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:13:42 np0005604215.localdomain sudo[74508]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:13:42 np0005604215.localdomain sudo[74508]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:13:42 np0005604215.localdomain sudo[74508]: pam_unix(sudo:session): session closed for user root
Feb 01 08:13:42 np0005604215.localdomain sudo[74523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 08:13:42 np0005604215.localdomain sudo[74523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:13:43 np0005604215.localdomain podman[74609]: 2026-02-01 08:13:43.465512281 +0000 UTC m=+0.099766431 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., ceph=True, version=7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, GIT_BRANCH=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:13:43 np0005604215.localdomain podman[74609]: 2026-02-01 08:13:43.596746329 +0000 UTC m=+0.231000469 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.buildah.version=1.41.4, RELEASE=main, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, GIT_CLEAN=True, ceph=True, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, description=Red Hat Ceph Storage 7, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main)
Feb 01 08:13:43 np0005604215.localdomain sudo[74523]: pam_unix(sudo:session): session closed for user root
Feb 01 08:13:43 np0005604215.localdomain sudo[74673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:13:43 np0005604215.localdomain sudo[74673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:13:43 np0005604215.localdomain sudo[74673]: pam_unix(sudo:session): session closed for user root
Feb 01 08:13:44 np0005604215.localdomain sudo[74688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:13:44 np0005604215.localdomain sudo[74688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:13:44 np0005604215.localdomain sudo[74688]: pam_unix(sudo:session): session closed for user root
Feb 01 08:13:45 np0005604215.localdomain sudo[74735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:13:45 np0005604215.localdomain sudo[74735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:13:45 np0005604215.localdomain sudo[74735]: pam_unix(sudo:session): session closed for user root
Feb 01 08:13:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:13:52 np0005604215.localdomain podman[74750]: 2026-02-01 08:13:52.904708763 +0000 UTC m=+0.117183601 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, version=17.1.13, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:13:52 np0005604215.localdomain podman[74750]: 2026-02-01 08:13:52.917701577 +0000 UTC m=+0.130176375 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.buildah.version=1.41.5, container_name=collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:13:52 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:13:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:13:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:13:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:13:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:13:55 np0005604215.localdomain podman[74769]: 2026-02-01 08:13:55.855568298 +0000 UTC m=+0.066302617 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, vendor=Red Hat, 
Inc., url=https://www.redhat.com, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, container_name=logrotate_crond, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible)
Feb 01 08:13:55 np0005604215.localdomain podman[74769]: 2026-02-01 08:13:55.887824148 +0000 UTC m=+0.098558457 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.13, architecture=x86_64, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5)
Feb 01 08:13:55 np0005604215.localdomain systemd[1]: tmp-crun.u2fwxg.mount: Deactivated successfully.
Feb 01 08:13:55 np0005604215.localdomain podman[74771]: 2026-02-01 08:13:55.911501374 +0000 UTC m=+0.115324072 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-type=git, build-date=2026-01-12T23:07:47Z, distribution-scope=public, config_id=tripleo_step4, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:13:55 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:13:55 np0005604215.localdomain podman[74771]: 2026-02-01 08:13:55.956595823 +0000 UTC m=+0.160418561 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ceilometer_agent_compute, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:13:55 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:13:55 np0005604215.localdomain podman[74770]: 2026-02-01 08:13:55.980214447 +0000 UTC m=+0.190028926 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, vcs-type=git, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.5, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=)
Feb 01 08:13:56 np0005604215.localdomain podman[74770]: 2026-02-01 08:13:56.014738738 +0000 UTC m=+0.224553177 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, distribution-scope=public, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
vendor=Red Hat, Inc., vcs-type=git, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 01 08:13:56 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:13:56 np0005604215.localdomain podman[74777]: 2026-02-01 08:13:56.03106312 +0000 UTC m=+0.231177990 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:13:56 np0005604215.localdomain podman[74777]: 2026-02-01 08:13:56.081412297 +0000 UTC m=+0.281527157 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:13:56 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:13:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:13:57 np0005604215.localdomain podman[74856]: 2026-02-01 08:13:57.865107352 +0000 UTC m=+0.081555914 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 01 08:13:58 np0005604215.localdomain sudo[74924]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlrayrfzszxvkzfunaldpfbrocbokpkr ; /usr/bin/python3
Feb 01 08:13:58 np0005604215.localdomain sudo[74924]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:13:58 np0005604215.localdomain python3[74926]: ansible-ansible.legacy.stat Invoked with path=/etc/puppet/hieradata/config_step.json follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:13:58 np0005604215.localdomain sudo[74924]: pam_unix(sudo:session): session closed for user root
Feb 01 08:13:58 np0005604215.localdomain podman[74856]: 2026-02-01 08:13:58.247665551 +0000 UTC m=+0.464114153 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:13:58 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:13:58 np0005604215.localdomain sudo[74970]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fpzjyyprymrepbostohkailbxmbfqsjz ; /usr/bin/python3
Feb 01 08:13:58 np0005604215.localdomain sudo[74970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:13:58 np0005604215.localdomain python3[74972]: ansible-ansible.legacy.copy Invoked with dest=/etc/puppet/hieradata/config_step.json force=True mode=0600 src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933637.8956897-112806-229797675245333/source _original_basename=tmpcshnf26x follow=False checksum=039e0b234f00fbd1242930f0d5dc67e8b4c067fe backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:13:58 np0005604215.localdomain sudo[74970]: pam_unix(sudo:session): session closed for user root
Feb 01 08:13:59 np0005604215.localdomain sudo[75000]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogxlyexdchqndezjoeytxyouipkkbgds ; /usr/bin/python3
Feb 01 08:13:59 np0005604215.localdomain sudo[75000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:13:59 np0005604215.localdomain python3[75002]: ansible-stat Invoked with path=/var/lib/tripleo-config/container-startup-config/step_5 follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:13:59 np0005604215.localdomain sudo[75000]: pam_unix(sudo:session): session closed for user root
Feb 01 08:13:59 np0005604215.localdomain sudo[75050]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzoyrxmmoctjlrvddyvybmazfdbkwnky ; /usr/bin/python3
Feb 01 08:14:00 np0005604215.localdomain sudo[75050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:14:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:14:00 np0005604215.localdomain systemd[1]: tmp-crun.i0BZhm.mount: Deactivated successfully.
Feb 01 08:14:00 np0005604215.localdomain podman[75053]: 2026-02-01 08:14:00.12231632 +0000 UTC m=+0.098951949 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510)
Feb 01 08:14:00 np0005604215.localdomain podman[75053]: 2026-02-01 08:14:00.178673279 +0000 UTC m=+0.155308908 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:14:00 np0005604215.localdomain podman[75070]: 2026-02-01 08:14:00.195731473 +0000 UTC m=+0.072672820 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:14:00 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:14:00 np0005604215.localdomain sudo[75050]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:00 np0005604215.localdomain podman[75070]: 2026-02-01 08:14:00.217698714 +0000 UTC m=+0.094640071 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, managed_by=tripleo_ansible, 
name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, tcib_managed=true, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, batch=17.1_20260112.1)
Feb 01 08:14:00 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:14:00 np0005604215.localdomain sudo[75116]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpzilhlhqeoouyujoitxyzeazfmqqoxa ; /usr/bin/python3
Feb 01 08:14:00 np0005604215.localdomain sudo[75116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:00 np0005604215.localdomain sudo[75116]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:01 np0005604215.localdomain sudo[75220]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdcrszovcnzpmxulpnpreakrkykervsl ; ANSIBLE_ASYNC_DIR=/tmp/.ansible_async /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933640.7088668-113062-14324234785392/async_wrapper.py 203038787111 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933640.7088668-113062-14324234785392/AnsiballZ_command.py _
Feb 01 08:14:01 np0005604215.localdomain sudo[75220]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 01 08:14:01 np0005604215.localdomain ansible-async_wrapper.py[75222]: Invoked with 203038787111 3600 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933640.7088668-113062-14324234785392/AnsiballZ_command.py _
Feb 01 08:14:01 np0005604215.localdomain ansible-async_wrapper.py[75225]: Starting module and watcher
Feb 01 08:14:01 np0005604215.localdomain ansible-async_wrapper.py[75225]: Start watching 75226 (3600)
Feb 01 08:14:01 np0005604215.localdomain ansible-async_wrapper.py[75226]: Start module (75226)
Feb 01 08:14:01 np0005604215.localdomain ansible-async_wrapper.py[75222]: Return async_wrapper task started.
Feb 01 08:14:01 np0005604215.localdomain sudo[75220]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:01 np0005604215.localdomain sudo[75241]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kubwjtzfssauekjrccdgsmjzrpkmchok ; /usr/bin/python3
Feb 01 08:14:01 np0005604215.localdomain sudo[75241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:01 np0005604215.localdomain python3[75246]: ansible-ansible.legacy.async_status Invoked with jid=203038787111.75222 mode=status _async_dir=/tmp/.ansible_async
Feb 01 08:14:01 np0005604215.localdomain sudo[75241]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:04 np0005604215.localdomain puppet-user[75245]: Warning: /etc/puppet/hiera.yaml: Use of 'hiera.yaml' version 3 is deprecated. It should be converted to version 5
Feb 01 08:14:04 np0005604215.localdomain puppet-user[75245]:    (file: /etc/puppet/hiera.yaml)
Feb 01 08:14:04 np0005604215.localdomain puppet-user[75245]: Warning: Undefined variable '::deploy_config_name';
Feb 01 08:14:04 np0005604215.localdomain puppet-user[75245]:    (file & line not available)
Feb 01 08:14:04 np0005604215.localdomain puppet-user[75245]: Warning: The function 'hiera' is deprecated in favor of using 'lookup'. See https://puppet.com/docs/puppet/7.10/deprecated_language.html
Feb 01 08:14:04 np0005604215.localdomain puppet-user[75245]:    (file & line not available)
Feb 01 08:14:04 np0005604215.localdomain puppet-user[75245]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/profile/base/database/mysql/client.pp, line: 89, column: 8)
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Warning: This method is deprecated, please use match expressions with Stdlib::Compat::String instead. They are described at https://docs.puppet.com/puppet/latest/reference/lang_data_type.html#match-expressions. at ["/etc/puppet/modules/snmp/manifests/params.pp", 310]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:                     with Stdlib::Compat::Bool. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 358]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:                     with Stdlib::Compat::Array. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 367]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:                     with Stdlib::Compat::String. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 382]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:                     with Stdlib::Compat::Numeric. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 388]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Warning: This method is deprecated, please use the stdlib validate_legacy function,
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:                     with Pattern[]. There is further documentation for validate_legacy function in the README. at ["/etc/puppet/modules/snmp/manifests/init.pp", 393]:["/var/lib/tripleo-config/puppet_step_config.pp", 4]
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:    (location: /etc/puppet/modules/stdlib/lib/puppet/functions/deprecation.rb:34:in `deprecation')
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Warning: Unknown variable: '::deployment_type'. (file: /etc/puppet/modules/tripleo/manifests/packages.pp, line: 39, column: 69)
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Notice: Compiled catalog for np0005604215.localdomain in environment production in 0.22 seconds
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Notice: Applied catalog in 0.23 seconds
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Application:
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:    Initial environment: production
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:    Converged environment: production
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:          Run mode: user
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Changes:
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Events:
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Resources:
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:             Total: 19
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Time:
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:          Schedule: 0.00
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:           Package: 0.00
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:              Exec: 0.01
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:            Augeas: 0.01
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:              File: 0.02
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:           Service: 0.04
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:    Transaction evaluation: 0.22
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:    Catalog application: 0.23
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:    Config retrieval: 0.28
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:          Last run: 1769933645
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:        Filebucket: 0.00
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:             Total: 0.23
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]: Version:
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:            Config: 1769933644
Feb 01 08:14:05 np0005604215.localdomain puppet-user[75245]:            Puppet: 7.10.0
Feb 01 08:14:05 np0005604215.localdomain ansible-async_wrapper.py[75226]: Module complete (75226)
Feb 01 08:14:06 np0005604215.localdomain ansible-async_wrapper.py[75225]: Done in kid B.
Feb 01 08:14:10 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:14:10 np0005604215.localdomain recover_tripleo_nova_virtqemud[75370]: 62016
Feb 01 08:14:10 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:14:10 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:14:11 np0005604215.localdomain sudo[75384]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwvgvkkslfleetwbdsckthflabaumkyu ; /usr/bin/python3
Feb 01 08:14:11 np0005604215.localdomain sudo[75384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:11 np0005604215.localdomain python3[75386]: ansible-ansible.legacy.async_status Invoked with jid=203038787111.75222 mode=status _async_dir=/tmp/.ansible_async
Feb 01 08:14:11 np0005604215.localdomain sudo[75384]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:12 np0005604215.localdomain sudo[75400]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wfjjboequlmunbkptlrvhtxbbhsspcha ; /usr/bin/python3
Feb 01 08:14:12 np0005604215.localdomain sudo[75400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:14:12 np0005604215.localdomain podman[75403]: 2026-02-01 08:14:12.6043233 +0000 UTC m=+0.084504187 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, release=1766032510, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr)
Feb 01 08:14:12 np0005604215.localdomain python3[75402]: ansible-file Invoked with path=/var/lib/container-puppet/puppetlabs state=directory setype=svirt_sandbox_file_t selevel=s0 recurse=True force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 08:14:12 np0005604215.localdomain sudo[75400]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:12 np0005604215.localdomain podman[75403]: 2026-02-01 08:14:12.76852915 +0000 UTC m=+0.248709967 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, 
managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:14:12 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:14:12 np0005604215.localdomain sudo[75444]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onwvehczeodkeyylmzkyrbsynfqgbytl ; /usr/bin/python3
Feb 01 08:14:12 np0005604215.localdomain sudo[75444]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:13 np0005604215.localdomain python3[75446]: ansible-stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:14:13 np0005604215.localdomain sudo[75444]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:13 np0005604215.localdomain sudo[75494]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecepaioxnohpxpmhmqubwhsunospjciy ; /usr/bin/python3
Feb 01 08:14:13 np0005604215.localdomain sudo[75494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:13 np0005604215.localdomain python3[75496]: ansible-ansible.legacy.stat Invoked with path=/var/lib/container-puppet/puppetlabs/facter.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:14:13 np0005604215.localdomain sudo[75494]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:13 np0005604215.localdomain sudo[75512]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tllqqbiugacvdyenhwrawbydplzluzce ; /usr/bin/python3
Feb 01 08:14:13 np0005604215.localdomain sudo[75512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:13 np0005604215.localdomain python3[75514]: ansible-ansible.legacy.file Invoked with setype=svirt_sandbox_file_t selevel=s0 dest=/var/lib/container-puppet/puppetlabs/facter.conf _original_basename=tmptozshck9 recurse=False state=file path=/var/lib/container-puppet/puppetlabs/facter.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 08:14:13 np0005604215.localdomain sudo[75512]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:14 np0005604215.localdomain sudo[75542]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcsivvrqwvlujeonynxecyrekuemtrha ; /usr/bin/python3
Feb 01 08:14:14 np0005604215.localdomain sudo[75542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:14 np0005604215.localdomain python3[75544]: ansible-file Invoked with path=/opt/puppetlabs/facter state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:14:14 np0005604215.localdomain sudo[75542]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:14 np0005604215.localdomain sudo[75558]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ultqxkocylagzvsmkrimsxkrncjeioee ; /usr/bin/python3
Feb 01 08:14:14 np0005604215.localdomain sudo[75558]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:14 np0005604215.localdomain sudo[75558]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:15 np0005604215.localdomain sudo[75647]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlbiugkyrkgjnetpzkolbucltdgrjiqr ; /usr/bin/python3
Feb 01 08:14:15 np0005604215.localdomain sudo[75647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:15 np0005604215.localdomain python3[75649]: ansible-ansible.posix.synchronize Invoked with src=/opt/puppetlabs/ dest=/var/lib/container-puppet/puppetlabs/ _local_rsync_path=rsync _local_rsync_password=NOT_LOGGING_PARAMETER rsync_path=None delete=False _substitute_controller=False archive=True checksum=False compress=True existing_only=False dirs=False copy_links=False set_remote_user=True rsync_timeout=0 rsync_opts=[] ssh_connection_multiplexing=False partial=False verify_host=False mode=push dest_port=None private_key=None recursive=None links=None perms=None times=None owner=None group=None ssh_args=None link_dest=None
Feb 01 08:14:15 np0005604215.localdomain sudo[75647]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:15 np0005604215.localdomain sudo[75666]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvnaengwyifsqitgouczanwbehffuuum ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:15 np0005604215.localdomain sudo[75666]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:16 np0005604215.localdomain python3[75668]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:14:16 np0005604215.localdomain sudo[75666]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:16 np0005604215.localdomain sudo[75682]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stsmqanhebhnhrewujsohzhjthjqxbss ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:16 np0005604215.localdomain sudo[75682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:16 np0005604215.localdomain sudo[75682]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:16 np0005604215.localdomain sudo[75698]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-crudokqfstfwgqmsizraukyvkmcczdum ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:16 np0005604215.localdomain sudo[75698]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:16 np0005604215.localdomain python3[75700]: ansible-stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:14:16 np0005604215.localdomain sudo[75698]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:17 np0005604215.localdomain sudo[75748]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehyjctqbixedxpghijwubcejqyhwjlme ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:17 np0005604215.localdomain sudo[75748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:17 np0005604215.localdomain python3[75750]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-container-shutdown follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:14:17 np0005604215.localdomain sudo[75748]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:17 np0005604215.localdomain sudo[75766]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fltsgtkdlyurwwhisryjsodqjpdxoevm ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:17 np0005604215.localdomain sudo[75766]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:17 np0005604215.localdomain python3[75768]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-container-shutdown _original_basename=tripleo-container-shutdown recurse=False state=file path=/usr/libexec/tripleo-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:14:17 np0005604215.localdomain sudo[75766]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:18 np0005604215.localdomain sudo[75828]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-acnwfacvsidcxbsaoprknycohlojajgp ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:18 np0005604215.localdomain sudo[75828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:18 np0005604215.localdomain python3[75830]: ansible-ansible.legacy.stat Invoked with path=/usr/libexec/tripleo-start-podman-container follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:14:18 np0005604215.localdomain sudo[75828]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:18 np0005604215.localdomain sudo[75846]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnyggtdxaixhncpkzcjovtkbjarjhgbn ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:18 np0005604215.localdomain sudo[75846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:18 np0005604215.localdomain python3[75848]: ansible-ansible.legacy.file Invoked with mode=0700 owner=root group=root dest=/usr/libexec/tripleo-start-podman-container _original_basename=tripleo-start-podman-container recurse=False state=file path=/usr/libexec/tripleo-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:14:18 np0005604215.localdomain sudo[75846]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:18 np0005604215.localdomain sudo[75908]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icenvsiynpmyoktdwsyxijntarioutwg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:18 np0005604215.localdomain sudo[75908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:19 np0005604215.localdomain python3[75910]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/tripleo-container-shutdown.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:14:19 np0005604215.localdomain sudo[75908]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:19 np0005604215.localdomain sudo[75926]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcgogbytcqsrgszdndwrmriplfearqpe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:19 np0005604215.localdomain sudo[75926]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:19 np0005604215.localdomain python3[75928]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/tripleo-container-shutdown.service _original_basename=tripleo-container-shutdown-service recurse=False state=file path=/usr/lib/systemd/system/tripleo-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:14:19 np0005604215.localdomain sudo[75926]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:19 np0005604215.localdomain sudo[75988]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbcleqwxuuzdtsfenirxkmwgvgfzmorl ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:19 np0005604215.localdomain sudo[75988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:19 np0005604215.localdomain python3[75990]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:14:19 np0005604215.localdomain sudo[75988]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:20 np0005604215.localdomain sudo[76006]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gisriomdzdkwgttonwyxzizcjianmhlz ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:20 np0005604215.localdomain sudo[76006]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:20 np0005604215.localdomain python3[76008]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset _original_basename=91-tripleo-container-shutdown-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-tripleo-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:14:20 np0005604215.localdomain sudo[76006]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:20 np0005604215.localdomain sudo[76036]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avdmkxlpdvacxvdjgcryqgwrrvuumhpy ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:20 np0005604215.localdomain sudo[76036]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:20 np0005604215.localdomain python3[76038]: ansible-systemd Invoked with name=tripleo-container-shutdown state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:14:20 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:14:20 np0005604215.localdomain systemd-rc-local-generator[76061]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:14:20 np0005604215.localdomain systemd-sysv-generator[76065]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:14:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:14:21 np0005604215.localdomain sudo[76036]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:21 np0005604215.localdomain sudo[76122]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcpzclhspvwlvinoegezxemhscoyoute ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:21 np0005604215.localdomain sudo[76122]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:21 np0005604215.localdomain python3[76124]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system/netns-placeholder.service follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:14:21 np0005604215.localdomain sudo[76122]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:21 np0005604215.localdomain sudo[76140]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aiqjrfmxgfpxtgromrhlstcsallrtlax ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:21 np0005604215.localdomain sudo[76140]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:21 np0005604215.localdomain python3[76142]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/usr/lib/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:14:21 np0005604215.localdomain sudo[76140]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:22 np0005604215.localdomain sudo[76202]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwgyrjpaxiyisyjhzrkmicjskxeurtjf ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:22 np0005604215.localdomain sudo[76202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:22 np0005604215.localdomain python3[76204]: ansible-ansible.legacy.stat Invoked with path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Feb 01 08:14:22 np0005604215.localdomain sudo[76202]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:22 np0005604215.localdomain sudo[76220]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxcayrnhupkrvnlywnrumejwmnvzgoql ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:22 np0005604215.localdomain sudo[76220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:22 np0005604215.localdomain python3[76222]: ansible-ansible.legacy.file Invoked with mode=0644 owner=root group=root dest=/usr/lib/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/usr/lib/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:14:22 np0005604215.localdomain sudo[76220]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:22 np0005604215.localdomain sudo[76250]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfbrfbuueedlqsyrzewoepkiwegturvs ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:22 np0005604215.localdomain sudo[76250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:23 np0005604215.localdomain python3[76252]: ansible-systemd Invoked with name=netns-placeholder state=started enabled=True daemon_reload=True daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:14:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:14:23 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:14:23 np0005604215.localdomain systemd-rc-local-generator[76288]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:14:23 np0005604215.localdomain systemd-sysv-generator[76294]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:14:23 np0005604215.localdomain podman[76254]: 2026-02-01 08:14:23.200459242 +0000 UTC m=+0.110745695 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:14:23 np0005604215.localdomain podman[76254]: 2026-02-01 08:14:23.214568982 +0000 UTC m=+0.124855405 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible)
Feb 01 08:14:23 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:14:23 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:14:23 np0005604215.localdomain systemd[1]: Starting Create netns directory...
Feb 01 08:14:23 np0005604215.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 01 08:14:23 np0005604215.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 01 08:14:23 np0005604215.localdomain systemd[1]: Finished Create netns directory.
Feb 01 08:14:23 np0005604215.localdomain sudo[76250]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:23 np0005604215.localdomain sudo[76325]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bjouqxfcctxedermiubcpwfzhqcxjchb ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:23 np0005604215.localdomain sudo[76325]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:24 np0005604215.localdomain python3[76327]: ansible-container_puppet_config Invoked with update_config_hash_only=True no_archive=True check_mode=False config_vol_prefix=/var/lib/config-data debug=False net_host=True puppet_config= short_hostname= step=6
Feb 01 08:14:24 np0005604215.localdomain sudo[76325]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:24 np0005604215.localdomain sudo[76341]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-idtuhlkuxiuxztqckbrngaxsskqaufvg ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:24 np0005604215.localdomain sudo[76341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:24 np0005604215.localdomain sudo[76341]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:25 np0005604215.localdomain sudo[76383]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mioxotnhzheopelkjtclprqvjynofaip ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:14:25 np0005604215.localdomain sudo[76383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:14:26 np0005604215.localdomain python3[76385]: ansible-tripleo_container_manage Invoked with config_id=tripleo_step5 config_dir=/var/lib/tripleo-config/container-startup-config/step_5 config_patterns=*.json config_overrides={} concurrency=5 log_base_path=/var/log/containers/stdouts debug=False
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:14:26 np0005604215.localdomain podman[76394]: 2026-02-01 08:14:26.202980687 +0000 UTC m=+0.085309463 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, io.buildah.version=1.41.5)
Feb 01 08:14:26 np0005604215.localdomain podman[76388]: 2026-02-01 08:14:26.26197977 +0000 UTC m=+0.156094463 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=)
Feb 01 08:14:26 np0005604215.localdomain podman[76388]: 2026-02-01 08:14:26.297313998 +0000 UTC m=+0.191428651 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git)
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:14:26 np0005604215.localdomain podman[76390]: 2026-02-01 08:14:26.319899288 +0000 UTC m=+0.208634200 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:14:26 np0005604215.localdomain podman[76390]: 2026-02-01 08:14:26.352499738 +0000 UTC m=+0.241234640 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:14:26 np0005604215.localdomain podman[76389]: 2026-02-01 08:14:26.375848884 +0000 UTC m=+0.267799268 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5)
Feb 01 08:14:26 np0005604215.localdomain podman[76394]: 2026-02-01 08:14:26.386452083 +0000 UTC m=+0.268780919 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi)
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:14:26 np0005604215.localdomain podman[76389]: 2026-02-01 08:14:26.413522916 +0000 UTC m=+0.305473220 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.buildah.version=1.41.5, container_name=iscsid, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:14:26 np0005604215.localdomain podman[76508]: 2026-02-01 08:14:26.574391661 +0000 UTC m=+0.082123323 container create 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public)
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Started libpod-conmon-1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.scope.
Feb 01 08:14:26 np0005604215.localdomain podman[76508]: 2026-02-01 08:14:26.535136948 +0000 UTC m=+0.042868580 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:14:26 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66409f2cae0cc3fdf46266cf7a9b4ef7f2208d64cf24e912c16b5d672be00b92/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 01 08:14:26 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66409f2cae0cc3fdf46266cf7a9b4ef7f2208d64cf24e912c16b5d672be00b92/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 08:14:26 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66409f2cae0cc3fdf46266cf7a9b4ef7f2208d64cf24e912c16b5d672be00b92/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:14:26 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66409f2cae0cc3fdf46266cf7a9b4ef7f2208d64cf24e912c16b5d672be00b92/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:14:26 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/66409f2cae0cc3fdf46266cf7a9b4ef7f2208d64cf24e912c16b5d672be00b92/merged/var/lib/kolla/config_files/src-ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:14:26 np0005604215.localdomain podman[76508]: 2026-02-01 08:14:26.678899816 +0000 UTC m=+0.186631468 container init 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, io.buildah.version=1.41.5)
Feb 01 08:14:26 np0005604215.localdomain sudo[76529]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:14:26 np0005604215.localdomain podman[76508]: 2026-02-01 08:14:26.722484526 +0000 UTC m=+0.230216178 container start 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, release=1766032510, version=17.1.13, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public)
Feb 01 08:14:26 np0005604215.localdomain systemd-logind[761]: Existing logind session ID 28 used by new audit session, ignoring.
Feb 01 08:14:26 np0005604215.localdomain python3[76385]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_compute --conmon-pidfile /run/nova_compute.pid --detach=True --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env LIBGUESTFS_BACKEND=direct --env TRIPLEO_CONFIG_HASH=848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7 --healthcheck-command /openstack/healthcheck 5672 --ipc host --label config_id=tripleo_step5 --label container_name=nova_compute --label managed_by=tripleo_ansible --label config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_compute.log --network host --privileged=True --ulimit nofile=131072 --ulimit memlock=67108864 --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/log/containers/nova:/var/log/nova --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro --volume /var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro --volume /var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z --volume /dev:/dev --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /run/nova:/run/nova:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/libvirt:/var/lib/libvirt:shared --volume /sys/class/net:/sys/class/net --volume /sys/bus/pci:/sys/bus/pci --volume /boot:/boot:ro --volume /var/lib/nova:/var/lib/nova:shared registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 01 08:14:26 np0005604215.localdomain podman[76530]: 2026-02-01 08:14:26.826666902 +0000 UTC m=+0.094530148 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, managed_by=tripleo_ansible, container_name=nova_compute, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:14:26 np0005604215.localdomain podman[76530]: 2026-02-01 08:14:26.888754643 +0000 UTC m=+0.156617969 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute)
Feb 01 08:14:26 np0005604215.localdomain podman[76530]: unhealthy
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Queued start job for default target Main User Target.
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Created slice User Application Slice.
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Started Daily Cleanup of User's Temporary Directories.
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Reached target Paths.
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Reached target Timers.
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Starting D-Bus User Message Bus Socket...
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Starting Create User's Volatile Files and Directories...
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Listening on D-Bus User Message Bus Socket.
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Reached target Sockets.
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Finished Create User's Volatile Files and Directories.
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Reached target Basic System.
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Reached target Main User Target.
Feb 01 08:14:26 np0005604215.localdomain systemd[76547]: Startup finished in 138ms.
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Started User Manager for UID 0.
Feb 01 08:14:26 np0005604215.localdomain systemd[1]: Started Session c10 of User root.
Feb 01 08:14:26 np0005604215.localdomain sudo[76529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Feb 01 08:14:27 np0005604215.localdomain sudo[76529]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:27 np0005604215.localdomain systemd[1]: session-c10.scope: Deactivated successfully.
Feb 01 08:14:27 np0005604215.localdomain podman[76629]: 2026-02-01 08:14:27.241569553 +0000 UTC m=+0.077995940 container create 09302786a32de733d4c559cf633da3e5eeab3ee808df8ed54a333145ffb0dfe1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, architecture=x86_64)
Feb 01 08:14:27 np0005604215.localdomain systemd[1]: Started libpod-conmon-09302786a32de733d4c559cf633da3e5eeab3ee808df8ed54a333145ffb0dfe1.scope.
Feb 01 08:14:27 np0005604215.localdomain podman[76629]: 2026-02-01 08:14:27.198384144 +0000 UTC m=+0.034810511 image pull  registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 01 08:14:27 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:14:27 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7de42bd2ef28ab6d43ca2881ed0bac026c1f46d7bf355b9a366b5c9ec93a4c0/merged/container-config-scripts supports timestamps until 2038 (0x7fffffff)
Feb 01 08:14:27 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f7de42bd2ef28ab6d43ca2881ed0bac026c1f46d7bf355b9a366b5c9ec93a4c0/merged/var/log/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 08:14:27 np0005604215.localdomain podman[76629]: 2026-02-01 08:14:27.325240074 +0000 UTC m=+0.161666461 container init 09302786a32de733d4c559cf633da3e5eeab3ee808df8ed54a333145ffb0dfe1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, container_name=nova_wait_for_compute_service, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:14:27 np0005604215.localdomain podman[76629]: 2026-02-01 08:14:27.341096179 +0000 UTC m=+0.177522566 container start 09302786a32de733d4c559cf633da3e5eeab3ee808df8ed54a333145ffb0dfe1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, 
io.buildah.version=1.41.5, container_name=nova_wait_for_compute_service, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team)
Feb 01 08:14:27 np0005604215.localdomain podman[76629]: 2026-02-01 08:14:27.341521613 +0000 UTC m=+0.177948000 container attach 09302786a32de733d4c559cf633da3e5eeab3ee808df8ed54a333145ffb0dfe1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_wait_for_compute_service, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Feb 01 08:14:27 np0005604215.localdomain sudo[76648]:     nova : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 08:14:27 np0005604215.localdomain sudo[76648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42436)
Feb 01 08:14:27 np0005604215.localdomain sudo[76648]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:14:28 np0005604215.localdomain podman[76652]: 2026-02-01 08:14:28.870819259 +0000 UTC m=+0.084993833 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:14:29 np0005604215.localdomain podman[76652]: 2026-02-01 08:14:29.238715071 +0000 UTC m=+0.452889645 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step4, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:14:29 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:14:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:14:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:14:30 np0005604215.localdomain systemd[1]: tmp-crun.xB27rr.mount: Deactivated successfully.
Feb 01 08:14:30 np0005604215.localdomain podman[76676]: 2026-02-01 08:14:30.883110742 +0000 UTC m=+0.097376479 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:14:30 np0005604215.localdomain systemd[1]: tmp-crun.O9YqNO.mount: Deactivated successfully.
Feb 01 08:14:30 np0005604215.localdomain podman[76677]: 2026-02-01 08:14:30.937454276 +0000 UTC m=+0.149676867 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 01 08:14:30 np0005604215.localdomain podman[76676]: 2026-02-01 08:14:30.944678737 +0000 UTC m=+0.158944474 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, batch=17.1_20260112.1, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 01 08:14:30 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:14:30 np0005604215.localdomain podman[76677]: 2026-02-01 08:14:30.973785976 +0000 UTC m=+0.186008577 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, 
io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13)
Feb 01 08:14:30 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:14:37 np0005604215.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Activating special unit Exit the Session...
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Stopped target Main User Target.
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Stopped target Basic System.
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Stopped target Paths.
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Stopped target Sockets.
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Stopped target Timers.
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Closed D-Bus User Message Bus Socket.
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Stopped Create User's Volatile Files and Directories.
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Removed slice User Application Slice.
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Reached target Shutdown.
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Finished Exit the Session.
Feb 01 08:14:37 np0005604215.localdomain systemd[76547]: Reached target Exit the Session.
Feb 01 08:14:37 np0005604215.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 01 08:14:37 np0005604215.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 01 08:14:37 np0005604215.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 01 08:14:37 np0005604215.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 01 08:14:37 np0005604215.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 01 08:14:37 np0005604215.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 01 08:14:37 np0005604215.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 01 08:14:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:14:43 np0005604215.localdomain podman[76724]: 2026-02-01 08:14:43.862129264 +0000 UTC m=+0.076645788 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, config_id=tripleo_step1, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 01 08:14:44 np0005604215.localdomain podman[76724]: 2026-02-01 08:14:44.08983513 +0000 UTC m=+0.304351734 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z)
Feb 01 08:14:44 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:14:45 np0005604215.localdomain sudo[76755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:14:45 np0005604215.localdomain sudo[76755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:14:45 np0005604215.localdomain sudo[76755]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:45 np0005604215.localdomain sudo[76770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:14:45 np0005604215.localdomain sudo[76770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:14:46 np0005604215.localdomain sudo[76770]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:46 np0005604215.localdomain sudo[76816]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:14:46 np0005604215.localdomain sudo[76816]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:14:46 np0005604215.localdomain sudo[76816]: pam_unix(sudo:session): session closed for user root
Feb 01 08:14:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:14:53 np0005604215.localdomain podman[76831]: 2026-02-01 08:14:53.891544329 +0000 UTC m=+0.101945196 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-collectd-container, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:14:53 np0005604215.localdomain podman[76831]: 2026-02-01 08:14:53.908637784 +0000 UTC m=+0.119038661 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, distribution-scope=public, url=https://www.redhat.com, container_name=collectd)
Feb 01 08:14:53 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:14:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:14:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:14:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:14:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:14:56 np0005604215.localdomain systemd[1]: tmp-crun.XpUcTi.mount: Deactivated successfully.
Feb 01 08:14:56 np0005604215.localdomain podman[76851]: 2026-02-01 08:14:56.868638211 +0000 UTC m=+0.083244688 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, 
maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron)
Feb 01 08:14:56 np0005604215.localdomain podman[76853]: 2026-02-01 08:14:56.924617968 +0000 UTC m=+0.132954045 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, 
distribution-scope=public, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1)
Feb 01 08:14:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:14:56 np0005604215.localdomain podman[76854]: 2026-02-01 08:14:56.892987708 +0000 UTC m=+0.098883237 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Feb 01 08:14:56 np0005604215.localdomain podman[76851]: 2026-02-01 08:14:56.953556311 +0000 UTC m=+0.168162728 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, batch=17.1_20260112.1, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:14:56 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:14:56 np0005604215.localdomain podman[76854]: 2026-02-01 08:14:56.979589282 +0000 UTC m=+0.185484791 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.)
Feb 01 08:14:56 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:14:57 np0005604215.localdomain podman[76853]: 2026-02-01 08:14:57.009515997 +0000 UTC m=+0.217852074 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.openshift.expose-services=)
Feb 01 08:14:57 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:14:57 np0005604215.localdomain podman[76924]: 2026-02-01 08:14:57.080944086 +0000 UTC m=+0.139311356 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.openshift.expose-services=, container_name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:14:57 np0005604215.localdomain podman[76852]: 2026-02-01 08:14:57.136002264 +0000 UTC m=+0.344247428 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, release=1766032510, config_id=tripleo_step3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, io.buildah.version=1.41.5, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:14:57 np0005604215.localdomain podman[76924]: 2026-02-01 08:14:57.140670163 +0000 UTC m=+0.199037433 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=nova_compute, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:14:57 np0005604215.localdomain podman[76924]: unhealthy
Feb 01 08:14:57 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:14:57 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:14:57 np0005604215.localdomain podman[76852]: 2026-02-01 08:14:57.169566385 +0000 UTC m=+0.377811499 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_id=tripleo_step3, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible)
Feb 01 08:14:57 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:14:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:14:59 np0005604215.localdomain podman[76963]: 2026-02-01 08:14:59.851754766 +0000 UTC m=+0.067663520 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:15:00 np0005604215.localdomain podman[76963]: 2026-02-01 08:15:00.172629787 +0000 UTC m=+0.388538501 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:15:00 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:15:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:15:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:15:01 np0005604215.localdomain podman[76986]: 2026-02-01 08:15:01.875835205 +0000 UTC m=+0.086647227 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, version=17.1.13, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 01 08:15:01 np0005604215.localdomain podman[76986]: 2026-02-01 08:15:01.931246483 +0000 UTC m=+0.142058525 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.expose-services=, tcib_managed=true)
Feb 01 08:15:01 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:15:01 np0005604215.localdomain podman[76987]: 2026-02-01 08:15:01.93272721 +0000 UTC m=+0.141552498 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team)
Feb 01 08:15:02 np0005604215.localdomain podman[76987]: 2026-02-01 08:15:02.011585627 +0000 UTC m=+0.220410895 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true)
Feb 01 08:15:02 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:15:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:15:14 np0005604215.localdomain podman[77034]: 2026-02-01 08:15:14.866138118 +0000 UTC m=+0.082875597 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1)
Feb 01 08:15:15 np0005604215.localdomain podman[77034]: 2026-02-01 08:15:15.054953763 +0000 UTC m=+0.271691262 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, container_name=metrics_qdr)
Feb 01 08:15:15 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:15:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:15:24 np0005604215.localdomain podman[77064]: 2026-02-01 08:15:24.848186321 +0000 UTC m=+0.067193176 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, com.redhat.component=openstack-collectd-container, container_name=collectd, managed_by=tripleo_ansible, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:15:24 np0005604215.localdomain podman[77064]: 2026-02-01 08:15:24.860840415 +0000 UTC m=+0.079847290 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, build-date=2026-01-12T22:10:15Z, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:15:24 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:15:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:15:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:15:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:15:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:15:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:15:27 np0005604215.localdomain systemd[1]: tmp-crun.Dl3v5J.mount: Deactivated successfully.
Feb 01 08:15:27 np0005604215.localdomain podman[77088]: 2026-02-01 08:15:27.883228013 +0000 UTC m=+0.094242600 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, container_name=iscsid, io.openshift.expose-services=, vendor=Red Hat, Inc., 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, vcs-type=git, tcib_managed=true, config_id=tripleo_step3, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:15:27 np0005604215.localdomain podman[77086]: 2026-02-01 08:15:27.913668034 +0000 UTC m=+0.127665836 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, version=17.1.13, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.openshift.expose-services=)
Feb 01 08:15:27 np0005604215.localdomain podman[77088]: 2026-02-01 08:15:27.9217102 +0000 UTC m=+0.132724777 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com)
Feb 01 08:15:27 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:15:27 np0005604215.localdomain podman[77090]: 2026-02-01 08:15:27.983843854 +0000 UTC m=+0.193876780 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:15:28 np0005604215.localdomain podman[77090]: 2026-02-01 08:15:28.012433166 +0000 UTC m=+0.222466042 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z)
Feb 01 08:15:28 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:15:28 np0005604215.localdomain podman[77086]: 2026-02-01 08:15:28.051705169 +0000 UTC m=+0.265702991 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, managed_by=tripleo_ansible)
Feb 01 08:15:28 np0005604215.localdomain podman[77087]: 2026-02-01 08:15:27.848620448 +0000 UTC m=+0.062293200 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=starting, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:15:28 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:15:28 np0005604215.localdomain podman[77089]: 2026-02-01 08:15:28.100396303 +0000 UTC m=+0.311205363 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, description=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 01 08:15:28 np0005604215.localdomain podman[77089]: 2026-02-01 08:15:28.134723968 +0000 UTC m=+0.345533058 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:15:28 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:15:28 np0005604215.localdomain podman[77087]: 2026-02-01 08:15:28.184788306 +0000 UTC m=+0.398461138 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., container_name=nova_compute, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13)
Feb 01 08:15:28 np0005604215.localdomain podman[77087]: unhealthy
Feb 01 08:15:28 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:15:28 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:15:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:15:30 np0005604215.localdomain podman[77195]: 2026-02-01 08:15:30.867705551 +0000 UTC m=+0.084296021 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat 
OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., batch=17.1_20260112.1)
Feb 01 08:15:31 np0005604215.localdomain podman[77195]: 2026-02-01 08:15:31.197792876 +0000 UTC m=+0.414383386 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, architecture=x86_64, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, 
distribution-scope=public, io.openshift.expose-services=, version=17.1.13, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:15:31 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:15:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:15:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:15:32 np0005604215.localdomain systemd[1]: tmp-crun.xV0C4X.mount: Deactivated successfully.
Feb 01 08:15:32 np0005604215.localdomain podman[77218]: 2026-02-01 08:15:32.873406732 +0000 UTC m=+0.086167161 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, tcib_managed=true, architecture=x86_64)
Feb 01 08:15:32 np0005604215.localdomain podman[77218]: 2026-02-01 08:15:32.917095987 +0000 UTC m=+0.129856366 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, release=1766032510, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 01 08:15:32 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:15:32 np0005604215.localdomain podman[77219]: 2026-02-01 08:15:32.926888309 +0000 UTC m=+0.136012181 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, architecture=x86_64, tcib_managed=true, release=1766032510, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
url=https://www.redhat.com, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller)
Feb 01 08:15:32 np0005604215.localdomain podman[77219]: 2026-02-01 08:15:32.955701618 +0000 UTC m=+0.164825510 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 01 08:15:32 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:15:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:15:45 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:15:45 np0005604215.localdomain recover_tripleo_nova_virtqemud[77275]: 62016
Feb 01 08:15:45 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:15:45 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:15:45 np0005604215.localdomain podman[77268]: 2026-02-01 08:15:45.858710475 +0000 UTC m=+0.075840962 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd)
Feb 01 08:15:46 np0005604215.localdomain podman[77268]: 2026-02-01 08:15:46.029386412 +0000 UTC m=+0.246516929 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step1, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd)
Feb 01 08:15:46 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:15:47 np0005604215.localdomain sudo[77301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:15:47 np0005604215.localdomain sudo[77301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:15:47 np0005604215.localdomain sudo[77301]: pam_unix(sudo:session): session closed for user root
Feb 01 08:15:47 np0005604215.localdomain sudo[77316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:15:47 np0005604215.localdomain sudo[77316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:15:47 np0005604215.localdomain sudo[77316]: pam_unix(sudo:session): session closed for user root
Feb 01 08:15:48 np0005604215.localdomain sudo[77364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:15:48 np0005604215.localdomain sudo[77364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:15:48 np0005604215.localdomain sudo[77364]: pam_unix(sudo:session): session closed for user root
Feb 01 08:15:50 np0005604215.localdomain sshd[35456]: Received disconnect from 192.168.122.100 port 49824:11: disconnected by user
Feb 01 08:15:50 np0005604215.localdomain sshd[35456]: Disconnected from user zuul 192.168.122.100 port 49824
Feb 01 08:15:50 np0005604215.localdomain sshd[35453]: pam_unix(sshd:session): session closed for user zuul
Feb 01 08:15:50 np0005604215.localdomain systemd[1]: session-27.scope: Deactivated successfully.
Feb 01 08:15:50 np0005604215.localdomain systemd[1]: session-27.scope: Consumed 2.982s CPU time.
Feb 01 08:15:50 np0005604215.localdomain systemd-logind[761]: Session 27 logged out. Waiting for processes to exit.
Feb 01 08:15:50 np0005604215.localdomain systemd-logind[761]: Removed session 27.
Feb 01 08:15:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:15:55 np0005604215.localdomain systemd[1]: tmp-crun.wVR3xq.mount: Deactivated successfully.
Feb 01 08:15:55 np0005604215.localdomain podman[77379]: 2026-02-01 08:15:55.899852944 +0000 UTC m=+0.108195735 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, 
container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step3, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:15:55 np0005604215.localdomain podman[77379]: 2026-02-01 08:15:55.912072383 +0000 UTC m=+0.120415174 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, 
vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container)
Feb 01 08:15:55 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:15:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:15:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:15:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:15:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:15:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:15:58 np0005604215.localdomain podman[77401]: 2026-02-01 08:15:58.868955592 +0000 UTC m=+0.080992096 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true)
Feb 01 08:15:58 np0005604215.localdomain systemd[1]: tmp-crun.0wuDNM.mount: Deactivated successfully.
Feb 01 08:15:58 np0005604215.localdomain podman[77413]: 2026-02-01 08:15:58.888166874 +0000 UTC m=+0.087559455 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, container_name=ceilometer_agent_ipmi, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Feb 01 08:15:58 np0005604215.localdomain podman[77401]: 2026-02-01 08:15:58.921731465 +0000 UTC m=+0.133768009 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute)
Feb 01 08:15:58 np0005604215.localdomain systemd[1]: tmp-crun.fUJiw5.mount: Deactivated successfully.
Feb 01 08:15:58 np0005604215.localdomain podman[77401]: unhealthy
Feb 01 08:15:58 np0005604215.localdomain podman[77402]: 2026-02-01 08:15:58.933791851 +0000 UTC m=+0.142368195 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, vcs-type=git, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 01 08:15:58 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:15:58 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:15:58 np0005604215.localdomain podman[77413]: 2026-02-01 08:15:58.971622297 +0000 UTC m=+0.171014818 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:15:58 np0005604215.localdomain podman[77402]: 2026-02-01 08:15:58.971911147 +0000 UTC m=+0.180487491 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, tcib_managed=true, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 01 08:15:58 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:15:59 np0005604215.localdomain podman[77400]: 2026-02-01 08:15:58.976341808 +0000 UTC m=+0.191752501 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, name=rhosp-rhel9/openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5)
Feb 01 08:15:59 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:15:59 np0005604215.localdomain podman[77404]: 2026-02-01 08:15:59.038944296 +0000 UTC m=+0.240101484 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, build-date=2026-01-12T23:07:47Z, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Feb 01 08:15:59 np0005604215.localdomain podman[77400]: 2026-02-01 08:15:59.05973147 +0000 UTC m=+0.275142183 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc.)
Feb 01 08:15:59 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:15:59 np0005604215.localdomain podman[77404]: 2026-02-01 08:15:59.07260074 +0000 UTC m=+0.273757888 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, tcib_managed=true, vcs-type=git)
Feb 01 08:15:59 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:16:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:16:01 np0005604215.localdomain podman[77516]: 2026-02-01 08:16:01.848139421 +0000 UTC m=+0.066977389 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, tcib_managed=true, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:16:02 np0005604215.localdomain podman[77516]: 2026-02-01 08:16:02.222750247 +0000 UTC m=+0.441588205 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=, vcs-type=git, container_name=nova_migration_target, config_id=tripleo_step4, managed_by=tripleo_ansible)
Feb 01 08:16:02 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:16:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:16:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:16:03 np0005604215.localdomain podman[77541]: 2026-02-01 08:16:03.878169989 +0000 UTC m=+0.089972172 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, release=1766032510)
Feb 01 08:16:03 np0005604215.localdomain podman[77541]: 2026-02-01 08:16:03.928562838 +0000 UTC m=+0.140365011 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:16:03 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:16:03 np0005604215.localdomain podman[77542]: 2026-02-01 08:16:03.932247735 +0000 UTC m=+0.141133175 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, version=17.1.13)
Feb 01 08:16:04 np0005604215.localdomain podman[77542]: 2026-02-01 08:16:04.017692601 +0000 UTC m=+0.226578011 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1)
Feb 01 08:16:04 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:16:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:16:16 np0005604215.localdomain podman[77590]: 2026-02-01 08:16:16.868880056 +0000 UTC m=+0.082324729 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, distribution-scope=public)
Feb 01 08:16:17 np0005604215.localdomain podman[77590]: 2026-02-01 08:16:17.068732084 +0000 UTC m=+0.282176807 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public)
Feb 01 08:16:17 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:16:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:16:26 np0005604215.localdomain podman[77619]: 2026-02-01 08:16:26.867988253 +0000 UTC m=+0.085574221 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, release=1766032510, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., container_name=collectd, io.buildah.version=1.41.5)
Feb 01 08:16:26 np0005604215.localdomain podman[77619]: 2026-02-01 08:16:26.882816046 +0000 UTC m=+0.100401994 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=collectd)
Feb 01 08:16:26 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:16:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:16:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:16:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:16:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:16:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:16:29 np0005604215.localdomain podman[77640]: 2026-02-01 08:16:29.885570718 +0000 UTC m=+0.098227676 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
vcs-type=git, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, release=1766032510, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 01 08:16:29 np0005604215.localdomain podman[77640]: 2026-02-01 08:16:29.894345888 +0000 UTC m=+0.107002856 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, container_name=logrotate_crond, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true)
Feb 01 08:16:29 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:16:29 np0005604215.localdomain systemd[1]: tmp-crun.KifgIF.mount: Deactivated successfully.
Feb 01 08:16:29 np0005604215.localdomain podman[77641]: 2026-02-01 08:16:29.946769261 +0000 UTC m=+0.155869805 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, config_id=tripleo_step5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:16:29 np0005604215.localdomain podman[77643]: 2026-02-01 08:16:29.998120351 +0000 UTC m=+0.194931433 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, version=17.1.13, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, vendor=Red Hat, Inc., 
name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 01 08:16:30 np0005604215.localdomain podman[77641]: 2026-02-01 08:16:30.033927233 +0000 UTC m=+0.243027787 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, container_name=nova_compute, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:16:30 np0005604215.localdomain podman[77641]: unhealthy
Feb 01 08:16:30 np0005604215.localdomain podman[77650]: 2026-02-01 08:16:30.04479615 +0000 UTC m=+0.241289142 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:16:30 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:16:30 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:16:30 np0005604215.localdomain podman[77650]: 2026-02-01 08:16:30.074575751 +0000 UTC m=+0.271068753 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:16:30 np0005604215.localdomain podman[77642]: 2026-02-01 08:16:30.085118767 +0000 UTC m=+0.292124645 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, release=1766032510, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:16:30 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:16:30 np0005604215.localdomain podman[77643]: 2026-02-01 08:16:30.106953773 +0000 UTC m=+0.303764835 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
distribution-scope=public, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true)
Feb 01 08:16:30 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:16:30 np0005604215.localdomain podman[77642]: 2026-02-01 08:16:30.120904079 +0000 UTC m=+0.327909957 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, container_name=iscsid, release=1766032510, maintainer=OpenStack TripleO Team)
Feb 01 08:16:30 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:16:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:16:32 np0005604215.localdomain podman[77754]: 2026-02-01 08:16:32.878840878 +0000 UTC m=+0.091976967 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, version=17.1.13, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1)
Feb 01 08:16:33 np0005604215.localdomain podman[77754]: 2026-02-01 08:16:33.250544791 +0000 UTC m=+0.463680860 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:16:33 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:16:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:16:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:16:34 np0005604215.localdomain podman[77778]: 2026-02-01 08:16:34.862528457 +0000 UTC m=+0.073618030 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:16:34 np0005604215.localdomain podman[77778]: 2026-02-01 08:16:34.903551956 +0000 UTC m=+0.114641489 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git)
Feb 01 08:16:34 np0005604215.localdomain podman[77779]: 2026-02-01 08:16:34.916362525 +0000 UTC m=+0.123952577 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:16:34 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:16:34 np0005604215.localdomain podman[77779]: 2026-02-01 08:16:34.938697767 +0000 UTC m=+0.146287849 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, config_id=tripleo_step4, managed_by=tripleo_ansible, 
container_name=ovn_controller, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com)
Feb 01 08:16:34 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:16:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:16:47 np0005604215.localdomain podman[77827]: 2026-02-01 08:16:47.875707047 +0000 UTC m=+0.084756226 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, version=17.1.13, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:16:48 np0005604215.localdomain podman[77827]: 2026-02-01 08:16:48.091138093 +0000 UTC m=+0.300187232 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, io.openshift.expose-services=, container_name=metrics_qdr, 
name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, distribution-scope=public, architecture=x86_64)
Feb 01 08:16:48 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:16:48 np0005604215.localdomain sudo[77856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:16:48 np0005604215.localdomain sudo[77856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:16:48 np0005604215.localdomain sudo[77856]: pam_unix(sudo:session): session closed for user root
Feb 01 08:16:48 np0005604215.localdomain sudo[77871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:16:48 np0005604215.localdomain sudo[77871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:16:49 np0005604215.localdomain sudo[77871]: pam_unix(sudo:session): session closed for user root
Feb 01 08:16:50 np0005604215.localdomain sudo[77918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:16:50 np0005604215.localdomain sudo[77918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:16:50 np0005604215.localdomain sudo[77918]: pam_unix(sudo:session): session closed for user root
Feb 01 08:16:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:16:57 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:16:57 np0005604215.localdomain recover_tripleo_nova_virtqemud[77938]: 62016
Feb 01 08:16:57 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:16:57 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:16:57 np0005604215.localdomain podman[77933]: 2026-02-01 08:16:57.893266623 +0000 UTC m=+0.104344991 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, release=1766032510, tcib_managed=true, container_name=collectd, batch=17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Feb 01 08:16:57 np0005604215.localdomain podman[77933]: 2026-02-01 08:16:57.931514914 +0000 UTC m=+0.142593292 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., container_name=collectd, version=17.1.13, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Feb 01 08:16:57 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:17:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:17:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:17:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:17:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:17:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:17:00 np0005604215.localdomain podman[77957]: 2026-02-01 08:17:00.870792099 +0000 UTC m=+0.083721403 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Feb 01 08:17:00 np0005604215.localdomain systemd[1]: tmp-crun.jwdHRO.mount: Deactivated successfully.
Feb 01 08:17:00 np0005604215.localdomain podman[77957]: 2026-02-01 08:17:00.922765318 +0000 UTC m=+0.135694602 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, container_name=nova_compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 01 08:17:00 np0005604215.localdomain podman[77957]: unhealthy
Feb 01 08:17:00 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:17:00 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:17:00 np0005604215.localdomain podman[77956]: 2026-02-01 08:17:00.925671241 +0000 UTC m=+0.139459192 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, container_name=logrotate_crond, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, 
distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, release=1766032510)
Feb 01 08:17:01 np0005604215.localdomain podman[77959]: 2026-02-01 08:17:00.977642209 +0000 UTC m=+0.183495326 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, version=17.1.13)
Feb 01 08:17:01 np0005604215.localdomain podman[77958]: 2026-02-01 08:17:01.037570422 +0000 UTC m=+0.244188485 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3)
Feb 01 08:17:01 np0005604215.localdomain podman[77958]: 2026-02-01 08:17:01.051652012 +0000 UTC m=+0.258270065 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, 
url=https://www.redhat.com, config_id=tripleo_step3, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, architecture=x86_64, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:17:01 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:17:01 np0005604215.localdomain podman[77956]: 2026-02-01 08:17:01.106210253 +0000 UTC m=+0.319998244 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, architecture=x86_64, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z)
Feb 01 08:17:01 np0005604215.localdomain podman[77960]: 2026-02-01 08:17:01.140272189 +0000 UTC m=+0.342660126 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:17:01 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:17:01 np0005604215.localdomain podman[77960]: 2026-02-01 08:17:01.171982501 +0000 UTC m=+0.374370458 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
build-date=2026-01-12T23:07:30Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13)
Feb 01 08:17:01 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:17:01 np0005604215.localdomain podman[77959]: 2026-02-01 08:17:01.208958932 +0000 UTC m=+0.414812059 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 01 08:17:01 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:17:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:17:03 np0005604215.localdomain podman[78064]: 2026-02-01 08:17:03.869412239 +0000 UTC m=+0.088264117 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, release=1766032510, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:17:04 np0005604215.localdomain podman[78064]: 2026-02-01 08:17:04.240902706 +0000 UTC m=+0.459754604 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, url=https://www.redhat.com, tcib_managed=true)
Feb 01 08:17:04 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:17:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:17:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:17:05 np0005604215.localdomain systemd[1]: tmp-crun.m8FDND.mount: Deactivated successfully.
Feb 01 08:17:05 np0005604215.localdomain podman[78088]: 2026-02-01 08:17:05.885489982 +0000 UTC m=+0.094761975 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:17:05 np0005604215.localdomain podman[78088]: 2026-02-01 08:17:05.936652335 +0000 UTC m=+0.145924298 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible)
Feb 01 08:17:05 np0005604215.localdomain podman[78087]: 2026-02-01 08:17:05.935604231 +0000 UTC m=+0.146555197 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, container_name=ovn_metadata_agent, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:17:05 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:17:06 np0005604215.localdomain podman[78087]: 2026-02-01 08:17:06.020851412 +0000 UTC m=+0.231802398 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:17:06 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:17:06 np0005604215.localdomain systemd[1]: tmp-crun.iC6rIL.mount: Deactivated successfully.
Feb 01 08:17:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:17:18 np0005604215.localdomain podman[78134]: 2026-02-01 08:17:18.875974369 +0000 UTC m=+0.087708170 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git)
Feb 01 08:17:19 np0005604215.localdomain podman[78134]: 2026-02-01 08:17:19.068399501 +0000 UTC m=+0.280133292 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
vendor=Red Hat, Inc., tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5)
Feb 01 08:17:19 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:17:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:17:28 np0005604215.localdomain podman[78164]: 2026-02-01 08:17:28.850535233 +0000 UTC m=+0.069040625 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=collectd, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, version=17.1.13, io.buildah.version=1.41.5)
Feb 01 08:17:28 np0005604215.localdomain podman[78164]: 2026-02-01 08:17:28.861266026 +0000 UTC m=+0.079771418 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, release=1766032510, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z)
Feb 01 08:17:28 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:17:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:17:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:17:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:17:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:17:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:17:31 np0005604215.localdomain podman[78257]: 2026-02-01 08:17:31.879682888 +0000 UTC m=+0.092971598 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vcs-type=git, io.openshift.expose-services=, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4)
Feb 01 08:17:31 np0005604215.localdomain podman[78253]: 2026-02-01 08:17:31.92207647 +0000 UTC m=+0.137515259 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-cron-container, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, description=Red Hat 
OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, config_id=tripleo_step4)
Feb 01 08:17:31 np0005604215.localdomain podman[78253]: 2026-02-01 08:17:31.933728902 +0000 UTC m=+0.149167751 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, distribution-scope=public, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:17:31 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:17:31 np0005604215.localdomain podman[78256]: 2026-02-01 08:17:31.974031569 +0000 UTC m=+0.184641174 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 01 08:17:31 np0005604215.localdomain podman[78257]: 2026-02-01 08:17:31.984137131 +0000 UTC m=+0.197425821 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, distribution-scope=public, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, 
container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4)
Feb 01 08:17:31 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:17:32 np0005604215.localdomain podman[78255]: 2026-02-01 08:17:32.031572635 +0000 UTC m=+0.242073976 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.buildah.version=1.41.5, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com)
Feb 01 08:17:32 np0005604215.localdomain podman[78256]: 2026-02-01 08:17:32.052026688 +0000 UTC m=+0.262636283 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z)
Feb 01 08:17:32 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:17:32 np0005604215.localdomain podman[78255]: 2026-02-01 08:17:32.066866602 +0000 UTC m=+0.277367993 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, container_name=iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:17:32 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:17:32 np0005604215.localdomain podman[78254]: 2026-02-01 08:17:32.139163989 +0000 UTC m=+0.355147655 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, maintainer=OpenStack TripleO Team)
Feb 01 08:17:32 np0005604215.localdomain podman[78254]: 2026-02-01 08:17:32.174734964 +0000 UTC m=+0.390718660 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, container_name=nova_compute, managed_by=tripleo_ansible)
Feb 01 08:17:32 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:17:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:17:34 np0005604215.localdomain podman[78388]: 2026-02-01 08:17:34.866254013 +0000 UTC m=+0.081331406 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:17:35 np0005604215.localdomain podman[78388]: 2026-02-01 08:17:35.2587659 +0000 UTC m=+0.473843243 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:17:35 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:17:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:17:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:17:36 np0005604215.localdomain podman[78411]: 2026-02-01 08:17:36.873606758 +0000 UTC m=+0.090778629 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 01 08:17:36 np0005604215.localdomain systemd[1]: tmp-crun.Jat3ay.mount: Deactivated successfully.
Feb 01 08:17:36 np0005604215.localdomain podman[78412]: 2026-02-01 08:17:36.927625282 +0000 UTC m=+0.142387526 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git)
Feb 01 08:17:36 np0005604215.localdomain podman[78412]: 2026-02-01 08:17:36.956609706 +0000 UTC m=+0.171372010 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, container_name=ovn_controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5)
Feb 01 08:17:36 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:17:37 np0005604215.localdomain podman[78411]: 2026-02-01 08:17:37.007098718 +0000 UTC m=+0.224270549 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, 
config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:17:37 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:17:39 np0005604215.localdomain systemd[1]: libpod-09302786a32de733d4c559cf633da3e5eeab3ee808df8ed54a333145ffb0dfe1.scope: Deactivated successfully.
Feb 01 08:17:39 np0005604215.localdomain podman[78458]: 2026-02-01 08:17:39.832564111 +0000 UTC m=+0.060774081 container died 09302786a32de733d4c559cf633da3e5eeab3ee808df8ed54a333145ffb0dfe1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:32:04Z, container_name=nova_wait_for_compute_service, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5)
Feb 01 08:17:39 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-09302786a32de733d4c559cf633da3e5eeab3ee808df8ed54a333145ffb0dfe1-userdata-shm.mount: Deactivated successfully.
Feb 01 08:17:39 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f7de42bd2ef28ab6d43ca2881ed0bac026c1f46d7bf355b9a366b5c9ec93a4c0-merged.mount: Deactivated successfully.
Feb 01 08:17:39 np0005604215.localdomain podman[78458]: 2026-02-01 08:17:39.863682324 +0000 UTC m=+0.091892264 container cleanup 09302786a32de733d4c559cf633da3e5eeab3ee808df8ed54a333145ffb0dfe1 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_wait_for_compute_service, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, vcs-type=git, container_name=nova_wait_for_compute_service, config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, io.openshift.expose-services=)
Feb 01 08:17:39 np0005604215.localdomain systemd[1]: libpod-conmon-09302786a32de733d4c559cf633da3e5eeab3ee808df8ed54a333145ffb0dfe1.scope: Deactivated successfully.
Feb 01 08:17:39 np0005604215.localdomain python3[76385]: ansible-tripleo_container_manage PODMAN-CONTAINER-DEBUG: podman run --name nova_wait_for_compute_service --conmon-pidfile /run/nova_wait_for_compute_service.pid --detach=False --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env __OS_DEBUG=true --env TRIPLEO_CONFIG_HASH=1296029e90a465a2201c8dc6f8be17e7 --label config_id=tripleo_step5 --label container_name=nova_wait_for_compute_service --label managed_by=tripleo_ansible --label config_data={'detach': False, 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', '__OS_DEBUG': 'true', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'start_order': 4, 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/log/containers/nova:/var/log/nova', '/var/lib/container-config-scripts:/container-config-scripts']} --log-driver k8s-file --log-opt path=/var/log/containers/stdouts/nova_wait_for_compute_service.log --network host --user nova --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume 
/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /etc/puppet:/etc/puppet:ro --volume /var/lib/kolla/config_files/nova_compute_wait_for_compute_service.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro --volume /var/log/containers/nova:/var/log/nova --volume /var/lib/container-config-scripts:/container-config-scripts registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1
Feb 01 08:17:40 np0005604215.localdomain sudo[76383]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:40 np0005604215.localdomain sudo[78512]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ruvxefmrjqclzyowmreszzgkogzsoxms ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:17:40 np0005604215.localdomain sudo[78512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:17:40 np0005604215.localdomain python3[78514]: ansible-file Invoked with path=/etc/systemd/system/tripleo_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:17:40 np0005604215.localdomain sudo[78512]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:40 np0005604215.localdomain sudo[78528]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpopiytzieywtnzjggtfpwhdwnssdkjr ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:17:40 np0005604215.localdomain sudo[78528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:17:40 np0005604215.localdomain python3[78530]: ansible-stat Invoked with path=/etc/systemd/system/tripleo_nova_compute_healthcheck.timer follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Feb 01 08:17:40 np0005604215.localdomain sudo[78528]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:41 np0005604215.localdomain sudo[78589]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vovsndcprpvvnpgxdwirbqcvrezxmzpe ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:17:41 np0005604215.localdomain sudo[78589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:17:41 np0005604215.localdomain python3[78591]: ansible-copy Invoked with src=/home/tripleo-admin/.ansible/tmp/ansible-tmp-1769933860.7820203-117803-166392810069579/source dest=/etc/systemd/system/tripleo_nova_compute.service mode=0644 owner=root group=root backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:17:41 np0005604215.localdomain sudo[78589]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:41 np0005604215.localdomain sudo[78605]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbrzyqhzgjakdtrzadpsnohyeuawkfjq ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:17:41 np0005604215.localdomain sudo[78605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:17:41 np0005604215.localdomain python3[78607]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 08:17:41 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:17:41 np0005604215.localdomain systemd-rc-local-generator[78633]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:17:41 np0005604215.localdomain systemd-sysv-generator[78636]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:17:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:17:42 np0005604215.localdomain sudo[78605]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:42 np0005604215.localdomain sudo[78657]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sxgspaijwvzxjylqldvcjetvljdiqzjk ; TRIPLEO_MINOR_UPDATE=False /usr/bin/python3
Feb 01 08:17:42 np0005604215.localdomain sudo[78657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:17:42 np0005604215.localdomain python3[78659]: ansible-systemd Invoked with state=restarted name=tripleo_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:17:43 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:17:44 np0005604215.localdomain systemd-rc-local-generator[78685]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:17:44 np0005604215.localdomain systemd-sysv-generator[78688]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:17:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:17:44 np0005604215.localdomain systemd[1]: Starting nova_compute container...
Feb 01 08:17:44 np0005604215.localdomain tripleo-start-podman-container[78699]: Creating additional drop-in dependency for "nova_compute" (1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e)
Feb 01 08:17:44 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:17:44 np0005604215.localdomain systemd-sysv-generator[78757]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:17:44 np0005604215.localdomain systemd-rc-local-generator[78753]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:17:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:17:44 np0005604215.localdomain systemd[1]: Started nova_compute container.
Feb 01 08:17:44 np0005604215.localdomain sudo[78657]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:45 np0005604215.localdomain sudo[78796]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktdjpxolfqapoxntctzwaedkriwwmbcl ; /usr/bin/python3
Feb 01 08:17:45 np0005604215.localdomain sudo[78796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:17:45 np0005604215.localdomain python3[78798]: ansible-file Invoked with path=/var/lib/container-puppet/container-puppet-tasks5.json state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:17:45 np0005604215.localdomain sudo[78796]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:45 np0005604215.localdomain sudo[78844]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mlypnzmatxehbnhrhlcwschndxvdwgho ; /usr/bin/python3
Feb 01 08:17:45 np0005604215.localdomain sudo[78844]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:17:46 np0005604215.localdomain sudo[78844]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:46 np0005604215.localdomain sudo[78887]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiimldacccdowyiqngabvatsmcofnhux ; /usr/bin/python3
Feb 01 08:17:46 np0005604215.localdomain sudo[78887]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:17:46 np0005604215.localdomain sudo[78887]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:46 np0005604215.localdomain sudo[78917]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzcamtxkcrmngefdehjtebyrulrsiadq ; /usr/bin/python3
Feb 01 08:17:46 np0005604215.localdomain sudo[78917]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:17:46 np0005604215.localdomain python3[78919]: ansible-container_puppet_config Invoked with check_mode=False config_vol_prefix=/var/lib/config-data debug=True net_host=True no_archive=True puppet_config=/var/lib/container-puppet/container-puppet-tasks5.json short_hostname=np0005604215 step=5 update_config_hash_only=False
Feb 01 08:17:46 np0005604215.localdomain sudo[78917]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:47 np0005604215.localdomain sudo[78933]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bovxwmejmaagbyerdddokxqluaxxubss ; /usr/bin/python3
Feb 01 08:17:47 np0005604215.localdomain sudo[78933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:17:47 np0005604215.localdomain python3[78935]: ansible-file Invoked with path=/var/log/containers/stdouts state=directory owner=root group=root recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 08:17:47 np0005604215.localdomain sudo[78933]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:47 np0005604215.localdomain sudo[78949]: tripleo-admin : PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vkwybjgtkunyjjbnmrdlzxelamhhtbgt ; /usr/bin/python3
Feb 01 08:17:47 np0005604215.localdomain sudo[78949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1003)
Feb 01 08:17:47 np0005604215.localdomain python3[78951]: ansible-container_config_data Invoked with config_path=/var/lib/tripleo-config/container-puppet-config/step_5 config_pattern=container-puppet-*.json config_overrides={} debug=True
Feb 01 08:17:47 np0005604215.localdomain sudo[78949]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:17:49 np0005604215.localdomain podman[78952]: 2026-02-01 08:17:49.894781774 +0000 UTC m=+0.104660082 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, release=1766032510, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, tcib_managed=true, config_id=tripleo_step1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:17:50 np0005604215.localdomain podman[78952]: 2026-02-01 08:17:50.13531629 +0000 UTC m=+0.345194558 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, version=17.1.13, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red 
Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd)
Feb 01 08:17:50 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:17:50 np0005604215.localdomain sudo[78982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:17:50 np0005604215.localdomain sudo[78982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:17:50 np0005604215.localdomain sudo[78982]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:50 np0005604215.localdomain sudo[78997]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:17:50 np0005604215.localdomain sudo[78997]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:17:51 np0005604215.localdomain sudo[78997]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:51 np0005604215.localdomain sudo[79043]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:17:51 np0005604215.localdomain sudo[79043]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:17:51 np0005604215.localdomain sudo[79043]: pam_unix(sudo:session): session closed for user root
Feb 01 08:17:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:17:59 np0005604215.localdomain systemd[1]: tmp-crun.Yc6B22.mount: Deactivated successfully.
Feb 01 08:17:59 np0005604215.localdomain podman[79058]: 2026-02-01 08:17:59.897705643 +0000 UTC m=+0.107026246 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, version=17.1.13, release=1766032510, config_id=tripleo_step3, container_name=collectd, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:17:59 np0005604215.localdomain podman[79058]: 2026-02-01 08:17:59.936741719 +0000 UTC m=+0.146062322 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, container_name=collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, version=17.1.13, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:17:59 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:18:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:18:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:18:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:18:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:18:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:18:02 np0005604215.localdomain systemd[1]: tmp-crun.TJWvHj.mount: Deactivated successfully.
Feb 01 08:18:02 np0005604215.localdomain podman[79082]: 2026-02-01 08:18:02.873715032 +0000 UTC m=+0.074381055 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, config_id=tripleo_step4, vendor=Red Hat, Inc., release=1766032510, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 01 08:18:02 np0005604215.localdomain podman[79080]: 2026-02-01 08:18:02.918215362 +0000 UTC m=+0.124585638 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:18:02 np0005604215.localdomain podman[79082]: 2026-02-01 08:18:02.924814723 +0000 UTC m=+0.125480756 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=ceilometer_agent_compute, version=17.1.13, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4)
Feb 01 08:18:02 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:18:02 np0005604215.localdomain podman[79083]: 2026-02-01 08:18:02.893517614 +0000 UTC m=+0.090052285 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5)
Feb 01 08:18:02 np0005604215.localdomain podman[79080]: 2026-02-01 08:18:02.972728882 +0000 UTC m=+0.179099228 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible)
Feb 01 08:18:02 np0005604215.localdomain podman[79079]: 2026-02-01 08:18:02.972608548 +0000 UTC m=+0.178150776 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, 
release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:18:02 np0005604215.localdomain podman[79083]: 2026-02-01 08:18:02.976648898 +0000 UTC m=+0.173183569 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step4)
Feb 01 08:18:02 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:18:02 np0005604215.localdomain podman[79079]: 2026-02-01 08:18:02.986399788 +0000 UTC m=+0.191942026 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_id=tripleo_step4)
Feb 01 08:18:03 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:18:03 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:18:03 np0005604215.localdomain podman[79081]: 2026-02-01 08:18:03.042471118 +0000 UTC m=+0.244731472 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step3, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team)
Feb 01 08:18:03 np0005604215.localdomain podman[79081]: 2026-02-01 08:18:03.078607371 +0000 UTC m=+0.280867725 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, build-date=2026-01-12T22:34:43Z, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, url=https://www.redhat.com, release=1766032510, container_name=iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container)
Feb 01 08:18:03 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:18:03 np0005604215.localdomain systemd[1]: tmp-crun.Wo5L35.mount: Deactivated successfully.
Feb 01 08:18:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:18:05 np0005604215.localdomain systemd[1]: tmp-crun.T9ukw3.mount: Deactivated successfully.
Feb 01 08:18:05 np0005604215.localdomain podman[79192]: 2026-02-01 08:18:05.915031224 +0000 UTC m=+0.135054112 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:18:06 np0005604215.localdomain podman[79192]: 2026-02-01 08:18:06.272663738 +0000 UTC m=+0.492686646 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public)
Feb 01 08:18:06 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:18:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:18:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:18:07 np0005604215.localdomain podman[79217]: 2026-02-01 08:18:07.868928342 +0000 UTC m=+0.083376782 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, release=1766032510, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=)
Feb 01 08:18:07 np0005604215.localdomain podman[79218]: 2026-02-01 08:18:07.921273353 +0000 UTC m=+0.132324265 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, 
com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller)
Feb 01 08:18:07 np0005604215.localdomain podman[79217]: 2026-02-01 08:18:07.942766738 +0000 UTC m=+0.157215178 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, batch=17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:18:07 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:18:07 np0005604215.localdomain podman[79218]: 2026-02-01 08:18:07.973772378 +0000 UTC m=+0.184823290 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, version=17.1.13, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 
17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4)
Feb 01 08:18:07 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:18:16 np0005604215.localdomain sshd[79265]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 08:18:16 np0005604215.localdomain sshd[79265]: Accepted publickey for zuul from 192.168.122.100 port 59480 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 08:18:16 np0005604215.localdomain systemd-logind[761]: New session 33 of user zuul.
Feb 01 08:18:16 np0005604215.localdomain systemd[1]: Started Session 33 of User zuul.
Feb 01 08:18:16 np0005604215.localdomain sshd[79265]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 08:18:16 np0005604215.localdomain sudo[79372]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xicqajpjomkdgvpmepetegsiabyikbck ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769933896.3899696-40181-249269604932245/AnsiballZ_setup.py
Feb 01 08:18:16 np0005604215.localdomain sudo[79372]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:18:17 np0005604215.localdomain python3[79374]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 08:18:19 np0005604215.localdomain sudo[79372]: pam_unix(sudo:session): session closed for user root
Feb 01 08:18:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:18:20 np0005604215.localdomain systemd[1]: tmp-crun.ywv4MF.mount: Deactivated successfully.
Feb 01 08:18:20 np0005604215.localdomain podman[79562]: 2026-02-01 08:18:20.882467383 +0000 UTC m=+0.091402022 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Feb 01 08:18:21 np0005604215.localdomain podman[79562]: 2026-02-01 08:18:21.11443054 +0000 UTC m=+0.323365189 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, tcib_managed=true, build-date=2026-01-12T22:10:14Z, version=17.1.13, release=1766032510, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:18:21 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:18:24 np0005604215.localdomain sudo[79666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqyqxysdwgqumsmowlidrfqcjmfrdxas ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769933903.758587-40241-250997656598878/AnsiballZ_dnf.py
Feb 01 08:18:24 np0005604215.localdomain sudo[79666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:18:24 np0005604215.localdomain python3[79668]: ansible-ansible.legacy.dnf Invoked with name=['iptables'] allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None state=None
Feb 01 08:18:27 np0005604215.localdomain sudo[79666]: pam_unix(sudo:session): session closed for user root
Feb 01 08:18:28 np0005604215.localdomain sudo[79759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zbagfmioyrilembwsscdzmttmjztmkyh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769933908.170169-40296-148343458236542/AnsiballZ_iptables.py
Feb 01 08:18:28 np0005604215.localdomain sudo[79759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:18:28 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:18:28 np0005604215.localdomain recover_tripleo_nova_virtqemud[79763]: 62016
Feb 01 08:18:28 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:18:28 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:18:28 np0005604215.localdomain python3[79761]: ansible-ansible.builtin.iptables Invoked with action=insert chain=INPUT comment=allow ssh access for zuul executor in_interface=eth0 jump=ACCEPT protocol=tcp source=38.102.83.114 table=filter state=present ip_version=ipv4 match=[] destination_ports=[] ctstate=[] syn=ignore flush=False chain_management=False numeric=False rule_num=None wait=None to_source=None destination=None to_destination=None tcp_flags=None gateway=None log_prefix=None log_level=None goto=None out_interface=None fragment=None set_counters=None source_port=None destination_port=None to_ports=None set_dscp_mark=None set_dscp_mark_class=None src_range=None dst_range=None match_set=None match_set_flags=None limit=None limit_burst=None uid_owner=None gid_owner=None reject_with=None icmp_type=None policy=None
Feb 01 08:18:28 np0005604215.localdomain kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Feb 01 08:18:28 np0005604215.localdomain systemd-journald[47940]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 81.1 (270 of 333 items), suggesting rotation.
Feb 01 08:18:28 np0005604215.localdomain systemd-journald[47940]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 01 08:18:28 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 08:18:28 np0005604215.localdomain sudo[79759]: pam_unix(sudo:session): session closed for user root
Feb 01 08:18:28 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 08:18:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:18:30 np0005604215.localdomain systemd[1]: tmp-crun.YJwner.mount: Deactivated successfully.
Feb 01 08:18:30 np0005604215.localdomain podman[79831]: 2026-02-01 08:18:30.886348987 +0000 UTC m=+0.098268287 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:18:30 np0005604215.localdomain podman[79831]: 2026-02-01 08:18:30.894705488 +0000 UTC m=+0.106624788 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 01 08:18:30 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:18:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:18:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:18:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:18:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:18:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:18:33 np0005604215.localdomain podman[79856]: 2026-02-01 08:18:33.880612877 +0000 UTC m=+0.083541167 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, architecture=x86_64, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13)
Feb 01 08:18:33 np0005604215.localdomain systemd[1]: tmp-crun.QaFoOZ.mount: Deactivated successfully.
Feb 01 08:18:33 np0005604215.localdomain podman[79855]: 2026-02-01 08:18:33.93391028 +0000 UTC m=+0.140502174 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vcs-type=git, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:18:33 np0005604215.localdomain podman[79856]: 2026-02-01 08:18:33.937864174 +0000 UTC m=+0.140792484 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, version=17.1.13, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:18:33 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:18:33 np0005604215.localdomain podman[79855]: 2026-02-01 08:18:33.970654476 +0000 UTC m=+0.177246330 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:18:33 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:18:33 np0005604215.localdomain podman[79854]: 2026-02-01 08:18:33.992534719 +0000 UTC m=+0.202122767 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, version=17.1.13, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., tcib_managed=true)
Feb 01 08:18:34 np0005604215.localdomain podman[79853]: 2026-02-01 08:18:34.038513044 +0000 UTC m=+0.249430303 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, architecture=x86_64, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:18:34 np0005604215.localdomain podman[79853]: 2026-02-01 08:18:34.044534202 +0000 UTC m=+0.255451451 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, url=https://www.redhat.com)
Feb 01 08:18:34 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:18:34 np0005604215.localdomain podman[79867]: 2026-02-01 08:18:34.092419006 +0000 UTC m=+0.289410411 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, 
container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 01 08:18:34 np0005604215.localdomain podman[79854]: 2026-02-01 08:18:34.098697201 +0000 UTC m=+0.308285179 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:18:34 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:18:34 np0005604215.localdomain podman[79867]: 2026-02-01 08:18:34.116852668 +0000 UTC m=+0.313844113 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_ipmi)
Feb 01 08:18:34 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:18:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:18:36 np0005604215.localdomain podman[79971]: 2026-02-01 08:18:36.859375352 +0000 UTC m=+0.076360863 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64)
Feb 01 08:18:37 np0005604215.localdomain podman[79971]: 2026-02-01 08:18:37.236042335 +0000 UTC m=+0.453027876 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=)
Feb 01 08:18:37 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:18:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:18:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:18:38 np0005604215.localdomain podman[79994]: 2026-02-01 08:18:38.864401318 +0000 UTC m=+0.081317819 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 
'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 01 08:18:38 np0005604215.localdomain podman[79994]: 2026-02-01 08:18:38.911313931 +0000 UTC m=+0.128230442 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, 
maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 01 08:18:38 np0005604215.localdomain podman[79995]: 2026-02-01 08:18:38.925191354 +0000 UTC m=+0.138086668 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 01 08:18:38 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:18:38 np0005604215.localdomain podman[79995]: 2026-02-01 08:18:38.954769447 +0000 UTC m=+0.167664841 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, container_name=ovn_controller, io.openshift.expose-services=)
Feb 01 08:18:38 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:18:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:18:51 np0005604215.localdomain podman[80041]: 2026-02-01 08:18:51.878633422 +0000 UTC m=+0.089944257 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, config_id=tripleo_step1, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:18:51 np0005604215.localdomain sudo[80059]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:18:51 np0005604215.localdomain sudo[80059]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:18:51 np0005604215.localdomain sudo[80059]: pam_unix(sudo:session): session closed for user root
Feb 01 08:18:52 np0005604215.localdomain sudo[80085]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:18:52 np0005604215.localdomain sudo[80085]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:18:52 np0005604215.localdomain podman[80041]: 2026-02-01 08:18:52.099685209 +0000 UTC m=+0.310995954 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr)
Feb 01 08:18:52 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:18:52 np0005604215.localdomain sudo[80085]: pam_unix(sudo:session): session closed for user root
Feb 01 08:18:53 np0005604215.localdomain sudo[80133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:18:53 np0005604215.localdomain sudo[80133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:18:53 np0005604215.localdomain sudo[80133]: pam_unix(sudo:session): session closed for user root
Feb 01 08:19:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:19:01 np0005604215.localdomain podman[80148]: 2026-02-01 08:19:01.870109438 +0000 UTC m=+0.082144133 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., release=1766032510, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git, container_name=collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, io.buildah.version=1.41.5, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:19:01 np0005604215.localdomain podman[80148]: 2026-02-01 08:19:01.881555687 +0000 UTC m=+0.093590312 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=collectd)
Feb 01 08:19:01 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:19:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:19:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:19:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:19:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:19:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:19:04 np0005604215.localdomain systemd[1]: tmp-crun.3cWqY4.mount: Deactivated successfully.
Feb 01 08:19:04 np0005604215.localdomain podman[80171]: 2026-02-01 08:19:04.890790932 +0000 UTC m=+0.086740897 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, release=1766032510, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:19:04 np0005604215.localdomain podman[80171]: 2026-02-01 08:19:04.93270775 +0000 UTC m=+0.128657665 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, vendor=Red Hat, Inc., container_name=iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:19:04 np0005604215.localdomain podman[80172]: 2026-02-01 08:19:04.94555017 +0000 UTC m=+0.138597785 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:19:04 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:19:04 np0005604215.localdomain podman[80172]: 2026-02-01 08:19:04.980689877 +0000 UTC m=+0.173737492 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, version=17.1.13, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 01 08:19:05 np0005604215.localdomain podman[80169]: 2026-02-01 08:19:05.0074066 +0000 UTC m=+0.210580521 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, container_name=logrotate_crond, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:19:05 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:19:05 np0005604215.localdomain podman[80178]: 2026-02-01 08:19:05.053357444 +0000 UTC m=+0.245221842 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T23:07:30Z)
Feb 01 08:19:05 np0005604215.localdomain podman[80169]: 2026-02-01 08:19:05.06989418 +0000 UTC m=+0.273068101 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, release=1766032510, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, config_id=tripleo_step4)
Feb 01 08:19:05 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:19:05 np0005604215.localdomain podman[80178]: 2026-02-01 08:19:05.089705868 +0000 UTC m=+0.281570276 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vcs-type=git, container_name=ceilometer_agent_ipmi, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 01 08:19:05 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:19:05 np0005604215.localdomain podman[80170]: 2026-02-01 08:19:05.161736335 +0000 UTC m=+0.361764128 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 01 08:19:05 np0005604215.localdomain podman[80170]: 2026-02-01 08:19:05.215876954 +0000 UTC m=+0.415904727 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, config_id=tripleo_step5, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:19:05 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:19:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:19:07 np0005604215.localdomain podman[80286]: 2026-02-01 08:19:07.863496578 +0000 UTC m=+0.078858941 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:19:08 np0005604215.localdomain podman[80286]: 2026-02-01 08:19:08.239794628 +0000 UTC m=+0.455156981 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, release=1766032510, distribution-scope=public, 
io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:19:08 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:19:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:19:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:19:09 np0005604215.localdomain systemd[1]: tmp-crun.Hp0Ndg.mount: Deactivated successfully.
Feb 01 08:19:09 np0005604215.localdomain podman[80309]: 2026-02-01 08:19:09.873957764 +0000 UTC m=+0.088711239 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 01 08:19:09 np0005604215.localdomain podman[80310]: 2026-02-01 08:19:09.937797035 +0000 UTC m=+0.149469244 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:19:09 np0005604215.localdomain podman[80309]: 2026-02-01 08:19:09.95076939 +0000 UTC m=+0.165522855 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, build-date=2026-01-12T22:56:19Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:19:09 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:19:09 np0005604215.localdomain podman[80310]: 2026-02-01 08:19:09.963616051 +0000 UTC m=+0.175288260 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, io.buildah.version=1.41.5, container_name=ovn_controller, config_id=tripleo_step4, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:19:09 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:19:10 np0005604215.localdomain systemd[1]: tmp-crun.bJALUJ.mount: Deactivated successfully.
Feb 01 08:19:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:19:22 np0005604215.localdomain systemd[1]: tmp-crun.pTNqxQ.mount: Deactivated successfully.
Feb 01 08:19:22 np0005604215.localdomain podman[80357]: 2026-02-01 08:19:22.877203957 +0000 UTC m=+0.089143943 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:19:23 np0005604215.localdomain podman[80357]: 2026-02-01 08:19:23.062727765 +0000 UTC m=+0.274667761 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, container_name=metrics_qdr, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git)
Feb 01 08:19:23 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:19:28 np0005604215.localdomain sshd[79265]: pam_unix(sshd:session): session closed for user zuul
Feb 01 08:19:28 np0005604215.localdomain systemd[1]: session-33.scope: Deactivated successfully.
Feb 01 08:19:28 np0005604215.localdomain systemd[1]: session-33.scope: Consumed 5.672s CPU time.
Feb 01 08:19:28 np0005604215.localdomain systemd-logind[761]: Session 33 logged out. Waiting for processes to exit.
Feb 01 08:19:28 np0005604215.localdomain systemd-logind[761]: Removed session 33.
Feb 01 08:19:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:19:32 np0005604215.localdomain systemd[1]: tmp-crun.iBGjSO.mount: Deactivated successfully.
Feb 01 08:19:32 np0005604215.localdomain podman[80431]: 2026-02-01 08:19:32.882155734 +0000 UTC m=+0.098144513 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.13, config_id=tripleo_step3, url=https://www.redhat.com)
Feb 01 08:19:32 np0005604215.localdomain podman[80431]: 2026-02-01 08:19:32.891775123 +0000 UTC m=+0.107763892 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, container_name=collectd, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com)
Feb 01 08:19:32 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:19:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:19:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:19:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:19:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:19:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:19:35 np0005604215.localdomain podman[80467]: 2026-02-01 08:19:35.913554312 +0000 UTC m=+0.112433599 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, architecture=x86_64, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, distribution-scope=public)
Feb 01 08:19:35 np0005604215.localdomain podman[80453]: 2026-02-01 08:19:35.873248944 +0000 UTC m=+0.085608622 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:19:35 np0005604215.localdomain podman[80467]: 2026-02-01 08:19:35.934620868 +0000 UTC m=+0.133500175 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:19:35 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:19:35 np0005604215.localdomain podman[80453]: 2026-02-01 08:19:35.957625486 +0000 UTC m=+0.169985094 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, vcs-type=git, io.openshift.expose-services=, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:19:35 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:19:35 np0005604215.localdomain podman[80454]: 2026-02-01 08:19:35.88784123 +0000 UTC m=+0.095341936 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git)
Feb 01 08:19:36 np0005604215.localdomain podman[80455]: 2026-02-01 08:19:36.007547764 +0000 UTC m=+0.209438005 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, distribution-scope=public, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:19:36 np0005604215.localdomain podman[80454]: 2026-02-01 08:19:36.024794752 +0000 UTC m=+0.232295438 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible)
Feb 01 08:19:36 np0005604215.localdomain podman[80452]: 2026-02-01 08:19:35.9808242 +0000 UTC m=+0.192945040 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:19:36 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:19:36 np0005604215.localdomain podman[80455]: 2026-02-01 08:19:36.064532142 +0000 UTC m=+0.266422393 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.openshift.expose-services=, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container)
Feb 01 08:19:36 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:19:36 np0005604215.localdomain podman[80452]: 2026-02-01 08:19:36.115179282 +0000 UTC m=+0.327300192 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:19:36 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:19:36 np0005604215.localdomain systemd[1]: tmp-crun.n2pEh3.mount: Deactivated successfully.
Feb 01 08:19:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:19:38 np0005604215.localdomain podman[80563]: 2026-02-01 08:19:38.877107412 +0000 UTC m=+0.094781978 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, url=https://www.redhat.com, container_name=nova_migration_target, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Feb 01 08:19:39 np0005604215.localdomain podman[80563]: 2026-02-01 08:19:39.233080189 +0000 UTC m=+0.450754815 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, distribution-scope=public, container_name=nova_migration_target, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:19:39 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:19:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:19:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:19:40 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:19:40 np0005604215.localdomain recover_tripleo_nova_virtqemud[80599]: 62016
Feb 01 08:19:40 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:19:40 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:19:40 np0005604215.localdomain podman[80586]: 2026-02-01 08:19:40.860950267 +0000 UTC m=+0.078358016 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step4, tcib_managed=true, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:19:40 np0005604215.localdomain podman[80587]: 2026-02-01 08:19:40.838356102 +0000 UTC m=+0.057322420 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, distribution-scope=public, url=https://www.redhat.com, container_name=ovn_controller, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 01 08:19:40 np0005604215.localdomain podman[80586]: 2026-02-01 08:19:40.894610947 +0000 UTC m=+0.112018716 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.13, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 01 08:19:40 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:19:40 np0005604215.localdomain podman[80587]: 2026-02-01 08:19:40.922690853 +0000 UTC m=+0.141657211 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:19:40 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:19:43 np0005604215.localdomain sshd[80637]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 08:19:43 np0005604215.localdomain sshd[80637]: Accepted publickey for zuul from 38.102.83.114 port 39142 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 08:19:43 np0005604215.localdomain systemd-logind[761]: New session 34 of user zuul.
Feb 01 08:19:43 np0005604215.localdomain systemd[1]: Started Session 34 of User zuul.
Feb 01 08:19:43 np0005604215.localdomain sshd[80637]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 08:19:43 np0005604215.localdomain sudo[80654]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tjswjgtfujdzmrrrlmjmuqrlobzdnvvb ; /usr/bin/python3
Feb 01 08:19:43 np0005604215.localdomain sudo[80654]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 08:19:43 np0005604215.localdomain python3[80656]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 08:19:46 np0005604215.localdomain sudo[80654]: pam_unix(sudo:session): session closed for user root
Feb 01 08:19:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:19:53 np0005604215.localdomain sudo[80670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:19:53 np0005604215.localdomain sudo[80670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:19:53 np0005604215.localdomain sudo[80670]: pam_unix(sudo:session): session closed for user root
Feb 01 08:19:54 np0005604215.localdomain systemd[1]: tmp-crun.hKs3q9.mount: Deactivated successfully.
Feb 01 08:19:54 np0005604215.localdomain podman[80658]: 2026-02-01 08:19:54.006996383 +0000 UTC m=+0.216521116 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, version=17.1.13)
Feb 01 08:19:54 np0005604215.localdomain sudo[80687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:19:54 np0005604215.localdomain sudo[80687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:19:54 np0005604215.localdomain podman[80658]: 2026-02-01 08:19:54.198638713 +0000 UTC m=+0.408163336 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:19:54 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:19:54 np0005604215.localdomain sudo[80687]: pam_unix(sudo:session): session closed for user root
Feb 01 08:19:57 np0005604215.localdomain sudo[80749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:19:57 np0005604215.localdomain sudo[80749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:19:57 np0005604215.localdomain sudo[80749]: pam_unix(sudo:session): session closed for user root
Feb 01 08:20:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:20:03 np0005604215.localdomain podman[80764]: 2026-02-01 08:20:03.873716231 +0000 UTC m=+0.083765305 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, 
version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git)
Feb 01 08:20:03 np0005604215.localdomain podman[80764]: 2026-02-01 08:20:03.910391555 +0000 UTC m=+0.120440639 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, config_id=tripleo_step3, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, architecture=x86_64, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:20:03 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:20:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:20:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:20:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:20:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:20:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:20:06 np0005604215.localdomain systemd[1]: tmp-crun.n6t1eo.mount: Deactivated successfully.
Feb 01 08:20:06 np0005604215.localdomain systemd[1]: tmp-crun.YbLDjI.mount: Deactivated successfully.
Feb 01 08:20:06 np0005604215.localdomain podman[80786]: 2026-02-01 08:20:06.912146328 +0000 UTC m=+0.122791813 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, tcib_managed=true, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid)
Feb 01 08:20:06 np0005604215.localdomain podman[80786]: 2026-02-01 08:20:06.922670346 +0000 UTC m=+0.133315871 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, container_name=iscsid, io.openshift.expose-services=, vcs-type=git, version=17.1.13, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64)
Feb 01 08:20:06 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:20:06 np0005604215.localdomain podman[80784]: 2026-02-01 08:20:06.885547698 +0000 UTC m=+0.105785991 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron)
Feb 01 08:20:06 np0005604215.localdomain podman[80784]: 2026-02-01 08:20:06.965152451 +0000 UTC m=+0.185390824 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-cron-container, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:20:06 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:20:07 np0005604215.localdomain podman[80793]: 2026-02-01 08:20:07.00965773 +0000 UTC m=+0.212959345 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:20:07 np0005604215.localdomain podman[80793]: 2026-02-01 08:20:07.045634573 +0000 UTC m=+0.248936148 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 
17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:20:07 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:20:07 np0005604215.localdomain podman[80785]: 2026-02-01 08:20:07.050486884 +0000 UTC m=+0.262073137 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team)
Feb 01 08:20:07 np0005604215.localdomain podman[80792]: 2026-02-01 08:20:07.112365505 +0000 UTC m=+0.319513960 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step4, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 01 08:20:07 np0005604215.localdomain podman[80785]: 2026-02-01 08:20:07.13464747 +0000 UTC m=+0.346233713 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, distribution-scope=public, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z)
Feb 01 08:20:07 np0005604215.localdomain podman[80792]: 2026-02-01 08:20:07.143607749 +0000 UTC m=+0.350755904 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vcs-type=git, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team)
Feb 01 08:20:07 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:20:07 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:20:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:20:09 np0005604215.localdomain podman[80899]: 2026-02-01 08:20:09.857682237 +0000 UTC m=+0.075878198 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, vcs-type=git, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container)
Feb 01 08:20:10 np0005604215.localdomain podman[80899]: 2026-02-01 08:20:10.228861247 +0000 UTC m=+0.447057258 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:20:10 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:20:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:20:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:20:11 np0005604215.localdomain systemd[1]: tmp-crun.uUb842.mount: Deactivated successfully.
Feb 01 08:20:11 np0005604215.localdomain podman[80924]: 2026-02-01 08:20:11.901560625 +0000 UTC m=+0.113505512 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-ovn-controller-container, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:20:11 np0005604215.localdomain podman[80924]: 2026-02-01 08:20:11.921970021 +0000 UTC m=+0.133914958 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., 
vcs-type=git, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z)
Feb 01 08:20:11 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:20:11 np0005604215.localdomain podman[80923]: 2026-02-01 08:20:11.904555918 +0000 UTC m=+0.121185262 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64)
Feb 01 08:20:11 np0005604215.localdomain podman[80923]: 2026-02-01 08:20:11.986432513 +0000 UTC m=+0.203061917 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git)
Feb 01 08:20:11 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:20:14 np0005604215.localdomain sudo[80984]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wafxzizahpssrjhuxncwchnaqjsfexfu ; /usr/bin/python3
Feb 01 08:20:14 np0005604215.localdomain sudo[80984]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 08:20:14 np0005604215.localdomain python3[80986]: ansible-ansible.legacy.dnf Invoked with name=['sos'] state=latest allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Feb 01 08:20:17 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 08:20:17 np0005604215.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 01 08:20:17 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 08:20:18 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 08:20:18 np0005604215.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 01 08:20:18 np0005604215.localdomain systemd[1]: run-r0e32251fedf147668532c7f7cd047fa0.service: Deactivated successfully.
Feb 01 08:20:18 np0005604215.localdomain systemd[1]: run-rac78739e8e9b4f298edfd3baedad994e.service: Deactivated successfully.
Feb 01 08:20:18 np0005604215.localdomain sudo[80984]: pam_unix(sudo:session): session closed for user root
Feb 01 08:20:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:20:24 np0005604215.localdomain podman[81137]: 2026-02-01 08:20:24.852760313 +0000 UTC m=+0.070828221 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.expose-services=, container_name=metrics_qdr, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64)
Feb 01 08:20:25 np0005604215.localdomain podman[81137]: 2026-02-01 08:20:25.064979365 +0000 UTC m=+0.283047133 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Feb 01 08:20:25 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:20:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:20:34 np0005604215.localdomain podman[81212]: 2026-02-01 08:20:34.874650638 +0000 UTC m=+0.090065301 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, container_name=collectd, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com)
Feb 01 08:20:34 np0005604215.localdomain podman[81212]: 2026-02-01 08:20:34.909134284 +0000 UTC m=+0.124548907 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, config_id=tripleo_step3, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, container_name=collectd, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com)
Feb 01 08:20:34 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:20:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:20:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:20:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:20:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:20:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:20:37 np0005604215.localdomain podman[81235]: 2026-02-01 08:20:37.870341222 +0000 UTC m=+0.079651086 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, config_id=tripleo_step4, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 01 08:20:37 np0005604215.localdomain podman[81235]: 2026-02-01 08:20:37.924648367 +0000 UTC m=+0.133958201 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 01 08:20:37 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:20:37 np0005604215.localdomain podman[81234]: 2026-02-01 08:20:37.915351626 +0000 UTC m=+0.125242098 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, release=1766032510, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, version=17.1.13)
Feb 01 08:20:37 np0005604215.localdomain podman[81233]: 2026-02-01 08:20:37.981465089 +0000 UTC m=+0.192821807 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true)
Feb 01 08:20:37 np0005604215.localdomain podman[81236]: 2026-02-01 08:20:37.941915655 +0000 UTC m=+0.144004804 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible)
Feb 01 08:20:38 np0005604215.localdomain podman[81234]: 2026-02-01 08:20:38.000771062 +0000 UTC m=+0.210661614 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, version=17.1.13, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:20:38 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:20:38 np0005604215.localdomain podman[81233]: 2026-02-01 08:20:38.015606974 +0000 UTC m=+0.226963662 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, batch=17.1_20260112.1)
Feb 01 08:20:38 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:20:38 np0005604215.localdomain podman[81236]: 2026-02-01 08:20:38.072481978 +0000 UTC m=+0.274571157 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, 
build-date=2026-01-12T23:07:30Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5)
Feb 01 08:20:38 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:20:38 np0005604215.localdomain podman[81232]: 2026-02-01 08:20:38.09080215 +0000 UTC m=+0.303439478 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:20:38 np0005604215.localdomain podman[81232]: 2026-02-01 08:20:38.127777784 +0000 UTC m=+0.340415062 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, url=https://www.redhat.com, tcib_managed=true, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:20:38 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:20:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:20:40 np0005604215.localdomain podman[81345]: 2026-02-01 08:20:40.862511175 +0000 UTC m=+0.078468889 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, container_name=nova_migration_target, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, managed_by=tripleo_ansible, 
batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, version=17.1.13)
Feb 01 08:20:41 np0005604215.localdomain podman[81345]: 2026-02-01 08:20:41.240693364 +0000 UTC m=+0.456651008 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, managed_by=tripleo_ansible, 
name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=)
Feb 01 08:20:41 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:20:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:20:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:20:42 np0005604215.localdomain systemd[1]: tmp-crun.9Aph3o.mount: Deactivated successfully.
Feb 01 08:20:42 np0005604215.localdomain podman[81370]: 2026-02-01 08:20:42.862535234 +0000 UTC m=+0.079005535 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-type=git, io.openshift.expose-services=)
Feb 01 08:20:42 np0005604215.localdomain podman[81369]: 2026-02-01 08:20:42.916912561 +0000 UTC m=+0.131193494 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1)
Feb 01 08:20:42 np0005604215.localdomain podman[81370]: 2026-02-01 08:20:42.93612093 +0000 UTC m=+0.152591241 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:20:42 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:20:42 np0005604215.localdomain podman[81369]: 2026-02-01 08:20:42.960931174 +0000 UTC m=+0.175212087 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent)
Feb 01 08:20:42 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:20:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:20:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 4946 writes, 22K keys, 4946 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4946 writes, 558 syncs, 8.86 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 08:20:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:20:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 2400.1 total, 600.0 interval
                                                          Cumulative writes: 4734 writes, 21K keys, 4734 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 4734 writes, 481 syncs, 9.84 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 08:20:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:20:55 np0005604215.localdomain podman[81417]: 2026-02-01 08:20:55.863622929 +0000 UTC m=+0.076558890 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, container_name=metrics_qdr, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, 
architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:20:56 np0005604215.localdomain podman[81417]: 2026-02-01 08:20:56.066605072 +0000 UTC m=+0.279541063 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.13, release=1766032510)
Feb 01 08:20:56 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:20:57 np0005604215.localdomain sudo[81446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:20:57 np0005604215.localdomain sudo[81446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:20:57 np0005604215.localdomain sudo[81446]: pam_unix(sudo:session): session closed for user root
Feb 01 08:20:57 np0005604215.localdomain sudo[81461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:20:57 np0005604215.localdomain sudo[81461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:20:58 np0005604215.localdomain sudo[81461]: pam_unix(sudo:session): session closed for user root
Feb 01 08:20:58 np0005604215.localdomain sudo[81521]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqzdmyqkvfepkkozjwvuhvbltzothsja ; /usr/bin/python3
Feb 01 08:20:58 np0005604215.localdomain sudo[81521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 08:20:58 np0005604215.localdomain python3[81523]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhel-9-for-x86_64-baseos-eus-rpms --disable rhel-9-for-x86_64-appstream-eus-rpms --disable rhel-9-for-x86_64-highavailability-eus-rpms --disable openstack-17.1-for-rhel-9-x86_64-rpms --disable fast-datapath-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 08:20:59 np0005604215.localdomain sudo[81526]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:20:59 np0005604215.localdomain sudo[81526]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:20:59 np0005604215.localdomain sudo[81526]: pam_unix(sudo:session): session closed for user root
Feb 01 08:21:01 np0005604215.localdomain anacron[18977]: Job `cron.monthly' started
Feb 01 08:21:01 np0005604215.localdomain anacron[18977]: Job `cron.monthly' terminated
Feb 01 08:21:01 np0005604215.localdomain anacron[18977]: Normal exit (3 jobs run)
Feb 01 08:21:02 np0005604215.localdomain rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 08:21:02 np0005604215.localdomain rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 08:21:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:21:05 np0005604215.localdomain podman[81730]: 2026-02-01 08:21:05.87096577 +0000 UTC m=+0.086458867 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, distribution-scope=public, vcs-type=git, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:21:05 np0005604215.localdomain podman[81730]: 2026-02-01 08:21:05.88154223 +0000 UTC m=+0.097035247 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, 
architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true)
Feb 01 08:21:05 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:21:06 np0005604215.localdomain sudo[81521]: pam_unix(sudo:session): session closed for user root
Feb 01 08:21:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:21:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:21:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:21:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:21:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:21:08 np0005604215.localdomain systemd[1]: tmp-crun.M5ecwZ.mount: Deactivated successfully.
Feb 01 08:21:08 np0005604215.localdomain podman[81750]: 2026-02-01 08:21:08.887481275 +0000 UTC m=+0.098725031 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-cron-container, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git)
Feb 01 08:21:08 np0005604215.localdomain podman[81750]: 2026-02-01 08:21:08.9257633 +0000 UTC m=+0.137007106 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, architecture=x86_64, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.13, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, io.buildah.version=1.41.5)
Feb 01 08:21:08 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:21:08 np0005604215.localdomain podman[81752]: 2026-02-01 08:21:08.94179929 +0000 UTC m=+0.148091371 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, distribution-scope=public, release=1766032510)
Feb 01 08:21:08 np0005604215.localdomain podman[81752]: 2026-02-01 08:21:08.982676925 +0000 UTC m=+0.188968976 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, com.redhat.component=openstack-iscsid-container, release=1766032510, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible)
Feb 01 08:21:08 np0005604215.localdomain kernel: DROPPING: IN=eth0 OUT= MACSRC=fa:61:25:a2:5a:71 MACDST=fa:16:3e:d0:c8:c4 MACPROTO=0800 SRC=82.147.84.55 DST=38.102.83.164 LEN=40 TOS=0x08 PREC=0x20 TTL=242 ID=61520 PROTO=TCP SPT=53998 DPT=9090 SEQ=1487044726 ACK=0 WINDOW=1024 RES=0x00 SYN URGP=0 
Feb 01 08:21:08 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:21:09 np0005604215.localdomain podman[81753]: 2026-02-01 08:21:08.999394537 +0000 UTC m=+0.205723669 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, container_name=ceilometer_agent_compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1)
Feb 01 08:21:09 np0005604215.localdomain podman[81751]: 2026-02-01 08:21:09.04500908 +0000 UTC m=+0.253294734 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:21:09 np0005604215.localdomain podman[81753]: 2026-02-01 08:21:09.058711247 +0000 UTC m=+0.265040389 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, release=1766032510, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.buildah.version=1.41.5, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true)
Feb 01 08:21:09 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:21:09 np0005604215.localdomain podman[81751]: 2026-02-01 08:21:09.076664378 +0000 UTC m=+0.284950052 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.5, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, version=17.1.13)
Feb 01 08:21:09 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:21:09 np0005604215.localdomain podman[81759]: 2026-02-01 08:21:09.144317639 +0000 UTC m=+0.342630231 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible)
Feb 01 08:21:09 np0005604215.localdomain podman[81759]: 2026-02-01 08:21:09.200738699 +0000 UTC m=+0.399051291 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 01 08:21:09 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:21:09 np0005604215.localdomain systemd[1]: tmp-crun.4MlpO6.mount: Deactivated successfully.
Feb 01 08:21:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:21:11 np0005604215.localdomain podman[81866]: 2026-02-01 08:21:11.859800729 +0000 UTC m=+0.073879585 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, config_id=tripleo_step4, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:21:12 np0005604215.localdomain podman[81866]: 2026-02-01 08:21:12.214583029 +0000 UTC m=+0.428661965 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, release=1766032510, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, container_name=nova_migration_target, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Feb 01 08:21:12 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:21:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:21:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:21:13 np0005604215.localdomain systemd[1]: tmp-crun.MWGvA9.mount: Deactivated successfully.
Feb 01 08:21:13 np0005604215.localdomain podman[81890]: 2026-02-01 08:21:13.883495647 +0000 UTC m=+0.090246447 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:21:13 np0005604215.localdomain systemd[1]: tmp-crun.nYPi31.mount: Deactivated successfully.
Feb 01 08:21:13 np0005604215.localdomain podman[81891]: 2026-02-01 08:21:13.941181667 +0000 UTC m=+0.144277552 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13)
Feb 01 08:21:13 np0005604215.localdomain podman[81890]: 2026-02-01 08:21:13.967647143 +0000 UTC m=+0.174397953 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, version=17.1.13, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team)
Feb 01 08:21:13 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:21:13 np0005604215.localdomain podman[81891]: 2026-02-01 08:21:13.992737196 +0000 UTC m=+0.195833131 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, version=17.1.13, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510)
Feb 01 08:21:14 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:21:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:21:26 np0005604215.localdomain podman[81934]: 2026-02-01 08:21:26.861060818 +0000 UTC m=+0.079639756 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:21:27 np0005604215.localdomain podman[81934]: 2026-02-01 08:21:27.058082115 +0000 UTC m=+0.276661073 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, architecture=x86_64, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, version=17.1.13, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:21:27 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:21:30 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:21:30 np0005604215.localdomain recover_tripleo_nova_virtqemud[81964]: 62016
Feb 01 08:21:30 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:21:30 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:21:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:21:36 np0005604215.localdomain systemd[1]: tmp-crun.v7MfNX.mount: Deactivated successfully.
Feb 01 08:21:36 np0005604215.localdomain podman[82010]: 2026-02-01 08:21:36.873313903 +0000 UTC m=+0.092257549 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, container_name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public)
Feb 01 08:21:36 np0005604215.localdomain podman[82010]: 2026-02-01 08:21:36.890136668 +0000 UTC m=+0.109080324 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, vcs-type=git, distribution-scope=public, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:21:36 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:21:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:21:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:21:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:21:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:21:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:21:39 np0005604215.localdomain podman[82031]: 2026-02-01 08:21:39.887418271 +0000 UTC m=+0.100268529 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, release=1766032510, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:21:39 np0005604215.localdomain podman[82039]: 2026-02-01 08:21:39.940440956 +0000 UTC m=+0.144670525 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510)
Feb 01 08:21:39 np0005604215.localdomain podman[82033]: 2026-02-01 08:21:39.985431009 +0000 UTC m=+0.192958101 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, config_id=tripleo_step4, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 01 08:21:39 np0005604215.localdomain podman[82032]: 2026-02-01 08:21:39.992720267 +0000 UTC m=+0.203118369 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, container_name=iscsid, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, url=https://www.redhat.com, managed_by=tripleo_ansible)
Feb 01 08:21:39 np0005604215.localdomain podman[82031]: 2026-02-01 08:21:39.999039114 +0000 UTC m=+0.211889412 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:21:40 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:21:40 np0005604215.localdomain podman[82039]: 2026-02-01 08:21:40.017099597 +0000 UTC m=+0.221329196 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, 
config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 01 08:21:40 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:21:40 np0005604215.localdomain podman[82033]: 2026-02-01 08:21:40.041202729 +0000 UTC m=+0.248729861 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, batch=17.1_20260112.1)
Feb 01 08:21:40 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:21:40 np0005604215.localdomain podman[82032]: 2026-02-01 08:21:40.055109773 +0000 UTC m=+0.265507855 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, config_id=tripleo_step3, vcs-type=git, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:21:40 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:21:40 np0005604215.localdomain podman[82030]: 2026-02-01 08:21:39.9213716 +0000 UTC m=+0.137732068 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, release=1766032510, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, architecture=x86_64)
Feb 01 08:21:40 np0005604215.localdomain podman[82030]: 2026-02-01 08:21:40.101733208 +0000 UTC m=+0.318093676 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack 
Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 01 08:21:40 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:21:40 np0005604215.localdomain systemd[1]: tmp-crun.r3jBll.mount: Deactivated successfully.
Feb 01 08:21:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:21:42 np0005604215.localdomain podman[82148]: 2026-02-01 08:21:42.861432229 +0000 UTC m=+0.076721965 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=nova_migration_target, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:21:43 np0005604215.localdomain podman[82148]: 2026-02-01 08:21:43.228648866 +0000 UTC m=+0.443938612 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510)
Feb 01 08:21:43 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:21:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:21:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:21:44 np0005604215.localdomain systemd[1]: tmp-crun.tZc4a0.mount: Deactivated successfully.
Feb 01 08:21:44 np0005604215.localdomain podman[82170]: 2026-02-01 08:21:44.879489231 +0000 UTC m=+0.084534608 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, tcib_managed=true, io.buildah.version=1.41.5, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:21:44 np0005604215.localdomain podman[82171]: 2026-02-01 08:21:44.858958701 +0000 UTC m=+0.065221457 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, 
url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team)
Feb 01 08:21:44 np0005604215.localdomain podman[82171]: 2026-02-01 08:21:44.941704492 +0000 UTC m=+0.147967218 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, 
com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:21:44 np0005604215.localdomain podman[82170]: 2026-02-01 08:21:44.952426426 +0000 UTC m=+0.157471853 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, container_name=ovn_metadata_agent, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:21:44 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:21:44 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:21:51 np0005604215.localdomain sudo[82231]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrqjgkexzhvarahlvcaizmkyjhopcvbw ; /usr/bin/python3
Feb 01 08:21:51 np0005604215.localdomain sudo[82231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 08:21:51 np0005604215.localdomain python3[82233]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager repos --disable rhceph-7-tools-for-rhel-9-x86_64-rpms _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 08:21:54 np0005604215.localdomain rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 08:21:55 np0005604215.localdomain rhsm-service[6583]: WARNING [subscription_manager.cert_sorter:194] Installed product 479 not present in response from server.
Feb 01 08:21:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:21:57 np0005604215.localdomain systemd[1]: tmp-crun.XOo4jW.mount: Deactivated successfully.
Feb 01 08:21:57 np0005604215.localdomain podman[82364]: 2026-02-01 08:21:57.877177607 +0000 UTC m=+0.092662901 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=metrics_qdr, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 
17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, architecture=x86_64)
Feb 01 08:21:58 np0005604215.localdomain podman[82364]: 2026-02-01 08:21:58.071395147 +0000 UTC m=+0.286880381 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, architecture=x86_64, config_id=tripleo_step1, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:21:58 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:21:58 np0005604215.localdomain sudo[82231]: pam_unix(sudo:session): session closed for user root
Feb 01 08:21:59 np0005604215.localdomain sudo[82449]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:21:59 np0005604215.localdomain sudo[82449]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:21:59 np0005604215.localdomain sudo[82449]: pam_unix(sudo:session): session closed for user root
Feb 01 08:21:59 np0005604215.localdomain sudo[82464]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 08:21:59 np0005604215.localdomain sudo[82464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:22:00 np0005604215.localdomain sudo[82464]: pam_unix(sudo:session): session closed for user root
Feb 01 08:22:00 np0005604215.localdomain sudo[82499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:22:00 np0005604215.localdomain sudo[82499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:22:00 np0005604215.localdomain sudo[82499]: pam_unix(sudo:session): session closed for user root
Feb 01 08:22:00 np0005604215.localdomain sudo[82514]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:22:00 np0005604215.localdomain sudo[82514]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:22:00 np0005604215.localdomain sudo[82514]: pam_unix(sudo:session): session closed for user root
Feb 01 08:22:01 np0005604215.localdomain sudo[82561]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:22:01 np0005604215.localdomain sudo[82561]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:22:01 np0005604215.localdomain sudo[82561]: pam_unix(sudo:session): session closed for user root
Feb 01 08:22:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:22:07 np0005604215.localdomain systemd[1]: tmp-crun.YSS0Ws.mount: Deactivated successfully.
Feb 01 08:22:07 np0005604215.localdomain podman[82576]: 2026-02-01 08:22:07.864604559 +0000 UTC m=+0.080359679 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:22:07 np0005604215.localdomain podman[82576]: 2026-02-01 08:22:07.879607937 +0000 UTC m=+0.095363037 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step3, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:22:07 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:22:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:22:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:22:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:22:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:22:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:22:10 np0005604215.localdomain systemd[1]: tmp-crun.Bi5pbu.mount: Deactivated successfully.
Feb 01 08:22:10 np0005604215.localdomain podman[82596]: 2026-02-01 08:22:10.91758847 +0000 UTC m=+0.132810915 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, name=rhosp-rhel9/openstack-cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team)
Feb 01 08:22:10 np0005604215.localdomain podman[82596]: 2026-02-01 08:22:10.962476261 +0000 UTC m=+0.177698676 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13)
Feb 01 08:22:10 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:22:10 np0005604215.localdomain podman[82598]: 2026-02-01 08:22:10.977649464 +0000 UTC m=+0.184734785 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.buildah.version=1.41.5, vcs-type=git, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:22:10 np0005604215.localdomain podman[82598]: 2026-02-01 08:22:10.988831722 +0000 UTC m=+0.195917073 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, 
build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, maintainer=OpenStack TripleO Team)
Feb 01 08:22:11 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:22:11 np0005604215.localdomain podman[82597]: 2026-02-01 08:22:10.969734257 +0000 UTC m=+0.180344838 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, architecture=x86_64, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': 
True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1)
Feb 01 08:22:11 np0005604215.localdomain podman[82599]: 2026-02-01 08:22:11.035518379 +0000 UTC m=+0.239719570 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=)
Feb 01 08:22:11 np0005604215.localdomain podman[82605]: 2026-02-01 08:22:10.94099294 +0000 UTC m=+0.142381543 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step4, vcs-type=git, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:22:11 np0005604215.localdomain podman[82597]: 2026-02-01 08:22:11.050735934 +0000 UTC m=+0.261346515 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:22:11 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:22:11 np0005604215.localdomain podman[82605]: 2026-02-01 08:22:11.073585476 +0000 UTC m=+0.274974049 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1766032510, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:22:11 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:22:11 np0005604215.localdomain podman[82599]: 2026-02-01 08:22:11.09067618 +0000 UTC m=+0.294877311 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.buildah.version=1.41.5)
Feb 01 08:22:11 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:22:11 np0005604215.localdomain systemd[1]: tmp-crun.h9NXoB.mount: Deactivated successfully.
Feb 01 08:22:11 np0005604215.localdomain python3[82723]: ansible-ansible.builtin.slurp Invoked with path=/home/zuul/ansible_hostname src=/home/zuul/ansible_hostname
Feb 01 08:22:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:22:13 np0005604215.localdomain systemd[1]: tmp-crun.XJN8YL.mount: Deactivated successfully.
Feb 01 08:22:13 np0005604215.localdomain podman[82724]: 2026-02-01 08:22:13.869901661 +0000 UTC m=+0.086250732 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Feb 01 08:22:14 np0005604215.localdomain podman[82724]: 2026-02-01 08:22:14.233797934 +0000 UTC m=+0.450147015 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, container_name=nova_migration_target, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, tcib_managed=true)
Feb 01 08:22:14 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:22:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:22:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:22:15 np0005604215.localdomain systemd[1]: tmp-crun.WUaCIb.mount: Deactivated successfully.
Feb 01 08:22:15 np0005604215.localdomain podman[82745]: 2026-02-01 08:22:15.863355604 +0000 UTC m=+0.082631408 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z)
Feb 01 08:22:15 np0005604215.localdomain systemd[1]: tmp-crun.T2ByYo.mount: Deactivated successfully.
Feb 01 08:22:15 np0005604215.localdomain podman[82745]: 2026-02-01 08:22:15.911557208 +0000 UTC m=+0.130833022 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:22:15 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:22:15 np0005604215.localdomain podman[82746]: 2026-02-01 08:22:15.918496475 +0000 UTC m=+0.134048523 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, url=https://www.redhat.com, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step4, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13)
Feb 01 08:22:16 np0005604215.localdomain podman[82746]: 2026-02-01 08:22:16.007776451 +0000 UTC m=+0.223328459 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:22:16 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:22:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:22:28 np0005604215.localdomain systemd[1]: tmp-crun.tz0iIv.mount: Deactivated successfully.
Feb 01 08:22:28 np0005604215.localdomain podman[82793]: 2026-02-01 08:22:28.874671318 +0000 UTC m=+0.089819894 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, distribution-scope=public, batch=17.1_20260112.1, container_name=metrics_qdr, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 01 08:22:29 np0005604215.localdomain podman[82793]: 2026-02-01 08:22:29.066693018 +0000 UTC m=+0.281841584 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step1, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd)
Feb 01 08:22:29 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:22:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:22:38 np0005604215.localdomain podman[82867]: 2026-02-01 08:22:38.86916832 +0000 UTC m=+0.084118865 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=collectd, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, tcib_managed=true, vcs-type=git, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 01 08:22:38 np0005604215.localdomain podman[82867]: 2026-02-01 08:22:38.88162055 +0000 UTC m=+0.096571075 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, container_name=collectd, build-date=2026-01-12T22:10:15Z, version=17.1.13, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:22:38 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:22:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:22:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:22:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:22:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:22:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:22:41 np0005604215.localdomain podman[82890]: 2026-02-01 08:22:41.881466546 +0000 UTC m=+0.092720400 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, distribution-scope=public, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:22:41 np0005604215.localdomain podman[82890]: 2026-02-01 08:22:41.918761902 +0000 UTC m=+0.130015716 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:22:41 np0005604215.localdomain podman[82894]: 2026-02-01 08:22:41.934599912 +0000 UTC m=+0.137389315 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 01 08:22:41 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:22:41 np0005604215.localdomain podman[82894]: 2026-02-01 08:22:41.970865247 +0000 UTC m=+0.173654670 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:22:41 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:22:42 np0005604215.localdomain podman[82891]: 2026-02-01 08:22:42.050417935 +0000 UTC m=+0.255351022 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 
17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, vcs-type=git, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, distribution-scope=public, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z)
Feb 01 08:22:42 np0005604215.localdomain podman[82888]: 2026-02-01 08:22:42.017988883 +0000 UTC m=+0.230760242 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red 
Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 01 08:22:42 np0005604215.localdomain podman[82889]: 2026-02-01 08:22:42.084878527 +0000 UTC m=+0.296305796 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z)
Feb 01 08:22:42 np0005604215.localdomain podman[82891]: 2026-02-01 08:22:42.08770005 +0000 UTC m=+0.292633187 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 01 08:22:42 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:22:42 np0005604215.localdomain podman[82888]: 2026-02-01 08:22:42.103944072 +0000 UTC m=+0.316715361 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 01 08:22:42 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:22:42 np0005604215.localdomain podman[82889]: 2026-02-01 08:22:42.120606676 +0000 UTC m=+0.332033935 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.13, url=https://www.redhat.com, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:22:42 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:22:42 np0005604215.localdomain systemd[1]: tmp-crun.FlcBUP.mount: Deactivated successfully.
Feb 01 08:22:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:22:44 np0005604215.localdomain podman[83005]: 2026-02-01 08:22:44.858611661 +0000 UTC m=+0.074911003 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:22:45 np0005604215.localdomain podman[83005]: 2026-02-01 08:22:45.23365067 +0000 UTC m=+0.449950042 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z)
Feb 01 08:22:45 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:22:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:22:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:22:46 np0005604215.localdomain podman[83028]: 2026-02-01 08:22:46.863552477 +0000 UTC m=+0.074432927 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, vcs-type=git, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:22:46 np0005604215.localdomain podman[83028]: 2026-02-01 08:22:46.902748339 +0000 UTC m=+0.113628779 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, container_name=ovn_metadata_agent, release=1766032510, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 01 08:22:46 np0005604215.localdomain systemd[1]: tmp-crun.y4DxLW.mount: Deactivated successfully.
Feb 01 08:22:46 np0005604215.localdomain podman[83029]: 2026-02-01 08:22:46.922235157 +0000 UTC m=+0.129534601 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, 
name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, batch=17.1_20260112.1, release=1766032510)
Feb 01 08:22:46 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:22:46 np0005604215.localdomain podman[83029]: 2026-02-01 08:22:46.946068454 +0000 UTC m=+0.153367918 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, version=17.1.13, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, container_name=ovn_controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:22:46 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:22:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:22:59 np0005604215.localdomain systemd[1]: tmp-crun.xwY61N.mount: Deactivated successfully.
Feb 01 08:22:59 np0005604215.localdomain podman[83075]: 2026-02-01 08:22:59.866264064 +0000 UTC m=+0.087747823 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, 
konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:23:00 np0005604215.localdomain podman[83075]: 2026-02-01 08:23:00.098382536 +0000 UTC m=+0.319866235 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510)
Feb 01 08:23:00 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:23:01 np0005604215.localdomain sudo[83104]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:23:01 np0005604215.localdomain sudo[83104]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:23:01 np0005604215.localdomain sudo[83104]: pam_unix(sudo:session): session closed for user root
Feb 01 08:23:01 np0005604215.localdomain sudo[83119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:23:01 np0005604215.localdomain sudo[83119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:23:02 np0005604215.localdomain sudo[83119]: pam_unix(sudo:session): session closed for user root
Feb 01 08:23:03 np0005604215.localdomain sudo[83166]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:23:03 np0005604215.localdomain sudo[83166]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:23:03 np0005604215.localdomain sudo[83166]: pam_unix(sudo:session): session closed for user root
Feb 01 08:23:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:23:09 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:23:09 np0005604215.localdomain recover_tripleo_nova_virtqemud[83183]: 62016
Feb 01 08:23:09 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:23:09 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:23:09 np0005604215.localdomain podman[83181]: 2026-02-01 08:23:09.875308926 +0000 UTC m=+0.089126324 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:23:09 np0005604215.localdomain podman[83181]: 2026-02-01 08:23:09.890271169 +0000 UTC m=+0.104088567 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, release=1766032510, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, container_name=collectd, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git)
Feb 01 08:23:09 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:23:11 np0005604215.localdomain sshd[80640]: Received disconnect from 38.102.83.114 port 39142:11: disconnected by user
Feb 01 08:23:11 np0005604215.localdomain sshd[80640]: Disconnected from user zuul 38.102.83.114 port 39142
Feb 01 08:23:11 np0005604215.localdomain sshd[80637]: pam_unix(sshd:session): session closed for user zuul
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: session-34.scope: Deactivated successfully.
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: session-34.scope: Consumed 18.941s CPU time.
Feb 01 08:23:12 np0005604215.localdomain systemd-logind[761]: Session 34 logged out. Waiting for processes to exit.
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:23:12 np0005604215.localdomain systemd-logind[761]: Removed session 34.
Feb 01 08:23:12 np0005604215.localdomain podman[83205]: 2026-02-01 08:23:12.110439227 +0000 UTC m=+0.083267939 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, architecture=x86_64, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc.)
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:23:12 np0005604215.localdomain podman[83205]: 2026-02-01 08:23:12.167468778 +0000 UTC m=+0.140297530 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, config_id=tripleo_step4)
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: tmp-crun.GyaFgG.mount: Deactivated successfully.
Feb 01 08:23:12 np0005604215.localdomain podman[83204]: 2026-02-01 08:23:12.178751963 +0000 UTC m=+0.152956337 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public)
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:23:12 np0005604215.localdomain podman[83204]: 2026-02-01 08:23:12.217617785 +0000 UTC m=+0.191822169 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=iscsid, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:23:12 np0005604215.localdomain podman[83234]: 2026-02-01 08:23:12.231162017 +0000 UTC m=+0.101643375 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 01 08:23:12 np0005604215.localdomain podman[83263]: 2026-02-01 08:23:12.266563206 +0000 UTC m=+0.084700442 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, container_name=nova_compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 01 08:23:12 np0005604215.localdomain podman[83235]: 2026-02-01 08:23:12.278239062 +0000 UTC m=+0.143479455 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, version=17.1.13, 
tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1)
Feb 01 08:23:12 np0005604215.localdomain podman[83235]: 2026-02-01 08:23:12.288710763 +0000 UTC m=+0.153951156 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:23:12 np0005604215.localdomain podman[83234]: 2026-02-01 08:23:12.330887574 +0000 UTC m=+0.201368952 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, build-date=2026-01-12T23:07:47Z, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, config_id=tripleo_step4)
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:23:12 np0005604215.localdomain podman[83263]: 2026-02-01 08:23:12.343622132 +0000 UTC m=+0.161759398 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step5, io.openshift.expose-services=, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:23:12 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:23:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:23:15 np0005604215.localdomain podman[83314]: 2026-02-01 08:23:15.865772424 +0000 UTC m=+0.082542017 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5)
Feb 01 08:23:16 np0005604215.localdomain podman[83314]: 2026-02-01 08:23:16.223668196 +0000 UTC m=+0.440437789 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, architecture=x86_64, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:23:16 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:23:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:23:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:23:17 np0005604215.localdomain podman[83339]: 2026-02-01 08:23:17.856219424 +0000 UTC m=+0.067218294 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 01 08:23:17 np0005604215.localdomain podman[83339]: 2026-02-01 08:23:17.907424122 +0000 UTC m=+0.118423032 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, build-date=2026-01-12T22:36:40Z, vcs-type=git, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:23:17 np0005604215.localdomain systemd[1]: tmp-crun.XZgtpW.mount: Deactivated successfully.
Feb 01 08:23:17 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:23:17 np0005604215.localdomain podman[83338]: 2026-02-01 08:23:17.925052894 +0000 UTC m=+0.135165669 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, architecture=x86_64, distribution-scope=public, container_name=ovn_metadata_agent, release=1766032510)
Feb 01 08:23:17 np0005604215.localdomain podman[83338]: 2026-02-01 08:23:17.970301336 +0000 UTC m=+0.180414101 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vcs-type=git, container_name=ovn_metadata_agent, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Feb 01 08:23:17 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:23:20 np0005604215.localdomain sshd[83386]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 08:23:21 np0005604215.localdomain sshd[83387]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 08:23:21 np0005604215.localdomain sshd[83387]: error: kex_exchange_identification: read: Connection reset by peer
Feb 01 08:23:21 np0005604215.localdomain sshd[83387]: Connection reset by 176.120.22.52 port 49013
Feb 01 08:23:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:23:30 np0005604215.localdomain podman[83388]: 2026-02-01 08:23:30.875930883 +0000 UTC m=+0.090784112 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, distribution-scope=public, batch=17.1_20260112.1, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:23:31 np0005604215.localdomain podman[83388]: 2026-02-01 08:23:31.069272396 +0000 UTC m=+0.284125665 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, version=17.1.13, tcib_managed=true, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:23:31 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:23:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:23:40 np0005604215.localdomain systemd[1]: tmp-crun.tqlSXz.mount: Deactivated successfully.
Feb 01 08:23:40 np0005604215.localdomain podman[83463]: 2026-02-01 08:23:40.866191687 +0000 UTC m=+0.085338981 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd)
Feb 01 08:23:40 np0005604215.localdomain podman[83463]: 2026-02-01 08:23:40.905692538 +0000 UTC m=+0.124839812 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64)
Feb 01 08:23:40 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:23:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:23:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:23:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:23:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:23:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:23:42 np0005604215.localdomain podman[83483]: 2026-02-01 08:23:42.880893013 +0000 UTC m=+0.087921237 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.expose-services=)
Feb 01 08:23:42 np0005604215.localdomain podman[83483]: 2026-02-01 08:23:42.889410046 +0000 UTC m=+0.096438280 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, tcib_managed=true, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, vcs-type=git)
Feb 01 08:23:42 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:23:42 np0005604215.localdomain systemd[1]: tmp-crun.j02HBJ.mount: Deactivated successfully.
Feb 01 08:23:42 np0005604215.localdomain podman[83493]: 2026-02-01 08:23:42.998124579 +0000 UTC m=+0.198086903 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team)
Feb 01 08:23:43 np0005604215.localdomain podman[83484]: 2026-02-01 08:23:42.952332812 +0000 UTC m=+0.153895824 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, url=https://www.redhat.com)
Feb 01 08:23:43 np0005604215.localdomain podman[83484]: 2026-02-01 08:23:43.037866758 +0000 UTC m=+0.239429830 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:23:43 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:23:43 np0005604215.localdomain podman[83493]: 2026-02-01 08:23:43.059589932 +0000 UTC m=+0.259552256 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible)
Feb 01 08:23:43 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:23:43 np0005604215.localdomain podman[83482]: 2026-02-01 08:23:43.039225168 +0000 UTC m=+0.247861060 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step5, tcib_managed=true)
Feb 01 08:23:43 np0005604215.localdomain podman[83481]: 2026-02-01 08:23:43.145281453 +0000 UTC m=+0.358384507 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4)
Feb 01 08:23:43 np0005604215.localdomain podman[83481]: 2026-02-01 08:23:43.158579938 +0000 UTC m=+0.371683042 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team)
Feb 01 08:23:43 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:23:43 np0005604215.localdomain podman[83482]: 2026-02-01 08:23:43.172074217 +0000 UTC m=+0.380710129 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, container_name=nova_compute, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 01 08:23:43 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:23:43 np0005604215.localdomain systemd[1]: tmp-crun.P0WUTL.mount: Deactivated successfully.
Feb 01 08:23:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:23:46 np0005604215.localdomain systemd[1]: tmp-crun.N8lusH.mount: Deactivated successfully.
Feb 01 08:23:46 np0005604215.localdomain podman[83595]: 2026-02-01 08:23:46.872705262 +0000 UTC m=+0.088269368 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, version=17.1.13, com.redhat.component=openstack-nova-compute-container, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, 
vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1)
Feb 01 08:23:47 np0005604215.localdomain podman[83595]: 2026-02-01 08:23:47.24193267 +0000 UTC m=+0.457496776 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, io.openshift.expose-services=, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:23:47 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:23:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:23:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:23:48 np0005604215.localdomain podman[83616]: 2026-02-01 08:23:48.860912864 +0000 UTC m=+0.080408006 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:23:48 np0005604215.localdomain podman[83617]: 2026-02-01 08:23:48.917157431 +0000 UTC m=+0.132631053 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, build-date=2026-01-12T22:36:40Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:23:48 np0005604215.localdomain podman[83616]: 2026-02-01 08:23:48.931680322 +0000 UTC m=+0.151175414 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible)
Feb 01 08:23:48 np0005604215.localdomain podman[83617]: 2026-02-01 08:23:48.943979727 +0000 UTC m=+0.159453359 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 
ovn-controller, tcib_managed=true, distribution-scope=public, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 01 08:23:48 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:23:48 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:24:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:24:01 np0005604215.localdomain podman[83664]: 2026-02-01 08:24:01.871468164 +0000 UTC m=+0.086873677 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.openshift.expose-services=, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible)
Feb 01 08:24:02 np0005604215.localdomain podman[83664]: 2026-02-01 08:24:02.073201596 +0000 UTC m=+0.288607059 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, managed_by=tripleo_ansible, 
com.redhat.component=openstack-qdrouterd-container, release=1766032510, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true)
Feb 01 08:24:02 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:24:03 np0005604215.localdomain sudo[83694]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:24:03 np0005604215.localdomain sudo[83694]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:24:03 np0005604215.localdomain sudo[83694]: pam_unix(sudo:session): session closed for user root
Feb 01 08:24:03 np0005604215.localdomain sudo[83709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 08:24:03 np0005604215.localdomain sudo[83709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:24:04 np0005604215.localdomain podman[83797]: 2026-02-01 08:24:04.190124623 +0000 UTC m=+0.079131797 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, ceph=True, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=1764794109, distribution-scope=public, architecture=x86_64, RELEASE=main, name=rhceph)
Feb 01 08:24:04 np0005604215.localdomain podman[83797]: 2026-02-01 08:24:04.290909632 +0000 UTC m=+0.179916806 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, release=1764794109, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, version=7, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main)
Feb 01 08:24:04 np0005604215.localdomain sudo[83709]: pam_unix(sudo:session): session closed for user root
Feb 01 08:24:04 np0005604215.localdomain sudo[83866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:24:04 np0005604215.localdomain sudo[83866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:24:04 np0005604215.localdomain sudo[83866]: pam_unix(sudo:session): session closed for user root
Feb 01 08:24:04 np0005604215.localdomain sudo[83881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:24:04 np0005604215.localdomain sudo[83881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:24:05 np0005604215.localdomain sudo[83881]: pam_unix(sudo:session): session closed for user root
Feb 01 08:24:06 np0005604215.localdomain sudo[83928]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:24:06 np0005604215.localdomain sudo[83928]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:24:06 np0005604215.localdomain sudo[83928]: pam_unix(sudo:session): session closed for user root
Feb 01 08:24:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:24:11 np0005604215.localdomain podman[83943]: 2026-02-01 08:24:11.858467662 +0000 UTC m=+0.073817729 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, build-date=2026-01-12T22:10:15Z, tcib_managed=true, version=17.1.13, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.13 
17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container)
Feb 01 08:24:11 np0005604215.localdomain podman[83943]: 2026-02-01 08:24:11.892880043 +0000 UTC m=+0.108230090 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:24:11 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:24:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:24:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:24:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:24:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:24:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:24:13 np0005604215.localdomain systemd[1]: tmp-crun.z3K9As.mount: Deactivated successfully.
Feb 01 08:24:13 np0005604215.localdomain podman[83964]: 2026-02-01 08:24:13.894765469 +0000 UTC m=+0.102963583 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 
'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:24:13 np0005604215.localdomain podman[83964]: 2026-02-01 08:24:13.923599865 +0000 UTC m=+0.131797979 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, container_name=nova_compute, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:24:13 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:24:13 np0005604215.localdomain systemd[1]: tmp-crun.Jqc0d0.mount: Deactivated successfully.
Feb 01 08:24:13 np0005604215.localdomain podman[83963]: 2026-02-01 08:24:13.951485462 +0000 UTC m=+0.162858611 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond)
Feb 01 08:24:13 np0005604215.localdomain podman[83963]: 2026-02-01 08:24:13.991751525 +0000 UTC m=+0.203124664 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, distribution-scope=public)
Feb 01 08:24:14 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:24:14 np0005604215.localdomain podman[83965]: 2026-02-01 08:24:13.9949446 +0000 UTC m=+0.198010143 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack 
Platform 17.1 iscsid, vcs-type=git, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 01 08:24:14 np0005604215.localdomain podman[83977]: 2026-02-01 08:24:14.053519427 +0000 UTC m=+0.247781108 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, config_id=tripleo_step4, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:24:14 np0005604215.localdomain podman[83965]: 2026-02-01 08:24:14.079721803 +0000 UTC m=+0.282787286 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, 
container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:24:14 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:24:14 np0005604215.localdomain podman[83977]: 2026-02-01 08:24:14.13154964 +0000 UTC m=+0.325811281 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 01 08:24:14 np0005604215.localdomain podman[83971]: 2026-02-01 08:24:14.154241533 +0000 UTC m=+0.352961996 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, build-date=2026-01-12T23:07:47Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:24:14 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:24:14 np0005604215.localdomain podman[83971]: 2026-02-01 08:24:14.211901413 +0000 UTC m=+0.410621936 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, 
tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, distribution-scope=public)
Feb 01 08:24:14 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:24:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:24:17 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:24:17 np0005604215.localdomain recover_tripleo_nova_virtqemud[84090]: 62016
Feb 01 08:24:17 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:24:17 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:24:17 np0005604215.localdomain podman[84082]: 2026-02-01 08:24:17.876499861 +0000 UTC m=+0.085472796 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:24:18 np0005604215.localdomain podman[84082]: 2026-02-01 08:24:18.285740965 +0000 UTC m=+0.494713860 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_migration_target, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, 
batch=17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, distribution-scope=public)
Feb 01 08:24:18 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:24:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:24:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:24:19 np0005604215.localdomain podman[84108]: 2026-02-01 08:24:19.875374409 +0000 UTC m=+0.089877226 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent)
Feb 01 08:24:19 np0005604215.localdomain podman[84109]: 2026-02-01 08:24:19.92906729 +0000 UTC m=+0.141057253 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, distribution-scope=public, vcs-type=git, architecture=x86_64)
Feb 01 08:24:19 np0005604215.localdomain podman[84108]: 2026-02-01 08:24:19.947735744 +0000 UTC m=+0.162238531 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.expose-services=)
Feb 01 08:24:19 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:24:20 np0005604215.localdomain podman[84109]: 2026-02-01 08:24:20.000662474 +0000 UTC m=+0.212652487 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, version=17.1.13, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5)
Feb 01 08:24:20 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:24:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:24:32 np0005604215.localdomain systemd[1]: tmp-crun.tKb2jn.mount: Deactivated successfully.
Feb 01 08:24:32 np0005604215.localdomain podman[84155]: 2026-02-01 08:24:32.883017711 +0000 UTC m=+0.092838894 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, release=1766032510, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-qdrouterd, version=17.1.13, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-type=git, container_name=metrics_qdr, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:24:33 np0005604215.localdomain podman[84155]: 2026-02-01 08:24:33.131726805 +0000 UTC m=+0.341548008 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 01 08:24:33 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:24:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:24:42 np0005604215.localdomain systemd[1]: tmp-crun.G4IO2Y.mount: Deactivated successfully.
Feb 01 08:24:42 np0005604215.localdomain podman[84231]: 2026-02-01 08:24:42.876342737 +0000 UTC m=+0.091424752 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:24:42 np0005604215.localdomain podman[84231]: 2026-02-01 08:24:42.88962571 +0000 UTC m=+0.104707795 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, io.openshift.expose-services=, release=1766032510, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container)
Feb 01 08:24:42 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:24:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:24:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:24:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:24:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:24:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:24:44 np0005604215.localdomain systemd[1]: tmp-crun.JbSsbN.mount: Deactivated successfully.
Feb 01 08:24:44 np0005604215.localdomain podman[84252]: 2026-02-01 08:24:44.871264348 +0000 UTC m=+0.088287900 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vcs-type=git, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, container_name=logrotate_crond, batch=17.1_20260112.1, version=17.1.13)
Feb 01 08:24:44 np0005604215.localdomain podman[84252]: 2026-02-01 08:24:44.883479169 +0000 UTC m=+0.100502721 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:24:44 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:24:44 np0005604215.localdomain podman[84253]: 2026-02-01 08:24:44.914924042 +0000 UTC m=+0.125216033 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git)
Feb 01 08:24:44 np0005604215.localdomain podman[84261]: 2026-02-01 08:24:44.88419072 +0000 UTC m=+0.085859686 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, release=1766032510, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 01 08:24:44 np0005604215.localdomain podman[84261]: 2026-02-01 08:24:44.966607235 +0000 UTC m=+0.168276091 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, vcs-type=git, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, distribution-scope=public)
Feb 01 08:24:44 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:24:44 np0005604215.localdomain podman[84258]: 2026-02-01 08:24:44.988515534 +0000 UTC m=+0.194289522 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.13, batch=17.1_20260112.1, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:24:45 np0005604215.localdomain podman[84254]: 2026-02-01 08:24:45.030924532 +0000 UTC m=+0.239314138 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, 
batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:24:45 np0005604215.localdomain podman[84254]: 2026-02-01 08:24:45.03962778 +0000 UTC m=+0.248017406 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, 
com.redhat.component=openstack-iscsid-container, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, tcib_managed=true, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Feb 01 08:24:45 np0005604215.localdomain podman[84258]: 2026-02-01 08:24:45.048674998 +0000 UTC m=+0.254449036 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, version=17.1.13, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team)
Feb 01 08:24:45 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:24:45 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:24:45 np0005604215.localdomain podman[84253]: 2026-02-01 08:24:45.061955842 +0000 UTC m=+0.272247933 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13)
Feb 01 08:24:45 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:24:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:24:48 np0005604215.localdomain podman[84370]: 2026-02-01 08:24:48.867824987 +0000 UTC m=+0.083283810 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, 
managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:24:49 np0005604215.localdomain podman[84370]: 2026-02-01 08:24:49.225665248 +0000 UTC m=+0.441124041 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=)
Feb 01 08:24:49 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:24:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:24:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:24:50 np0005604215.localdomain podman[84391]: 2026-02-01 08:24:50.869986132 +0000 UTC m=+0.082275560 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=ovn_metadata_agent, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:24:50 np0005604215.localdomain systemd[1]: tmp-crun.ikFB8J.mount: Deactivated successfully.
Feb 01 08:24:50 np0005604215.localdomain podman[84392]: 2026-02-01 08:24:50.924004963 +0000 UTC m=+0.133812798 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510)
Feb 01 08:24:50 np0005604215.localdomain podman[84391]: 2026-02-01 08:24:50.934178075 +0000 UTC m=+0.146467503 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.5)
Feb 01 08:24:50 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:24:50 np0005604215.localdomain podman[84392]: 2026-02-01 08:24:50.953620252 +0000 UTC m=+0.163428087 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, tcib_managed=true, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, vcs-type=git)
Feb 01 08:24:50 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:25:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:25:03 np0005604215.localdomain podman[84437]: 2026-02-01 08:25:03.866581095 +0000 UTC m=+0.079959642 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, version=17.1.13, vcs-type=git, container_name=metrics_qdr, io.openshift.expose-services=, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, io.buildah.version=1.41.5, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:25:04 np0005604215.localdomain podman[84437]: 2026-02-01 08:25:04.084546257 +0000 UTC m=+0.297924784 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, config_id=tripleo_step1, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, distribution-scope=public, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:25:04 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:25:06 np0005604215.localdomain sudo[84467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:25:06 np0005604215.localdomain sudo[84467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:25:06 np0005604215.localdomain sudo[84467]: pam_unix(sudo:session): session closed for user root
Feb 01 08:25:06 np0005604215.localdomain sudo[84482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:25:06 np0005604215.localdomain sudo[84482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:25:06 np0005604215.localdomain sudo[84482]: pam_unix(sudo:session): session closed for user root
Feb 01 08:25:08 np0005604215.localdomain sudo[84528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:25:08 np0005604215.localdomain sudo[84528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:25:08 np0005604215.localdomain sudo[84528]: pam_unix(sudo:session): session closed for user root
Feb 01 08:25:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:25:13 np0005604215.localdomain podman[84543]: 2026-02-01 08:25:13.879911466 +0000 UTC m=+0.088802975 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, version=17.1.13, config_id=tripleo_step3, container_name=collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:25:13 np0005604215.localdomain podman[84543]: 2026-02-01 08:25:13.890664215 +0000 UTC m=+0.099555724 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.buildah.version=1.41.5, architecture=x86_64, container_name=collectd, distribution-scope=public, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:25:13 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:25:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:25:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:25:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:25:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:25:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:25:15 np0005604215.localdomain podman[84567]: 2026-02-01 08:25:15.883026168 +0000 UTC m=+0.087607888 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, release=1766032510, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4)
Feb 01 08:25:15 np0005604215.localdomain systemd[1]: tmp-crun.NX1yR1.mount: Deactivated successfully.
Feb 01 08:25:15 np0005604215.localdomain podman[84567]: 2026-02-01 08:25:15.942824691 +0000 UTC m=+0.147406441 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_id=tripleo_step4, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:25:15 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:25:15 np0005604215.localdomain podman[84564]: 2026-02-01 08:25:15.945455149 +0000 UTC m=+0.160089398 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T22:10:15Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4)
Feb 01 08:25:16 np0005604215.localdomain podman[84565]: 2026-02-01 08:25:15.999558704 +0000 UTC m=+0.209122932 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:25:16 np0005604215.localdomain podman[84564]: 2026-02-01 08:25:16.028647296 +0000 UTC m=+0.243281535 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible)
Feb 01 08:25:16 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:25:16 np0005604215.localdomain podman[84573]: 2026-02-01 08:25:16.046283098 +0000 UTC m=+0.247910971 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 01 08:25:16 np0005604215.localdomain podman[84565]: 2026-02-01 08:25:16.053655277 +0000 UTC m=+0.263219475 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_id=tripleo_step5)
Feb 01 08:25:16 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:25:16 np0005604215.localdomain podman[84566]: 2026-02-01 08:25:16.085236323 +0000 UTC m=+0.293819233 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, container_name=iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:25:16 np0005604215.localdomain podman[84573]: 2026-02-01 08:25:16.101811855 +0000 UTC m=+0.303439718 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:25:16 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:25:16 np0005604215.localdomain podman[84566]: 2026-02-01 08:25:16.121666843 +0000 UTC m=+0.330249753 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, architecture=x86_64, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5)
Feb 01 08:25:16 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:25:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:25:19 np0005604215.localdomain systemd[1]: tmp-crun.aQz4x6.mount: Deactivated successfully.
Feb 01 08:25:19 np0005604215.localdomain podman[84682]: 2026-02-01 08:25:19.862350846 +0000 UTC m=+0.079024994 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, version=17.1.13, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, io.buildah.version=1.41.5, batch=17.1_20260112.1)
Feb 01 08:25:20 np0005604215.localdomain podman[84682]: 2026-02-01 08:25:20.230618866 +0000 UTC m=+0.447293054 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, 
config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510)
Feb 01 08:25:20 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:25:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:25:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:25:21 np0005604215.localdomain systemd[1]: tmp-crun.xTf2Yd.mount: Deactivated successfully.
Feb 01 08:25:21 np0005604215.localdomain podman[84705]: 2026-02-01 08:25:21.883488744 +0000 UTC m=+0.090330109 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, batch=17.1_20260112.1, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 01 08:25:21 np0005604215.localdomain podman[84705]: 2026-02-01 08:25:21.936313561 +0000 UTC m=+0.143154966 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, tcib_managed=true, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, container_name=ovn_metadata_agent, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:25:21 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:25:21 np0005604215.localdomain podman[84706]: 2026-02-01 08:25:21.939765543 +0000 UTC m=+0.143112404 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 01 08:25:22 np0005604215.localdomain podman[84706]: 2026-02-01 08:25:22.022852607 +0000 UTC m=+0.226199438 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, architecture=x86_64, 
name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z)
Feb 01 08:25:22 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:25:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:25:34 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:25:34 np0005604215.localdomain recover_tripleo_nova_virtqemud[84755]: 62016
Feb 01 08:25:34 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:25:34 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:25:34 np0005604215.localdomain systemd[1]: tmp-crun.447MDf.mount: Deactivated successfully.
Feb 01 08:25:34 np0005604215.localdomain podman[84753]: 2026-02-01 08:25:34.872357969 +0000 UTC m=+0.082685693 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true)
Feb 01 08:25:35 np0005604215.localdomain podman[84753]: 2026-02-01 08:25:35.091055433 +0000 UTC m=+0.301383187 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 
qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:25:35 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:25:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:25:44 np0005604215.localdomain podman[84830]: 2026-02-01 08:25:44.872791674 +0000 UTC m=+0.089179776 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, version=17.1.13, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, tcib_managed=true, distribution-scope=public, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:25:44 np0005604215.localdomain podman[84830]: 2026-02-01 08:25:44.88309444 +0000 UTC m=+0.099482532 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, vcs-type=git, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, url=https://www.redhat.com, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:25:44 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:25:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:25:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:25:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:25:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:25:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:25:46 np0005604215.localdomain systemd[1]: tmp-crun.LzjGFO.mount: Deactivated successfully.
Feb 01 08:25:46 np0005604215.localdomain podman[84850]: 2026-02-01 08:25:46.910236496 +0000 UTC m=+0.120612697 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64)
Feb 01 08:25:46 np0005604215.localdomain podman[84850]: 2026-02-01 08:25:46.921663035 +0000 UTC m=+0.132039206 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, io.openshift.expose-services=)
Feb 01 08:25:46 np0005604215.localdomain podman[84851]: 2026-02-01 08:25:46.937858134 +0000 UTC m=+0.145007100 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, vcs-type=git, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:25:46 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:25:46 np0005604215.localdomain podman[84855]: 2026-02-01 08:25:46.985430945 +0000 UTC m=+0.186676956 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 01 08:25:46 np0005604215.localdomain podman[84851]: 2026-02-01 08:25:46.993569686 +0000 UTC m=+0.200718602 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_id=tripleo_step5, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, tcib_managed=true, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vendor=Red Hat, Inc.)
Feb 01 08:25:47 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:25:47 np0005604215.localdomain podman[84855]: 2026-02-01 08:25:47.015837096 +0000 UTC m=+0.217083087 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com)
Feb 01 08:25:47 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:25:47 np0005604215.localdomain podman[84852]: 2026-02-01 08:25:47.085253835 +0000 UTC m=+0.288209557 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-type=git, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, com.redhat.component=openstack-iscsid-container, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:25:47 np0005604215.localdomain podman[84859]: 2026-02-01 08:25:47.089948114 +0000 UTC m=+0.288209416 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5)
Feb 01 08:25:47 np0005604215.localdomain podman[84852]: 2026-02-01 08:25:47.09453816 +0000 UTC m=+0.297493882 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, url=https://www.redhat.com, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team)
Feb 01 08:25:47 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:25:47 np0005604215.localdomain podman[84859]: 2026-02-01 08:25:47.117339977 +0000 UTC m=+0.315601309 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13)
Feb 01 08:25:47 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:25:47 np0005604215.localdomain systemd[1]: tmp-crun.xTNCie.mount: Deactivated successfully.
Feb 01 08:25:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:25:50 np0005604215.localdomain systemd[1]: tmp-crun.KpQRtm.mount: Deactivated successfully.
Feb 01 08:25:50 np0005604215.localdomain podman[84963]: 2026-02-01 08:25:50.865747079 +0000 UTC m=+0.075468119 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:25:51 np0005604215.localdomain podman[84963]: 2026-02-01 08:25:51.27286892 +0000 UTC m=+0.482589960 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, description=Red Hat 
OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:25:51 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:25:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:25:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:25:52 np0005604215.localdomain podman[84987]: 2026-02-01 08:25:52.863624827 +0000 UTC m=+0.082132017 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., 
org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:25:52 np0005604215.localdomain podman[84987]: 2026-02-01 08:25:52.903326074 +0000 UTC m=+0.121833344 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64)
Feb 01 08:25:52 np0005604215.localdomain podman[84988]: 2026-02-01 08:25:52.913437224 +0000 UTC m=+0.129285775 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:25:52 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:25:52 np0005604215.localdomain podman[84988]: 2026-02-01 08:25:52.934989292 +0000 UTC m=+0.150837863 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public)
Feb 01 08:25:52 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:26:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:26:05 np0005604215.localdomain podman[85033]: 2026-02-01 08:26:05.867658153 +0000 UTC m=+0.083658132 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, 
maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:26:06 np0005604215.localdomain podman[85033]: 2026-02-01 08:26:06.09186373 +0000 UTC m=+0.307863769 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vcs-type=git, maintainer=OpenStack TripleO Team)
Feb 01 08:26:06 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:26:08 np0005604215.localdomain sudo[85062]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:26:08 np0005604215.localdomain sudo[85062]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:26:08 np0005604215.localdomain sudo[85062]: pam_unix(sudo:session): session closed for user root
Feb 01 08:26:08 np0005604215.localdomain sudo[85077]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:26:08 np0005604215.localdomain sudo[85077]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:26:09 np0005604215.localdomain sudo[85077]: pam_unix(sudo:session): session closed for user root
Feb 01 08:26:10 np0005604215.localdomain sudo[85125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:26:10 np0005604215.localdomain sudo[85125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:26:10 np0005604215.localdomain sudo[85125]: pam_unix(sudo:session): session closed for user root
Feb 01 08:26:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:26:15 np0005604215.localdomain podman[85140]: 2026-02-01 08:26:15.879416136 +0000 UTC m=+0.097531653 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, container_name=collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3)
Feb 01 08:26:15 np0005604215.localdomain podman[85140]: 2026-02-01 08:26:15.89273287 +0000 UTC m=+0.110848517 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=collectd, io.openshift.expose-services=)
Feb 01 08:26:15 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:26:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:26:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:26:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:26:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:26:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:26:17 np0005604215.localdomain systemd[1]: tmp-crun.bJpLwP.mount: Deactivated successfully.
Feb 01 08:26:17 np0005604215.localdomain podman[85161]: 2026-02-01 08:26:17.87681991 +0000 UTC m=+0.084615311 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, tcib_managed=true, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc.)
Feb 01 08:26:17 np0005604215.localdomain podman[85169]: 2026-02-01 08:26:17.927957816 +0000 UTC m=+0.122829503 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_ipmi, distribution-scope=public, url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 01 08:26:17 np0005604215.localdomain systemd[1]: tmp-crun.WorR1d.mount: Deactivated successfully.
Feb 01 08:26:17 np0005604215.localdomain podman[85161]: 2026-02-01 08:26:17.972846757 +0000 UTC m=+0.180642198 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:26:17 np0005604215.localdomain podman[85160]: 2026-02-01 08:26:17.979937136 +0000 UTC m=+0.192589610 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:26:17 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:26:17 np0005604215.localdomain podman[85160]: 2026-02-01 08:26:17.988554523 +0000 UTC m=+0.201207007 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Feb 01 08:26:17 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:26:18 np0005604215.localdomain podman[85162]: 2026-02-01 08:26:18.034075512 +0000 UTC m=+0.237913825 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, distribution-scope=public, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:26:18 np0005604215.localdomain podman[85162]: 2026-02-01 08:26:18.071968495 +0000 UTC m=+0.275806848 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T22:34:43Z, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, vcs-type=git, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible)
Feb 01 08:26:18 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:26:18 np0005604215.localdomain podman[85164]: 2026-02-01 08:26:18.095406671 +0000 UTC m=+0.296739310 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, version=17.1.13, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510)
Feb 01 08:26:18 np0005604215.localdomain podman[85169]: 2026-02-01 08:26:18.109375515 +0000 UTC m=+0.304247172 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, vcs-type=git, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 01 08:26:18 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:26:18 np0005604215.localdomain podman[85164]: 2026-02-01 08:26:18.131833451 +0000 UTC m=+0.333166080 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:26:18 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:26:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:26:21 np0005604215.localdomain systemd[1]: tmp-crun.ekXTAh.mount: Deactivated successfully.
Feb 01 08:26:21 np0005604215.localdomain podman[85275]: 2026-02-01 08:26:21.873882963 +0000 UTC m=+0.088827305 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:26:22 np0005604215.localdomain podman[85275]: 2026-02-01 08:26:22.238860245 +0000 UTC m=+0.453804637 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:26:22 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:26:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:26:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:26:23 np0005604215.localdomain systemd[1]: tmp-crun.lgAG0z.mount: Deactivated successfully.
Feb 01 08:26:23 np0005604215.localdomain podman[85300]: 2026-02-01 08:26:23.862463305 +0000 UTC m=+0.079221240 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, distribution-scope=public)
Feb 01 08:26:23 np0005604215.localdomain podman[85299]: 2026-02-01 08:26:23.875684608 +0000 UTC m=+0.090905576 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510)
Feb 01 08:26:23 np0005604215.localdomain podman[85300]: 2026-02-01 08:26:23.878652516 +0000 UTC m=+0.095410380 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, 
com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true)
Feb 01 08:26:23 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:26:23 np0005604215.localdomain podman[85299]: 2026-02-01 08:26:23.918447135 +0000 UTC m=+0.133668123 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 01 08:26:23 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:26:24 np0005604215.localdomain systemd[1]: tmp-crun.e0LBfa.mount: Deactivated successfully.
Feb 01 08:26:35 np0005604215.localdomain sshd[85345]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 08:26:35 np0005604215.localdomain sshd[85345]: error: kex_exchange_identification: banner line contains invalid characters
Feb 01 08:26:35 np0005604215.localdomain sshd[85345]: banner exchange: Connection from 82.147.84.55 port 50602: invalid format
Feb 01 08:26:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:26:36 np0005604215.localdomain podman[85369]: 2026-02-01 08:26:36.872075604 +0000 UTC m=+0.085247198 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, 
name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z)
Feb 01 08:26:37 np0005604215.localdomain podman[85369]: 2026-02-01 08:26:37.058324407 +0000 UTC m=+0.271495931 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, build-date=2026-01-12T22:10:14Z, tcib_managed=true, 
io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:26:37 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:26:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:26:46 np0005604215.localdomain podman[85420]: 2026-02-01 08:26:46.87135034 +0000 UTC m=+0.086447725 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, architecture=x86_64, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, distribution-scope=public)
Feb 01 08:26:46 np0005604215.localdomain podman[85420]: 2026-02-01 08:26:46.906431749 +0000 UTC m=+0.121529104 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-type=git, architecture=x86_64, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd)
Feb 01 08:26:46 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:26:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:26:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:26:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:26:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:26:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:26:48 np0005604215.localdomain podman[85442]: 2026-02-01 08:26:48.886076367 +0000 UTC m=+0.090208536 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, container_name=iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:26:48 np0005604215.localdomain podman[85442]: 2026-02-01 08:26:48.925731473 +0000 UTC m=+0.129863632 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, container_name=iscsid, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc.)
Feb 01 08:26:48 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:26:48 np0005604215.localdomain podman[85441]: 2026-02-01 08:26:48.941162971 +0000 UTC m=+0.146149776 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, tcib_managed=true, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container)
Feb 01 08:26:48 np0005604215.localdomain podman[85441]: 2026-02-01 08:26:48.974764567 +0000 UTC m=+0.179751382 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Feb 01 08:26:48 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:26:48 np0005604215.localdomain podman[85440]: 2026-02-01 08:26:48.989279027 +0000 UTC m=+0.195717154 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, architecture=x86_64, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, distribution-scope=public, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:26:49 np0005604215.localdomain podman[85459]: 2026-02-01 08:26:48.906784511 +0000 UTC m=+0.096130021 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 01 08:26:49 np0005604215.localdomain podman[85440]: 2026-02-01 08:26:49.026834 +0000 UTC m=+0.233272077 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, url=https://www.redhat.com, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, 
container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.expose-services=)
Feb 01 08:26:49 np0005604215.localdomain podman[85459]: 2026-02-01 08:26:49.039803255 +0000 UTC m=+0.229148765 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 01 08:26:49 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:26:49 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:26:49 np0005604215.localdomain podman[85443]: 2026-02-01 08:26:49.095320161 +0000 UTC m=+0.294089180 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, tcib_managed=true, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 01 08:26:49 np0005604215.localdomain podman[85443]: 2026-02-01 08:26:49.128813585 +0000 UTC m=+0.327582634 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, release=1766032510, managed_by=tripleo_ansible, io.buildah.version=1.41.5)
Feb 01 08:26:49 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:26:49 np0005604215.localdomain systemd[1]: tmp-crun.5rwM0s.mount: Deactivated successfully.
Feb 01 08:26:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:26:52 np0005604215.localdomain podman[85555]: 2026-02-01 08:26:52.865100317 +0000 UTC m=+0.083296931 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, release=1766032510, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target)
Feb 01 08:26:53 np0005604215.localdomain podman[85555]: 2026-02-01 08:26:53.231281595 +0000 UTC m=+0.449478179 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5)
Feb 01 08:26:53 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:26:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:26:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:26:54 np0005604215.localdomain podman[85579]: 2026-02-01 08:26:54.881863625 +0000 UTC m=+0.071807740 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, 
name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, version=17.1.13, vcs-type=git)
Feb 01 08:26:54 np0005604215.localdomain podman[85579]: 2026-02-01 08:26:54.903232709 +0000 UTC m=+0.093176894 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, batch=17.1_20260112.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 01 08:26:54 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:26:54 np0005604215.localdomain podman[85578]: 2026-02-01 08:26:54.94715294 +0000 UTC m=+0.136913649 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, maintainer=OpenStack TripleO Team)
Feb 01 08:26:55 np0005604215.localdomain podman[85578]: 2026-02-01 08:26:55.014594801 +0000 UTC m=+0.204355520 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.buildah.version=1.41.5)
Feb 01 08:26:55 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:27:00 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:27:00 np0005604215.localdomain recover_tripleo_nova_virtqemud[85626]: 62016
Feb 01 08:27:00 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:27:00 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:27:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:27:07 np0005604215.localdomain podman[85627]: 2026-02-01 08:27:07.862408125 +0000 UTC m=+0.076484720 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T22:10:14Z, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:27:08 np0005604215.localdomain podman[85627]: 2026-02-01 08:27:08.054921762 +0000 UTC m=+0.268998297 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, batch=17.1_20260112.1)
Feb 01 08:27:08 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:27:10 np0005604215.localdomain sudo[85658]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:27:10 np0005604215.localdomain sudo[85658]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:27:10 np0005604215.localdomain sudo[85658]: pam_unix(sudo:session): session closed for user root
Feb 01 08:27:10 np0005604215.localdomain sudo[85673]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:27:10 np0005604215.localdomain sudo[85673]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:27:11 np0005604215.localdomain sudo[85673]: pam_unix(sudo:session): session closed for user root
Feb 01 08:27:11 np0005604215.localdomain sudo[85719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:27:11 np0005604215.localdomain sudo[85719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:27:11 np0005604215.localdomain sudo[85719]: pam_unix(sudo:session): session closed for user root
Feb 01 08:27:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:27:17 np0005604215.localdomain podman[85734]: 2026-02-01 08:27:17.878177456 +0000 UTC m=+0.092285317 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, container_name=collectd)
Feb 01 08:27:17 np0005604215.localdomain podman[85734]: 2026-02-01 08:27:17.914967797 +0000 UTC m=+0.129075648 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, distribution-scope=public, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:27:17 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:27:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:27:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:27:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:27:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:27:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:27:19 np0005604215.localdomain podman[85754]: 2026-02-01 08:27:19.883532867 +0000 UTC m=+0.096436012 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, url=https://www.redhat.com, distribution-scope=public, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Feb 01 08:27:19 np0005604215.localdomain podman[85755]: 2026-02-01 08:27:19.932369144 +0000 UTC m=+0.139767455 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container)
Feb 01 08:27:19 np0005604215.localdomain podman[85754]: 2026-02-01 08:27:19.94773525 +0000 UTC m=+0.160638445 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 cron, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:27:19 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:27:19 np0005604215.localdomain podman[85755]: 2026-02-01 08:27:19.995803086 +0000 UTC m=+0.203201407 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-type=git, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5)
Feb 01 08:27:20 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:27:20 np0005604215.localdomain podman[85756]: 2026-02-01 08:27:20.079359963 +0000 UTC m=+0.284660132 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step3, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:27:20 np0005604215.localdomain podman[85756]: 2026-02-01 08:27:20.09375768 +0000 UTC m=+0.299057839 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, version=17.1.13, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3)
Feb 01 08:27:20 np0005604215.localdomain podman[85763]: 2026-02-01 08:27:20.05193754 +0000 UTC m=+0.251100087 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, version=17.1.13, release=1766032510, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 01 08:27:20 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:27:20 np0005604215.localdomain podman[85763]: 2026-02-01 08:27:20.135726815 +0000 UTC m=+0.334889382 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, version=17.1.13, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Feb 01 08:27:20 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:27:20 np0005604215.localdomain podman[85762]: 2026-02-01 08:27:20.188098808 +0000 UTC m=+0.389997386 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:27:20 np0005604215.localdomain podman[85762]: 2026-02-01 08:27:20.216782558 +0000 UTC m=+0.418681096 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, version=17.1.13, config_id=tripleo_step4, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, batch=17.1_20260112.1)
Feb 01 08:27:20 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:27:20 np0005604215.localdomain systemd[1]: tmp-crun.KqUW9Y.mount: Deactivated successfully.
Feb 01 08:27:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:27:23 np0005604215.localdomain podman[85868]: 2026-02-01 08:27:23.840943886 +0000 UTC m=+0.058361912 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, config_id=tripleo_step4)
Feb 01 08:27:24 np0005604215.localdomain podman[85868]: 2026-02-01 08:27:24.210839793 +0000 UTC m=+0.428257779 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_migration_target, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:27:24 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:27:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:27:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:27:25 np0005604215.localdomain systemd[1]: tmp-crun.BzSUYn.mount: Deactivated successfully.
Feb 01 08:27:25 np0005604215.localdomain podman[85891]: 2026-02-01 08:27:25.883042423 +0000 UTC m=+0.095784311 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:27:25 np0005604215.localdomain podman[85892]: 2026-02-01 08:27:25.915421014 +0000 UTC m=+0.125518463 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.buildah.version=1.41.5, architecture=x86_64, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team)
Feb 01 08:27:25 np0005604215.localdomain podman[85891]: 2026-02-01 08:27:25.923747071 +0000 UTC m=+0.136488999 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64)
Feb 01 08:27:25 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:27:25 np0005604215.localdomain podman[85892]: 2026-02-01 08:27:25.964738976 +0000 UTC m=+0.174836465 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, batch=17.1_20260112.1)
Feb 01 08:27:25 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:27:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:27:38 np0005604215.localdomain podman[85984]: 2026-02-01 08:27:38.876271909 +0000 UTC m=+0.089267758 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, tcib_managed=true, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, 
config_id=tripleo_step1, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:27:39 np0005604215.localdomain podman[85984]: 2026-02-01 08:27:39.077756973 +0000 UTC m=+0.290752752 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z)
Feb 01 08:27:39 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:27:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:27:48 np0005604215.localdomain podman[86013]: 2026-02-01 08:27:48.862114925 +0000 UTC m=+0.077330134 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd)
Feb 01 08:27:48 np0005604215.localdomain podman[86013]: 2026-02-01 08:27:48.870933466 +0000 UTC m=+0.086148615 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, tcib_managed=true, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:27:48 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:27:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:27:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:27:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:27:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:27:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:27:50 np0005604215.localdomain podman[86042]: 2026-02-01 08:27:50.891656342 +0000 UTC m=+0.091461373 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, vcs-type=git, url=https://www.redhat.com)
Feb 01 08:27:50 np0005604215.localdomain podman[86042]: 2026-02-01 08:27:50.943640653 +0000 UTC m=+0.143445624 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, config_id=tripleo_step4, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510)
Feb 01 08:27:50 np0005604215.localdomain podman[86048]: 2026-02-01 08:27:50.952414603 +0000 UTC m=+0.148139252 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi)
Feb 01 08:27:50 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:27:50 np0005604215.localdomain podman[86034]: 2026-02-01 08:27:50.925506466 +0000 UTC m=+0.139130177 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, container_name=logrotate_crond, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, distribution-scope=public, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:27:50 np0005604215.localdomain podman[86048]: 2026-02-01 08:27:50.987799092 +0000 UTC m=+0.183523671 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, tcib_managed=true)
Feb 01 08:27:50 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:27:51 np0005604215.localdomain podman[86035]: 2026-02-01 08:27:51.035866287 +0000 UTC m=+0.243950444 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, managed_by=tripleo_ansible, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, distribution-scope=public, architecture=x86_64)
Feb 01 08:27:51 np0005604215.localdomain podman[86034]: 2026-02-01 08:27:51.059096325 +0000 UTC m=+0.272720086 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:27:51 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:27:51 np0005604215.localdomain podman[86036]: 2026-02-01 08:27:51.075316556 +0000 UTC m=+0.281072263 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public)
Feb 01 08:27:51 np0005604215.localdomain podman[86036]: 2026-02-01 08:27:51.08858485 +0000 UTC m=+0.294340537 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, 
io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=)
Feb 01 08:27:51 np0005604215.localdomain podman[86035]: 2026-02-01 08:27:51.095679201 +0000 UTC m=+0.303763318 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, distribution-scope=public, container_name=nova_compute, release=1766032510, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:27:51 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:27:51 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:27:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:27:54 np0005604215.localdomain podman[86152]: 2026-02-01 08:27:54.875707049 +0000 UTC m=+0.087756222 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1766032510, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:27:55 np0005604215.localdomain podman[86152]: 2026-02-01 08:27:55.285912302 +0000 UTC m=+0.497961475 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., container_name=nova_migration_target, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team)
Feb 01 08:27:55 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:27:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:27:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:27:56 np0005604215.localdomain podman[86176]: 2026-02-01 08:27:56.871331139 +0000 UTC m=+0.085792724 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, batch=17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:27:56 np0005604215.localdomain podman[86175]: 2026-02-01 08:27:56.923549388 +0000 UTC m=+0.141545838 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, architecture=x86_64, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, distribution-scope=public)
Feb 01 08:27:56 np0005604215.localdomain podman[86176]: 2026-02-01 08:27:56.94588433 +0000 UTC m=+0.160345905 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1)
Feb 01 08:27:56 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:27:56 np0005604215.localdomain podman[86175]: 2026-02-01 08:27:56.994644426 +0000 UTC m=+0.212640836 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 01 08:27:57 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:28:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:28:09 np0005604215.localdomain podman[86222]: 2026-02-01 08:28:09.880465115 +0000 UTC m=+0.092526664 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 qdrouterd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:28:10 np0005604215.localdomain podman[86222]: 2026-02-01 08:28:10.085856555 +0000 UTC m=+0.297918074 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, config_id=tripleo_step1, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:28:10 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:28:11 np0005604215.localdomain sudo[86251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:28:11 np0005604215.localdomain sudo[86251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:28:11 np0005604215.localdomain sudo[86251]: pam_unix(sudo:session): session closed for user root
Feb 01 08:28:12 np0005604215.localdomain sudo[86266]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:28:12 np0005604215.localdomain sudo[86266]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:28:12 np0005604215.localdomain sudo[86266]: pam_unix(sudo:session): session closed for user root
Feb 01 08:28:13 np0005604215.localdomain sudo[86314]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:28:13 np0005604215.localdomain sudo[86314]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:28:13 np0005604215.localdomain sudo[86314]: pam_unix(sudo:session): session closed for user root
Feb 01 08:28:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:28:19 np0005604215.localdomain podman[86329]: 2026-02-01 08:28:19.878969954 +0000 UTC m=+0.092414951 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, version=17.1.13, io.openshift.expose-services=, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:28:19 np0005604215.localdomain podman[86329]: 2026-02-01 08:28:19.91976492 +0000 UTC m=+0.133209917 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_id=tripleo_step3, release=1766032510, 
summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, distribution-scope=public, maintainer=OpenStack TripleO Team)
Feb 01 08:28:19 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:28:20 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:28:20 np0005604215.localdomain recover_tripleo_nova_virtqemud[86350]: 62016
Feb 01 08:28:20 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:28:20 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:28:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:28:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:28:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:28:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:28:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:28:21 np0005604215.localdomain systemd[1]: tmp-crun.YeNRij.mount: Deactivated successfully.
Feb 01 08:28:21 np0005604215.localdomain podman[86353]: 2026-02-01 08:28:21.907263956 +0000 UTC m=+0.115937146 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, vcs-type=git, build-date=2026-01-12T23:32:04Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:28:21 np0005604215.localdomain podman[86353]: 2026-02-01 08:28:21.940630889 +0000 UTC m=+0.149304139 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, distribution-scope=public, architecture=x86_64, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, config_id=tripleo_step5, vcs-type=git, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc.)
Feb 01 08:28:21 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:28:21 np0005604215.localdomain podman[86355]: 2026-02-01 08:28:21.957416174 +0000 UTC m=+0.159452677 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Feb 01 08:28:21 np0005604215.localdomain podman[86361]: 2026-02-01 08:28:21.927354944 +0000 UTC m=+0.121416977 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, release=1766032510, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, 
vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:28:22 np0005604215.localdomain podman[86352]: 2026-02-01 08:28:22.007136049 +0000 UTC m=+0.218557575 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, vcs-type=git, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, version=17.1.13, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, vendor=Red Hat, Inc.)
Feb 01 08:28:22 np0005604215.localdomain podman[86355]: 2026-02-01 08:28:22.014686475 +0000 UTC m=+0.216722938 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, build-date=2026-01-12T23:07:47Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 01 08:28:22 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:28:22 np0005604215.localdomain podman[86361]: 2026-02-01 08:28:22.061790367 +0000 UTC m=+0.255852380 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, build-date=2026-01-12T23:07:30Z, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, maintainer=OpenStack TripleO Team)
Feb 01 08:28:22 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:28:22 np0005604215.localdomain podman[86354]: 2026-02-01 08:28:22.106021311 +0000 UTC m=+0.309750426 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, com.redhat.component=openstack-iscsid-container, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 01 08:28:22 np0005604215.localdomain podman[86354]: 2026-02-01 08:28:22.11973344 +0000 UTC m=+0.323462575 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, container_name=iscsid, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, 
name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, batch=17.1_20260112.1, architecture=x86_64, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:28:22 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:28:22 np0005604215.localdomain podman[86352]: 2026-02-01 08:28:22.173846231 +0000 UTC m=+0.385267757 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, container_name=logrotate_crond, distribution-scope=public)
Feb 01 08:28:22 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:28:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:28:25 np0005604215.localdomain podman[86464]: 2026-02-01 08:28:25.87047495 +0000 UTC m=+0.086664211 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, container_name=nova_migration_target, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, distribution-scope=public)
Feb 01 08:28:26 np0005604215.localdomain podman[86464]: 2026-02-01 08:28:26.241767139 +0000 UTC m=+0.457956360 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com)
Feb 01 08:28:26 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:28:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:28:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:28:27 np0005604215.localdomain podman[86488]: 2026-02-01 08:28:27.873880554 +0000 UTC m=+0.086304250 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, managed_by=tripleo_ansible, 
build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:28:27 np0005604215.localdomain systemd[1]: tmp-crun.MI4lPQ.mount: Deactivated successfully.
Feb 01 08:28:27 np0005604215.localdomain podman[86489]: 2026-02-01 08:28:27.9370902 +0000 UTC m=+0.147634647 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.5, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, io.openshift.expose-services=, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, 
version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 01 08:28:27 np0005604215.localdomain podman[86488]: 2026-02-01 08:28:27.952783811 +0000 UTC m=+0.165207527 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, release=1766032510, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 01 08:28:27 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:28:27 np0005604215.localdomain podman[86489]: 2026-02-01 08:28:27.965824999 +0000 UTC m=+0.176369406 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:28:27 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:28:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:28:40 np0005604215.localdomain podman[86581]: 2026-02-01 08:28:40.879853206 +0000 UTC m=+0.090246493 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, 
name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, release=1766032510, architecture=x86_64, container_name=metrics_qdr, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Feb 01 08:28:41 np0005604215.localdomain podman[86581]: 2026-02-01 08:28:41.065614554 +0000 UTC m=+0.276007791 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red 
Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:28:41 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:28:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:28:50 np0005604215.localdomain podman[86610]: 2026-02-01 08:28:50.873926677 +0000 UTC m=+0.088371744 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, container_name=collectd)
Feb 01 08:28:50 np0005604215.localdomain podman[86610]: 2026-02-01 08:28:50.887802951 +0000 UTC m=+0.102247998 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-type=git, release=1766032510, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, architecture=x86_64, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:28:50 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:28:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:28:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:28:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:28:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:28:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:28:52 np0005604215.localdomain podman[86631]: 2026-02-01 08:28:52.849247633 +0000 UTC m=+0.060908004 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, version=17.1.13, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, container_name=nova_compute)
Feb 01 08:28:52 np0005604215.localdomain systemd[1]: tmp-crun.lChRxQ.mount: Deactivated successfully.
Feb 01 08:28:52 np0005604215.localdomain podman[86644]: 2026-02-01 08:28:52.893649912 +0000 UTC m=+0.089004574 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 01 08:28:52 np0005604215.localdomain podman[86630]: 2026-02-01 08:28:52.908092783 +0000 UTC m=+0.121010454 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, description=Red Hat 
OpenStack Platform 17.1 cron, release=1766032510, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:28:52 np0005604215.localdomain podman[86630]: 2026-02-01 08:28:52.916589409 +0000 UTC m=+0.129507010 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.component=openstack-cron-container, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:28:52 np0005604215.localdomain podman[86631]: 2026-02-01 08:28:52.925670904 +0000 UTC m=+0.137331355 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, version=17.1.13, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, tcib_managed=true, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:28:52 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:28:52 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:28:52 np0005604215.localdomain podman[86632]: 2026-02-01 08:28:52.918642874 +0000 UTC m=+0.123755591 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, batch=17.1_20260112.1, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid)
Feb 01 08:28:53 np0005604215.localdomain podman[86632]: 2026-02-01 08:28:53.005125638 +0000 UTC m=+0.210238355 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z)
Feb 01 08:28:53 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:28:53 np0005604215.localdomain podman[86644]: 2026-02-01 08:28:53.02116962 +0000 UTC m=+0.216524292 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:28:53 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:28:53 np0005604215.localdomain podman[86643]: 2026-02-01 08:28:53.1027269 +0000 UTC m=+0.305427711 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, 
container_name=ceilometer_agent_compute, version=17.1.13, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:28:53 np0005604215.localdomain podman[86643]: 2026-02-01 08:28:53.131433537 +0000 UTC m=+0.334134368 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, release=1766032510, container_name=ceilometer_agent_compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com)
Feb 01 08:28:53 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:28:53 np0005604215.localdomain systemd[1]: tmp-crun.25BoLN.mount: Deactivated successfully.
Feb 01 08:28:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:28:56 np0005604215.localdomain podman[86741]: 2026-02-01 08:28:56.869252683 +0000 UTC m=+0.081880591 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:28:57 np0005604215.localdomain podman[86741]: 2026-02-01 08:28:57.240968317 +0000 UTC m=+0.453596215 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:28:57 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:28:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:28:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:28:58 np0005604215.localdomain podman[86764]: 2026-02-01 08:28:58.872914596 +0000 UTC m=+0.085922398 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git)
Feb 01 08:28:58 np0005604215.localdomain podman[86764]: 2026-02-01 08:28:58.916249801 +0000 UTC m=+0.129257683 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, container_name=ovn_metadata_agent, release=1766032510, vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com)
Feb 01 08:28:58 np0005604215.localdomain podman[86765]: 2026-02-01 08:28:58.927452411 +0000 UTC m=+0.137326945 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, 
com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:28:58 np0005604215.localdomain podman[86765]: 2026-02-01 08:28:58.955642392 +0000 UTC m=+0.165516926 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, tcib_managed=true, io.openshift.expose-services=)
Feb 01 08:28:58 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:28:58 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:29:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:29:11 np0005604215.localdomain podman[86812]: 2026-02-01 08:29:11.881881872 +0000 UTC m=+0.095509308 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:29:12 np0005604215.localdomain podman[86812]: 2026-02-01 08:29:12.111986887 +0000 UTC m=+0.325614343 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_id=tripleo_step1)
Feb 01 08:29:12 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:29:13 np0005604215.localdomain sudo[86841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:29:13 np0005604215.localdomain sudo[86841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:29:13 np0005604215.localdomain sudo[86841]: pam_unix(sudo:session): session closed for user root
Feb 01 08:29:13 np0005604215.localdomain sudo[86856]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:29:13 np0005604215.localdomain sudo[86856]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:29:14 np0005604215.localdomain sudo[86856]: pam_unix(sudo:session): session closed for user root
Feb 01 08:29:15 np0005604215.localdomain sudo[86902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:29:15 np0005604215.localdomain sudo[86902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:29:15 np0005604215.localdomain sudo[86902]: pam_unix(sudo:session): session closed for user root
Feb 01 08:29:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:29:21 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:29:21 np0005604215.localdomain recover_tripleo_nova_virtqemud[86919]: 62016
Feb 01 08:29:21 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:29:21 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:29:21 np0005604215.localdomain systemd[1]: tmp-crun.TDrMFC.mount: Deactivated successfully.
Feb 01 08:29:21 np0005604215.localdomain podman[86917]: 2026-02-01 08:29:21.882092977 +0000 UTC m=+0.095323052 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, vcs-type=git, version=17.1.13, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:29:21 np0005604215.localdomain podman[86917]: 2026-02-01 08:29:21.918369361 +0000 UTC m=+0.131599426 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, container_name=collectd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, vcs-type=git, build-date=2026-01-12T22:10:15Z, release=1766032510)
Feb 01 08:29:21 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:29:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:29:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:29:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:29:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:29:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:29:23 np0005604215.localdomain podman[86940]: 2026-02-01 08:29:23.883681795 +0000 UTC m=+0.091189983 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5)
Feb 01 08:29:23 np0005604215.localdomain podman[86939]: 2026-02-01 08:29:23.933738229 +0000 UTC m=+0.141297739 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, tcib_managed=true, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, container_name=logrotate_crond, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:29:23 np0005604215.localdomain podman[86940]: 2026-02-01 08:29:23.945952582 +0000 UTC m=+0.153460800 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1)
Feb 01 08:29:23 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:29:23 np0005604215.localdomain systemd[1]: tmp-crun.x6MIEo.mount: Deactivated successfully.
Feb 01 08:29:23 np0005604215.localdomain podman[86951]: 2026-02-01 08:29:23.997033469 +0000 UTC m=+0.193843512 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:29:24 np0005604215.localdomain podman[86941]: 2026-02-01 08:29:24.050594573 +0000 UTC m=+0.253717003 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1)
Feb 01 08:29:24 np0005604215.localdomain podman[86951]: 2026-02-01 08:29:24.079203268 +0000 UTC m=+0.276013351 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, release=1766032510, 
version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:29:24 np0005604215.localdomain podman[86941]: 2026-02-01 08:29:24.084769462 +0000 UTC m=+0.287891882 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.openshift.expose-services=, release=1766032510)
Feb 01 08:29:24 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:29:24 np0005604215.localdomain podman[86942]: 2026-02-01 08:29:24.111422686 +0000 UTC m=+0.311507501 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible)
Feb 01 08:29:24 np0005604215.localdomain podman[86939]: 2026-02-01 08:29:24.120926643 +0000 UTC m=+0.328486153 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, io.openshift.expose-services=, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 01 08:29:24 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:29:24 np0005604215.localdomain podman[86942]: 2026-02-01 08:29:24.142411155 +0000 UTC m=+0.342495940 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com)
Feb 01 08:29:24 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:29:24 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:29:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:29:27 np0005604215.localdomain podman[87056]: 2026-02-01 08:29:27.866629316 +0000 UTC m=+0.081191030 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, version=17.1.13, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:29:28 np0005604215.localdomain podman[87056]: 2026-02-01 08:29:28.242190609 +0000 UTC m=+0.456752403 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_id=tripleo_step4, version=17.1.13, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 01 08:29:28 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:29:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:29:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:29:29 np0005604215.localdomain podman[87079]: 2026-02-01 08:29:29.878338489 +0000 UTC m=+0.087090184 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:29:29 np0005604215.localdomain podman[87080]: 2026-02-01 08:29:29.935218158 +0000 UTC m=+0.140675149 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:29:29 np0005604215.localdomain podman[87079]: 2026-02-01 08:29:29.949907777 +0000 UTC m=+0.158659472 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 01 08:29:29 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:29:29 np0005604215.localdomain podman[87080]: 2026-02-01 08:29:29.989733022 +0000 UTC m=+0.195189983 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, batch=17.1_20260112.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13)
Feb 01 08:29:30 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:29:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:29:42 np0005604215.localdomain systemd[1]: tmp-crun.Uq7gAf.mount: Deactivated successfully.
Feb 01 08:29:42 np0005604215.localdomain podman[87172]: 2026-02-01 08:29:42.886977494 +0000 UTC m=+0.103184928 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, managed_by=tripleo_ansible, tcib_managed=true)
Feb 01 08:29:43 np0005604215.localdomain podman[87172]: 2026-02-01 08:29:43.094686139 +0000 UTC m=+0.310893583 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:29:43 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:29:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:29:52 np0005604215.localdomain podman[87201]: 2026-02-01 08:29:52.869437444 +0000 UTC m=+0.079220368 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, release=1766032510, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:29:52 np0005604215.localdomain podman[87201]: 2026-02-01 08:29:52.877699783 +0000 UTC m=+0.087482697 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, vendor=Red Hat, Inc., container_name=collectd, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:29:52 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:29:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:29:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:29:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:29:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:29:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:29:54 np0005604215.localdomain systemd[1]: tmp-crun.CO8GYH.mount: Deactivated successfully.
Feb 01 08:29:54 np0005604215.localdomain podman[87221]: 2026-02-01 08:29:54.949119473 +0000 UTC m=+0.154235484 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', 
'/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:29:54 np0005604215.localdomain podman[87229]: 2026-02-01 08:29:54.909516065 +0000 UTC m=+0.105128519 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, batch=17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:29:54 np0005604215.localdomain podman[87229]: 2026-02-01 08:29:54.995036739 +0000 UTC m=+0.190649153 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T23:07:30Z, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public)
Feb 01 08:29:55 np0005604215.localdomain podman[87221]: 2026-02-01 08:29:55.004204536 +0000 UTC m=+0.209320507 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step5, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:29:55 np0005604215.localdomain podman[87226]: 2026-02-01 08:29:55.009863803 +0000 UTC m=+0.207676556 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:29:55 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:29:55 np0005604215.localdomain podman[87226]: 2026-02-01 08:29:55.035749912 +0000 UTC m=+0.233562735 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.5, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64)
Feb 01 08:29:55 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:29:55 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:29:55 np0005604215.localdomain podman[87222]: 2026-02-01 08:29:55.095091907 +0000 UTC m=+0.297363189 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, release=1766032510)
Feb 01 08:29:55 np0005604215.localdomain podman[87222]: 2026-02-01 08:29:55.1076512 +0000 UTC m=+0.309922492 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z)
Feb 01 08:29:55 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:29:55 np0005604215.localdomain podman[87220]: 2026-02-01 08:29:55.186453815 +0000 UTC m=+0.396539511 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, vcs-type=git, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:29:55 np0005604215.localdomain podman[87220]: 2026-02-01 08:29:55.22471845 +0000 UTC m=+0.434804086 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 01 08:29:55 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:29:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:29:58 np0005604215.localdomain systemd[1]: tmp-crun.4i2Pyl.mount: Deactivated successfully.
Feb 01 08:29:58 np0005604215.localdomain podman[87335]: 2026-02-01 08:29:58.871926295 +0000 UTC m=+0.088124696 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:29:59 np0005604215.localdomain podman[87335]: 2026-02-01 08:29:59.232612023 +0000 UTC m=+0.448810364 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, version=17.1.13)
Feb 01 08:29:59 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:30:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:30:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:30:00 np0005604215.localdomain podman[87358]: 2026-02-01 08:30:00.862945732 +0000 UTC m=+0.080765817 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:30:00 np0005604215.localdomain systemd[1]: tmp-crun.C2FtgG.mount: Deactivated successfully.
Feb 01 08:30:00 np0005604215.localdomain podman[87359]: 2026-02-01 08:30:00.919560312 +0000 UTC m=+0.133106803 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com)
Feb 01 08:30:00 np0005604215.localdomain podman[87358]: 2026-02-01 08:30:00.930618837 +0000 UTC m=+0.148438932 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:30:00 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:30:00 np0005604215.localdomain podman[87359]: 2026-02-01 08:30:00.972677192 +0000 UTC m=+0.186223703 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, vcs-type=git, version=17.1.13, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, 
com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:30:00 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:30:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:30:13 np0005604215.localdomain podman[87404]: 2026-02-01 08:30:13.888628288 +0000 UTC m=+0.103437775 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:30:14 np0005604215.localdomain podman[87404]: 2026-02-01 08:30:14.061692069 +0000 UTC m=+0.276501486 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, architecture=x86_64, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 01 08:30:14 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:30:15 np0005604215.localdomain sudo[87433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:30:15 np0005604215.localdomain sudo[87433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:30:15 np0005604215.localdomain sudo[87433]: pam_unix(sudo:session): session closed for user root
Feb 01 08:30:15 np0005604215.localdomain sudo[87448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:30:15 np0005604215.localdomain sudo[87448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:30:15 np0005604215.localdomain sudo[87448]: pam_unix(sudo:session): session closed for user root
Feb 01 08:30:18 np0005604215.localdomain sudo[87494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:30:18 np0005604215.localdomain sudo[87494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:30:18 np0005604215.localdomain sudo[87494]: pam_unix(sudo:session): session closed for user root
Feb 01 08:30:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:30:23 np0005604215.localdomain podman[87509]: 2026-02-01 08:30:23.872569113 +0000 UTC m=+0.082404528 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
config_id=tripleo_step3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:30:23 np0005604215.localdomain podman[87509]: 2026-02-01 08:30:23.882466422 +0000 UTC m=+0.092301847 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vcs-type=git, container_name=collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.buildah.version=1.41.5)
Feb 01 08:30:23 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:30:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:30:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:30:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:30:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:30:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:30:25 np0005604215.localdomain systemd[1]: tmp-crun.SB2UB3.mount: Deactivated successfully.
Feb 01 08:30:25 np0005604215.localdomain podman[87529]: 2026-02-01 08:30:25.891717399 +0000 UTC m=+0.106126420 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, release=1766032510, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:30:25 np0005604215.localdomain podman[87529]: 2026-02-01 08:30:25.895082754 +0000 UTC m=+0.109491775 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20260112.1, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z)
Feb 01 08:30:25 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:30:25 np0005604215.localdomain systemd[1]: tmp-crun.kxuJip.mount: Deactivated successfully.
Feb 01 08:30:25 np0005604215.localdomain podman[87530]: 2026-02-01 08:30:25.938048018 +0000 UTC m=+0.146350297 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, tcib_managed=true, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Feb 01 08:30:25 np0005604215.localdomain podman[87530]: 2026-02-01 08:30:25.963611737 +0000 UTC m=+0.171914026 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=nova_compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:30:25 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:30:26 np0005604215.localdomain podman[87531]: 2026-02-01 08:30:26.048706817 +0000 UTC m=+0.253329611 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, batch=17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', 
'/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., url=https://www.redhat.com)
Feb 01 08:30:26 np0005604215.localdomain podman[87537]: 2026-02-01 08:30:26.101936192 +0000 UTC m=+0.303781160 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, tcib_managed=true, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:30:26 np0005604215.localdomain podman[87531]: 2026-02-01 08:30:26.134252733 +0000 UTC m=+0.338875577 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, batch=17.1_20260112.1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:30:26 np0005604215.localdomain podman[87537]: 2026-02-01 08:30:26.152032188 +0000 UTC m=+0.353877136 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:30:26 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:30:26 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:30:26 np0005604215.localdomain podman[87538]: 2026-02-01 08:30:26.151799931 +0000 UTC m=+0.351195702 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, architecture=x86_64, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 01 08:30:26 np0005604215.localdomain podman[87538]: 2026-02-01 08:30:26.231247415 +0000 UTC m=+0.430643236 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:30:26 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:30:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:30:29 np0005604215.localdomain systemd[1]: tmp-crun.Z4sw8O.mount: Deactivated successfully.
Feb 01 08:30:29 np0005604215.localdomain podman[87649]: 2026-02-01 08:30:29.861720345 +0000 UTC m=+0.075748079 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, config_id=tripleo_step4, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, architecture=x86_64, release=1766032510, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.13)
Feb 01 08:30:30 np0005604215.localdomain podman[87649]: 2026-02-01 08:30:30.232169929 +0000 UTC m=+0.446197663 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:30:30 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:30:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:30:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:30:31 np0005604215.localdomain systemd[1]: tmp-crun.gZjAFG.mount: Deactivated successfully.
Feb 01 08:30:31 np0005604215.localdomain podman[87672]: 2026-02-01 08:30:31.884609959 +0000 UTC m=+0.095506998 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, build-date=2026-01-12T22:36:40Z)
Feb 01 08:30:31 np0005604215.localdomain podman[87672]: 2026-02-01 08:30:31.903720806 +0000 UTC m=+0.114617815 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.buildah.version=1.41.5, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4)
Feb 01 08:30:31 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:30:31 np0005604215.localdomain podman[87671]: 2026-02-01 08:30:31.970231756 +0000 UTC m=+0.184548602 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:30:32 np0005604215.localdomain podman[87671]: 2026-02-01 08:30:32.009816683 +0000 UTC m=+0.224133549 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:30:32 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:30:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:30:44 np0005604215.localdomain podman[87768]: 2026-02-01 08:30:44.870614175 +0000 UTC m=+0.084841143 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, distribution-scope=public, url=https://www.redhat.com, description=Red Hat OpenStack 
Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z)
Feb 01 08:30:45 np0005604215.localdomain podman[87768]: 2026-02-01 08:30:45.065859791 +0000 UTC m=+0.280086809 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=)
Feb 01 08:30:45 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:30:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:30:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 5433 writes, 23K keys, 5433 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5433 writes, 751 syncs, 7.23 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 487 writes, 1908 keys, 487 commit groups, 1.0 writes per commit group, ingest: 2.24 MB, 0.00 MB/s
                                                          Interval WAL: 487 writes, 193 syncs, 2.52 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 08:30:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:30:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3000.1 total, 600.0 interval
                                                          Cumulative writes: 5223 writes, 23K keys, 5223 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5223 writes, 658 syncs, 7.94 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 489 writes, 1859 keys, 489 commit groups, 1.0 writes per commit group, ingest: 2.34 MB, 0.00 MB/s
                                                          Interval WAL: 489 writes, 177 syncs, 2.76 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 08:30:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:30:54 np0005604215.localdomain podman[87798]: 2026-02-01 08:30:54.874223807 +0000 UTC m=+0.085324849 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, vcs-type=git, maintainer=OpenStack TripleO 
Team, io.buildah.version=1.41.5, version=17.1.13, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:30:54 np0005604215.localdomain podman[87798]: 2026-02-01 08:30:54.881122832 +0000 UTC m=+0.092223904 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, container_name=collectd, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1)
Feb 01 08:30:54 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:30:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:30:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:30:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:30:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:30:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:30:56 np0005604215.localdomain podman[87819]: 2026-02-01 08:30:56.888850091 +0000 UTC m=+0.091832012 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, config_id=tripleo_step5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 01 08:30:56 np0005604215.localdomain podman[87819]: 2026-02-01 08:30:56.941609511 +0000 UTC m=+0.144591422 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Feb 01 08:30:56 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:30:56 np0005604215.localdomain systemd[1]: tmp-crun.5V94io.mount: Deactivated successfully.
Feb 01 08:30:56 np0005604215.localdomain podman[87818]: 2026-02-01 08:30:56.945847494 +0000 UTC m=+0.151015273 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, distribution-scope=public, vendor=Red Hat, Inc., container_name=logrotate_crond, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:30:57 np0005604215.localdomain podman[87818]: 2026-02-01 08:30:57.028676794 +0000 UTC m=+0.233844583 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, tcib_managed=true, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, build-date=2026-01-12T22:10:15Z, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, name=rhosp-rhel9/openstack-cron, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:30:57 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:30:57 np0005604215.localdomain podman[87820]: 2026-02-01 08:30:56.996518029 +0000 UTC m=+0.196410464 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=iscsid, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:30:57 np0005604215.localdomain podman[87821]: 2026-02-01 08:30:57.103368209 +0000 UTC m=+0.301589681 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, 
managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:30:57 np0005604215.localdomain podman[87820]: 2026-02-01 08:30:57.130401434 +0000 UTC m=+0.330293869 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, release=1766032510, vendor=Red Hat, Inc., 
description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, io.openshift.expose-services=, container_name=iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public)
Feb 01 08:30:57 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:30:57 np0005604215.localdomain podman[87821]: 2026-02-01 08:30:57.186262741 +0000 UTC m=+0.384484193 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible)
Feb 01 08:30:57 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:30:57 np0005604215.localdomain podman[87828]: 2026-02-01 08:30:57.204364537 +0000 UTC m=+0.396356135 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.openshift.expose-services=, 
vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public)
Feb 01 08:30:57 np0005604215.localdomain podman[87828]: 2026-02-01 08:30:57.236712778 +0000 UTC m=+0.428704406 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, vendor=Red Hat, Inc.)
Feb 01 08:30:57 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:31:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:31:00 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:31:00 np0005604215.localdomain recover_tripleo_nova_virtqemud[87939]: 62016
Feb 01 08:31:00 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:31:00 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:31:00 np0005604215.localdomain systemd[1]: tmp-crun.LVbsWN.mount: Deactivated successfully.
Feb 01 08:31:00 np0005604215.localdomain podman[87935]: 2026-02-01 08:31:00.870720531 +0000 UTC m=+0.083599106 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, release=1766032510, 
architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:31:01 np0005604215.localdomain podman[87935]: 2026-02-01 08:31:01.269980094 +0000 UTC m=+0.482858659 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1766032510, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:31:01 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:31:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:31:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:31:02 np0005604215.localdomain systemd[1]: tmp-crun.9etjsi.mount: Deactivated successfully.
Feb 01 08:31:02 np0005604215.localdomain podman[87962]: 2026-02-01 08:31:02.890861688 +0000 UTC m=+0.102909059 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, architecture=x86_64, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.buildah.version=1.41.5, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 01 08:31:02 np0005604215.localdomain podman[87961]: 2026-02-01 08:31:02.932268602 +0000 UTC m=+0.147093490 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step4, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 01 08:31:02 np0005604215.localdomain podman[87961]: 2026-02-01 08:31:02.971392436 +0000 UTC m=+0.186217344 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 01 08:31:02 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:31:03 np0005604215.localdomain podman[87962]: 2026-02-01 08:31:03.021719239 +0000 UTC m=+0.233766570 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible, container_name=ovn_controller)
Feb 01 08:31:03 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:31:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:31:15 np0005604215.localdomain systemd[1]: tmp-crun.PXHTIl.mount: Deactivated successfully.
Feb 01 08:31:15 np0005604215.localdomain podman[88009]: 2026-02-01 08:31:15.889269231 +0000 UTC m=+0.101043425 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, 
config_id=tripleo_step1, url=https://www.redhat.com, com.redhat.component=openstack-qdrouterd-container, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:31:16 np0005604215.localdomain podman[88009]: 2026-02-01 08:31:16.090646807 +0000 UTC m=+0.302420991 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:31:16 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:31:18 np0005604215.localdomain sudo[88038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:31:18 np0005604215.localdomain sudo[88038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:31:18 np0005604215.localdomain sudo[88038]: pam_unix(sudo:session): session closed for user root
Feb 01 08:31:18 np0005604215.localdomain sudo[88053]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:31:18 np0005604215.localdomain sudo[88053]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:31:18 np0005604215.localdomain sudo[88053]: pam_unix(sudo:session): session closed for user root
Feb 01 08:31:20 np0005604215.localdomain sudo[88100]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:31:20 np0005604215.localdomain sudo[88100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:31:20 np0005604215.localdomain sudo[88100]: pam_unix(sudo:session): session closed for user root
Feb 01 08:31:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:31:25 np0005604215.localdomain podman[88115]: 2026-02-01 08:31:25.88675953 +0000 UTC m=+0.100003312 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, release=1766032510, com.redhat.component=openstack-collectd-container, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:31:25 np0005604215.localdomain podman[88115]: 2026-02-01 08:31:25.901731564 +0000 UTC m=+0.114975306 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, container_name=collectd)
Feb 01 08:31:25 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:31:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:31:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:31:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:31:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:31:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:31:27 np0005604215.localdomain systemd[1]: tmp-crun.24LqTe.mount: Deactivated successfully.
Feb 01 08:31:27 np0005604215.localdomain podman[88150]: 2026-02-01 08:31:27.918472178 +0000 UTC m=+0.108806625 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, build-date=2026-01-12T23:07:30Z, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 01 08:31:27 np0005604215.localdomain podman[88136]: 2026-02-01 08:31:27.941682426 +0000 UTC m=+0.149056590 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-cron-container, tcib_managed=true, config_id=tripleo_step4, batch=17.1_20260112.1, 
distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:31:27 np0005604215.localdomain podman[88136]: 2026-02-01 08:31:27.94958357 +0000 UTC m=+0.156957734 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 01 08:31:27 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:31:27 np0005604215.localdomain podman[88138]: 2026-02-01 08:31:27.996923284 +0000 UTC m=+0.198945552 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, 
org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1766032510, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:31:28 np0005604215.localdomain podman[88141]: 2026-02-01 08:31:27.904524667 +0000 UTC m=+0.103787910 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, managed_by=tripleo_ansible, architecture=x86_64, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z)
Feb 01 08:31:28 np0005604215.localdomain podman[88138]: 2026-02-01 08:31:28.006629293 +0000 UTC m=+0.208651531 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, tcib_managed=true, config_id=tripleo_step3, architecture=x86_64, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:31:28 np0005604215.localdomain podman[88141]: 2026-02-01 08:31:28.035687952 +0000 UTC m=+0.234951215 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5)
Feb 01 08:31:28 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:31:28 np0005604215.localdomain podman[88150]: 2026-02-01 08:31:28.048149847 +0000 UTC m=+0.238484314 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13)
Feb 01 08:31:28 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:31:28 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:31:28 np0005604215.localdomain podman[88137]: 2026-02-01 08:31:28.141319658 +0000 UTC m=+0.346859606 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:31:28 np0005604215.localdomain podman[88137]: 2026-02-01 08:31:28.202816619 +0000 UTC m=+0.408356587 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, container_name=nova_compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:31:28 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:31:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:31:31 np0005604215.localdomain systemd[1]: tmp-crun.vT3Bqw.mount: Deactivated successfully.
Feb 01 08:31:31 np0005604215.localdomain podman[88252]: 2026-02-01 08:31:31.862086596 +0000 UTC m=+0.083216433 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, version=17.1.13, container_name=nova_migration_target, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container)
Feb 01 08:31:32 np0005604215.localdomain podman[88252]: 2026-02-01 08:31:32.245659896 +0000 UTC m=+0.466789763 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true)
Feb 01 08:31:32 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:31:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:31:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:31:33 np0005604215.localdomain podman[88278]: 2026-02-01 08:31:33.88301697 +0000 UTC m=+0.093047997 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 01 08:31:33 np0005604215.localdomain podman[88277]: 2026-02-01 08:31:33.854582342 +0000 UTC m=+0.070725379 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T22:56:19Z, 
url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:31:33 np0005604215.localdomain podman[88278]: 2026-02-01 08:31:33.932784589 +0000 UTC m=+0.142815596 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:31:33 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:31:33 np0005604215.localdomain podman[88277]: 2026-02-01 08:31:33.989154692 +0000 UTC m=+0.205297729 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, vcs-type=git, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, version=17.1.13, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:31:34 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:31:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:31:46 np0005604215.localdomain systemd[1]: tmp-crun.zNObkr.mount: Deactivated successfully.
Feb 01 08:31:46 np0005604215.localdomain podman[88369]: 2026-02-01 08:31:46.881809137 +0000 UTC m=+0.095478263 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, maintainer=OpenStack TripleO Team, distribution-scope=public, tcib_managed=true, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible)
Feb 01 08:31:47 np0005604215.localdomain podman[88369]: 2026-02-01 08:31:47.069363726 +0000 UTC m=+0.283032862 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:31:47 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:31:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:31:56 np0005604215.localdomain podman[88400]: 2026-02-01 08:31:56.845424103 +0000 UTC m=+0.068624173 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, container_name=collectd, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, url=https://www.redhat.com)
Feb 01 08:31:56 np0005604215.localdomain podman[88400]: 2026-02-01 08:31:56.88382695 +0000 UTC m=+0.107027030 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:31:56 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:31:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:31:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:31:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:31:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:31:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:31:58 np0005604215.localdomain podman[88422]: 2026-02-01 08:31:58.873968581 +0000 UTC m=+0.083972197 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, 
config_id=tripleo_step3, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:31:58 np0005604215.localdomain podman[88422]: 2026-02-01 08:31:58.882574357 +0000 UTC m=+0.092577963 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, release=1766032510, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, tcib_managed=true, distribution-scope=public, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:31:58 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:31:58 np0005604215.localdomain podman[88421]: 2026-02-01 08:31:58.928792215 +0000 UTC m=+0.142212128 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vendor=Red Hat, Inc., container_name=nova_compute, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:31:58 np0005604215.localdomain podman[88429]: 2026-02-01 08:31:58.989514213 +0000 UTC m=+0.194624709 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 01 08:31:59 np0005604215.localdomain podman[88429]: 2026-02-01 08:31:59.020611635 +0000 UTC m=+0.225722151 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:31:59 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:31:59 np0005604215.localdomain podman[88423]: 2026-02-01 08:31:59.038202588 +0000 UTC m=+0.245299755 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, distribution-scope=public, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13)
Feb 01 08:31:59 np0005604215.localdomain podman[88421]: 2026-02-01 08:31:59.060087875 +0000 UTC m=+0.273507788 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.buildah.version=1.41.5, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1)
Feb 01 08:31:59 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:31:59 np0005604215.localdomain podman[88423]: 2026-02-01 08:31:59.075626616 +0000 UTC m=+0.282723793 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, distribution-scope=public, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:31:59 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:31:59 np0005604215.localdomain podman[88420]: 2026-02-01 08:31:59.138245992 +0000 UTC m=+0.355167853 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true)
Feb 01 08:31:59 np0005604215.localdomain podman[88420]: 2026-02-01 08:31:59.150544941 +0000 UTC m=+0.367466802 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., container_name=logrotate_crond, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 01 08:31:59 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:31:59 np0005604215.localdomain systemd[1]: tmp-crun.UqPgBM.mount: Deactivated successfully.
Feb 01 08:32:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:32:02 np0005604215.localdomain podman[88539]: 2026-02-01 08:32:02.855560424 +0000 UTC m=+0.072315877 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_id=tripleo_step4, container_name=nova_migration_target, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container)
Feb 01 08:32:03 np0005604215.localdomain podman[88539]: 2026-02-01 08:32:03.188654242 +0000 UTC m=+0.405409695 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:32:03 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:32:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:32:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:32:04 np0005604215.localdomain podman[88562]: 2026-02-01 08:32:04.860069258 +0000 UTC m=+0.074830684 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, container_name=ovn_metadata_agent, vcs-type=git, tcib_managed=true, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 
'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 01 08:32:04 np0005604215.localdomain podman[88562]: 2026-02-01 08:32:04.902687976 +0000 UTC m=+0.117449362 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, config_id=tripleo_step4, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:32:04 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:32:04 np0005604215.localdomain podman[88563]: 2026-02-01 08:32:04.920537408 +0000 UTC m=+0.131625891 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, 
managed_by=tripleo_ansible, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public)
Feb 01 08:32:04 np0005604215.localdomain podman[88563]: 2026-02-01 08:32:04.946676216 +0000 UTC m=+0.157764689 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:32:04 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:32:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:32:17 np0005604215.localdomain systemd[1]: tmp-crun.nzXF1C.mount: Deactivated successfully.
Feb 01 08:32:17 np0005604215.localdomain podman[88609]: 2026-02-01 08:32:17.878094271 +0000 UTC m=+0.094045548 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Feb 01 08:32:18 np0005604215.localdomain podman[88609]: 2026-02-01 08:32:18.068531819 +0000 UTC m=+0.284483046 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, architecture=x86_64, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, container_name=metrics_qdr, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5)
Feb 01 08:32:18 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:32:20 np0005604215.localdomain sudo[88638]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:32:20 np0005604215.localdomain sudo[88638]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:32:20 np0005604215.localdomain sudo[88638]: pam_unix(sudo:session): session closed for user root
Feb 01 08:32:20 np0005604215.localdomain sudo[88653]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 08:32:20 np0005604215.localdomain sudo[88653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:32:21 np0005604215.localdomain sudo[88653]: pam_unix(sudo:session): session closed for user root
Feb 01 08:32:21 np0005604215.localdomain sudo[88689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:32:21 np0005604215.localdomain sudo[88689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:32:21 np0005604215.localdomain sudo[88689]: pam_unix(sudo:session): session closed for user root
Feb 01 08:32:21 np0005604215.localdomain sudo[88704]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:32:21 np0005604215.localdomain sudo[88704]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:32:21 np0005604215.localdomain sudo[88704]: pam_unix(sudo:session): session closed for user root
Feb 01 08:32:22 np0005604215.localdomain sudo[88751]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:32:22 np0005604215.localdomain sudo[88751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:32:22 np0005604215.localdomain sudo[88751]: pam_unix(sudo:session): session closed for user root
Feb 01 08:32:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:32:27 np0005604215.localdomain podman[88766]: 2026-02-01 08:32:27.868043903 +0000 UTC m=+0.080207222 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, url=https://www.redhat.com, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, tcib_managed=true)
Feb 01 08:32:27 np0005604215.localdomain podman[88766]: 2026-02-01 08:32:27.885243454 +0000 UTC m=+0.097406773 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container)
Feb 01 08:32:27 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:32:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:32:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:32:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:32:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:32:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:32:29 np0005604215.localdomain podman[88787]: 2026-02-01 08:32:29.930468228 +0000 UTC m=+0.140538096 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, container_name=nova_compute, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13)
Feb 01 08:32:29 np0005604215.localdomain podman[88786]: 2026-02-01 08:32:29.888280144 +0000 UTC m=+0.100718585 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 cron, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, release=1766032510, version=17.1.13)
Feb 01 08:32:29 np0005604215.localdomain podman[88786]: 2026-02-01 08:32:29.974091607 +0000 UTC m=+0.186530048 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20260112.1, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, container_name=logrotate_crond, release=1766032510, config_id=tripleo_step4)
Feb 01 08:32:29 np0005604215.localdomain podman[88787]: 2026-02-01 08:32:29.983531469 +0000 UTC m=+0.193601287 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5)
Feb 01 08:32:29 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:32:29 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:32:30 np0005604215.localdomain podman[88788]: 2026-02-01 08:32:30.02432611 +0000 UTC m=+0.229528148 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, distribution-scope=public, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible)
Feb 01 08:32:30 np0005604215.localdomain podman[88790]: 2026-02-01 08:32:29.979216425 +0000 UTC m=+0.180855342 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container)
Feb 01 08:32:30 np0005604215.localdomain podman[88790]: 2026-02-01 08:32:30.060171448 +0000 UTC m=+0.261810325 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team)
Feb 01 08:32:30 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:32:30 np0005604215.localdomain podman[88800]: 2026-02-01 08:32:30.073454189 +0000 UTC m=+0.271652970 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc.)
Feb 01 08:32:30 np0005604215.localdomain podman[88788]: 2026-02-01 08:32:30.083758057 +0000 UTC m=+0.288960065 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, container_name=iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:32:30 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:32:30 np0005604215.localdomain podman[88800]: 2026-02-01 08:32:30.130711469 +0000 UTC m=+0.328910230 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, distribution-scope=public, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc.)
Feb 01 08:32:30 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:32:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:32:33 np0005604215.localdomain podman[88903]: 2026-02-01 08:32:33.859400822 +0000 UTC m=+0.074888586 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_id=tripleo_step4, batch=17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 
nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:32:34 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:32:34 np0005604215.localdomain podman[88903]: 2026-02-01 08:32:34.249134742 +0000 UTC m=+0.464622456 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=nova_migration_target, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:32:04Z, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:32:34 np0005604215.localdomain recover_tripleo_nova_virtqemud[88925]: 62016
Feb 01 08:32:34 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:32:34 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:32:34 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:32:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:32:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:32:35 np0005604215.localdomain podman[88928]: 2026-02-01 08:32:35.867622442 +0000 UTC m=+0.084456472 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.buildah.version=1.41.5, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 01 08:32:35 np0005604215.localdomain podman[88929]: 2026-02-01 08:32:35.91900754 +0000 UTC m=+0.132195768 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, batch=17.1_20260112.1, vendor=Red Hat, Inc., container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, config_id=tripleo_step4)
Feb 01 08:32:35 np0005604215.localdomain podman[88928]: 2026-02-01 08:32:35.93773602 +0000 UTC m=+0.154569980 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:32:35 np0005604215.localdomain podman[88929]: 2026-02-01 08:32:35.945969084 +0000 UTC m=+0.159157362 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, architecture=x86_64, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, 
build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true)
Feb 01 08:32:35 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:32:35 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:32:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:32:48 np0005604215.localdomain podman[88998]: 2026-02-01 08:32:48.866744119 +0000 UTC m=+0.081859562 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, 
com.redhat.component=openstack-qdrouterd-container, release=1766032510, vcs-type=git, managed_by=tripleo_ansible, config_id=tripleo_step1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:32:49 np0005604215.localdomain podman[88998]: 2026-02-01 08:32:49.083907944 +0000 UTC m=+0.299023417 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:32:49 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:32:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:32:58 np0005604215.localdomain podman[89027]: 2026-02-01 08:32:58.860117604 +0000 UTC m=+0.074229176 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5)
Feb 01 08:32:58 np0005604215.localdomain podman[89027]: 2026-02-01 08:32:58.867691568 +0000 UTC m=+0.081803110 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:32:58 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:33:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:33:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:33:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:33:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:33:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:33:00 np0005604215.localdomain podman[89051]: 2026-02-01 08:33:00.885591458 +0000 UTC m=+0.089376225 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:33:00 np0005604215.localdomain systemd[1]: tmp-crun.FTXYpK.mount: Deactivated successfully.
Feb 01 08:33:00 np0005604215.localdomain podman[89056]: 2026-02-01 08:33:00.943388624 +0000 UTC m=+0.144031344 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi)
Feb 01 08:33:00 np0005604215.localdomain podman[89051]: 2026-02-01 08:33:00.947800241 +0000 UTC m=+0.151585008 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, release=1766032510)
Feb 01 08:33:00 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:33:00 np0005604215.localdomain podman[89046]: 2026-02-01 08:33:00.995660891 +0000 UTC m=+0.205655760 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:33:01 np0005604215.localdomain podman[89047]: 2026-02-01 08:33:01.034252374 +0000 UTC m=+0.240375093 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, container_name=iscsid, vcs-type=git)
Feb 01 08:33:01 np0005604215.localdomain podman[89046]: 2026-02-01 08:33:01.048973589 +0000 UTC m=+0.258968508 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:33:01 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:33:01 np0005604215.localdomain podman[89047]: 2026-02-01 08:33:01.069669098 +0000 UTC m=+0.275791847 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 01 08:33:01 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:33:01 np0005604215.localdomain podman[89056]: 2026-02-01 08:33:01.105765644 +0000 UTC m=+0.306408404 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, container_name=ceilometer_agent_ipmi, vcs-type=git, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, tcib_managed=true)
Feb 01 08:33:01 np0005604215.localdomain podman[89045]: 2026-02-01 08:33:01.140189049 +0000 UTC m=+0.353404028 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, release=1766032510, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team)
Feb 01 08:33:01 np0005604215.localdomain podman[89045]: 2026-02-01 08:33:01.151599892 +0000 UTC m=+0.364814891 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, container_name=logrotate_crond, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 01 08:33:01 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:33:01 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:33:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:33:04 np0005604215.localdomain systemd[1]: tmp-crun.rSaXxi.mount: Deactivated successfully.
Feb 01 08:33:04 np0005604215.localdomain podman[89160]: 2026-02-01 08:33:04.864495556 +0000 UTC m=+0.084423681 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, 
io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, container_name=nova_migration_target, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5)
Feb 01 08:33:05 np0005604215.localdomain podman[89160]: 2026-02-01 08:33:05.220865124 +0000 UTC m=+0.440793239 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:33:05 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:33:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:33:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:33:06 np0005604215.localdomain podman[89183]: 2026-02-01 08:33:06.858194127 +0000 UTC m=+0.075943918 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 01 08:33:06 np0005604215.localdomain podman[89184]: 2026-02-01 08:33:06.92071635 +0000 UTC m=+0.133005363 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, tcib_managed=true, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:33:06 np0005604215.localdomain podman[89183]: 2026-02-01 08:33:06.940007337 +0000 UTC m=+0.157757108 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, vcs-type=git)
Feb 01 08:33:06 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:33:06 np0005604215.localdomain podman[89184]: 2026-02-01 08:33:06.992704056 +0000 UTC m=+0.204993029 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, version=17.1.13, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, 
maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 01 08:33:07 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:33:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:33:19 np0005604215.localdomain podman[89230]: 2026-02-01 08:33:19.875687352 +0000 UTC m=+0.089533229 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T22:10:14Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git)
Feb 01 08:33:20 np0005604215.localdomain podman[89230]: 2026-02-01 08:33:20.100254266 +0000 UTC m=+0.314100123 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, config_id=tripleo_step1, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc., io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible)
Feb 01 08:33:20 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:33:22 np0005604215.localdomain sudo[89259]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:33:22 np0005604215.localdomain sudo[89259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:33:22 np0005604215.localdomain sudo[89259]: pam_unix(sudo:session): session closed for user root
Feb 01 08:33:22 np0005604215.localdomain sudo[89274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:33:22 np0005604215.localdomain sudo[89274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:33:23 np0005604215.localdomain sudo[89274]: pam_unix(sudo:session): session closed for user root
Feb 01 08:33:24 np0005604215.localdomain sudo[89320]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:33:24 np0005604215.localdomain sudo[89320]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:33:24 np0005604215.localdomain sudo[89320]: pam_unix(sudo:session): session closed for user root
Feb 01 08:33:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:33:29 np0005604215.localdomain systemd[1]: tmp-crun.rUKqON.mount: Deactivated successfully.
Feb 01 08:33:29 np0005604215.localdomain podman[89335]: 2026-02-01 08:33:29.875839815 +0000 UTC m=+0.089475018 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, container_name=collectd, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-type=git, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 01 08:33:29 np0005604215.localdomain podman[89335]: 2026-02-01 08:33:29.889636722 +0000 UTC m=+0.103271905 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, version=17.1.13, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:33:29 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:33:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:33:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:33:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:33:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:33:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:33:31 np0005604215.localdomain podman[89357]: 2026-02-01 08:33:31.87837227 +0000 UTC m=+0.089505509 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, tcib_managed=true, version=17.1.13, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-type=git, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:33:31 np0005604215.localdomain systemd[1]: tmp-crun.wn7vMK.mount: Deactivated successfully.
Feb 01 08:33:31 np0005604215.localdomain podman[89358]: 2026-02-01 08:33:31.928970214 +0000 UTC m=+0.138791873 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:33:31 np0005604215.localdomain podman[89357]: 2026-02-01 08:33:31.936728704 +0000 UTC m=+0.147861943 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, release=1766032510, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, distribution-scope=public)
Feb 01 08:33:31 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:33:31 np0005604215.localdomain podman[89358]: 2026-02-01 08:33:31.967786474 +0000 UTC m=+0.177608153 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, 
container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, version=17.1.13)
Feb 01 08:33:31 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:33:31 np0005604215.localdomain podman[89359]: 2026-02-01 08:33:31.983618994 +0000 UTC m=+0.189462339 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, tcib_managed=true, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container)
Feb 01 08:33:32 np0005604215.localdomain podman[89359]: 2026-02-01 08:33:32.034663672 +0000 UTC m=+0.240507027 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:33:32 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:33:32 np0005604215.localdomain podman[89356]: 2026-02-01 08:33:32.038459659 +0000 UTC m=+0.246540974 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public)
Feb 01 08:33:32 np0005604215.localdomain podman[89360]: 2026-02-01 08:33:32.089030162 +0000 UTC m=+0.292384170 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, tcib_managed=true, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com)
Feb 01 08:33:32 np0005604215.localdomain podman[89356]: 2026-02-01 08:33:32.118922047 +0000 UTC m=+0.327003312 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step4, architecture=x86_64, container_name=logrotate_crond, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team)
Feb 01 08:33:32 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:33:32 np0005604215.localdomain podman[89360]: 2026-02-01 08:33:32.17172459 +0000 UTC m=+0.375078608 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 01 08:33:32 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:33:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:33:35 np0005604215.localdomain podman[89477]: 2026-02-01 08:33:35.873151241 +0000 UTC m=+0.085674700 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, release=1766032510, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:33:36 np0005604215.localdomain podman[89477]: 2026-02-01 08:33:36.244599866 +0000 UTC m=+0.457123295 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, container_name=nova_migration_target, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., architecture=x86_64, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:33:36 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:33:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:33:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:33:37 np0005604215.localdomain systemd[1]: tmp-crun.d8ASk8.mount: Deactivated successfully.
Feb 01 08:33:37 np0005604215.localdomain podman[89500]: 2026-02-01 08:33:37.865238292 +0000 UTC m=+0.083049749 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, distribution-scope=public, architecture=x86_64, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, 
container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container)
Feb 01 08:33:37 np0005604215.localdomain podman[89499]: 2026-02-01 08:33:37.910003036 +0000 UTC m=+0.130175635 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:33:37 np0005604215.localdomain podman[89500]: 2026-02-01 08:33:37.913612537 +0000 UTC m=+0.131424024 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, container_name=ovn_controller, config_id=tripleo_step4)
Feb 01 08:33:37 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:33:37 np0005604215.localdomain podman[89499]: 2026-02-01 08:33:37.957775223 +0000 UTC m=+0.177947872 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, build-date=2026-01-12T22:56:19Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:33:37 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:33:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:33:50 np0005604215.localdomain systemd[1]: tmp-crun.W6r7qe.mount: Deactivated successfully.
Feb 01 08:33:50 np0005604215.localdomain podman[89570]: 2026-02-01 08:33:50.872935503 +0000 UTC m=+0.092754349 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, container_name=metrics_qdr, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:33:51 np0005604215.localdomain podman[89570]: 2026-02-01 08:33:51.086892828 +0000 UTC m=+0.306711664 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_id=tripleo_step1, version=17.1.13, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z)
Feb 01 08:33:51 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:34:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:34:00 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:34:00 np0005604215.localdomain recover_tripleo_nova_virtqemud[89606]: 62016
Feb 01 08:34:00 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:34:00 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:34:00 np0005604215.localdomain systemd[1]: tmp-crun.3I1ySj.mount: Deactivated successfully.
Feb 01 08:34:00 np0005604215.localdomain podman[89600]: 2026-02-01 08:34:00.865424023 +0000 UTC m=+0.076250988 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, tcib_managed=true, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, container_name=collectd, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:34:00 np0005604215.localdomain podman[89600]: 2026-02-01 08:34:00.901353614 +0000 UTC m=+0.112180609 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, container_name=collectd, batch=17.1_20260112.1, vcs-type=git, com.redhat.component=openstack-collectd-container, distribution-scope=public, release=1766032510, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, architecture=x86_64)
Feb 01 08:34:00 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:34:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:34:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:34:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:34:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:34:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:34:02 np0005604215.localdomain podman[89623]: 2026-02-01 08:34:02.886621655 +0000 UTC m=+0.089684604 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, version=17.1.13, vcs-type=git, container_name=nova_compute)
Feb 01 08:34:02 np0005604215.localdomain podman[89623]: 2026-02-01 08:34:02.911618968 +0000 UTC m=+0.114681897 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, config_id=tripleo_step5, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc.)
Feb 01 08:34:02 np0005604215.localdomain systemd[1]: tmp-crun.5qEowd.mount: Deactivated successfully.
Feb 01 08:34:02 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:34:02 np0005604215.localdomain podman[89625]: 2026-02-01 08:34:02.939367976 +0000 UTC m=+0.139186474 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:34:02 np0005604215.localdomain podman[89625]: 2026-02-01 08:34:02.995729749 +0000 UTC m=+0.195548277 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, distribution-scope=public)
Feb 01 08:34:02 np0005604215.localdomain podman[89622]: 2026-02-01 08:34:02.995688247 +0000 UTC m=+0.204058970 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, version=17.1.13, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 01 08:34:03 np0005604215.localdomain podman[89622]: 2026-02-01 08:34:03.032853306 +0000 UTC m=+0.241224029 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, architecture=x86_64, version=17.1.13, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:34:03 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:34:03 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:34:03 np0005604215.localdomain podman[89631]: 2026-02-01 08:34:03.059724047 +0000 UTC m=+0.255081607 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, io.openshift.expose-services=)
Feb 01 08:34:03 np0005604215.localdomain podman[89631]: 2026-02-01 08:34:03.08891474 +0000 UTC m=+0.284272350 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-type=git, release=1766032510, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, version=17.1.13)
Feb 01 08:34:03 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:34:03 np0005604215.localdomain podman[89624]: 2026-02-01 08:34:03.137856303 +0000 UTC m=+0.337515027 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., release=1766032510, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, distribution-scope=public, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, architecture=x86_64, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step3)
Feb 01 08:34:03 np0005604215.localdomain podman[89624]: 2026-02-01 08:34:03.175792996 +0000 UTC m=+0.375451740 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 iscsid, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:34:03 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:34:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:34:06 np0005604215.localdomain podman[89738]: 2026-02-01 08:34:06.862987986 +0000 UTC m=+0.077839068 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack 
Platform 17.1 nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:34:07 np0005604215.localdomain podman[89738]: 2026-02-01 08:34:07.24704728 +0000 UTC m=+0.461898372 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, config_id=tripleo_step4, distribution-scope=public)
Feb 01 08:34:07 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:34:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:34:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:34:08 np0005604215.localdomain podman[89761]: 2026-02-01 08:34:08.8658427 +0000 UTC m=+0.079871430 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:34:08 np0005604215.localdomain podman[89762]: 2026-02-01 08:34:08.918696244 +0000 UTC m=+0.129629548 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, 
com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller)
Feb 01 08:34:08 np0005604215.localdomain podman[89762]: 2026-02-01 08:34:08.946632268 +0000 UTC m=+0.157565572 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-type=git, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.openshift.expose-services=, architecture=x86_64)
Feb 01 08:34:08 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:34:08 np0005604215.localdomain podman[89761]: 2026-02-01 08:34:08.997206422 +0000 UTC m=+0.211235082 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 01 08:34:09 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:34:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:34:21 np0005604215.localdomain podman[89808]: 2026-02-01 08:34:21.905331065 +0000 UTC m=+0.119648681 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, io.buildah.version=1.41.5, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 01 08:34:22 np0005604215.localdomain podman[89808]: 2026-02-01 08:34:22.087197148 +0000 UTC m=+0.301514764 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, container_name=metrics_qdr, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5)
Feb 01 08:34:22 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:34:24 np0005604215.localdomain sudo[89835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:34:24 np0005604215.localdomain sudo[89835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:34:24 np0005604215.localdomain sudo[89835]: pam_unix(sudo:session): session closed for user root
Feb 01 08:34:24 np0005604215.localdomain sudo[89850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 08:34:24 np0005604215.localdomain sudo[89850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:34:25 np0005604215.localdomain podman[89938]: 2026-02-01 08:34:25.351617307 +0000 UTC m=+0.103225683 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, ceph=True, vcs-type=git, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 08:34:25 np0005604215.localdomain podman[89938]: 2026-02-01 08:34:25.451768743 +0000 UTC m=+0.203377109 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, version=7, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git)
Feb 01 08:34:25 np0005604215.localdomain sudo[89850]: pam_unix(sudo:session): session closed for user root
Feb 01 08:34:25 np0005604215.localdomain sudo[90007]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:34:25 np0005604215.localdomain sudo[90007]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:34:25 np0005604215.localdomain sudo[90007]: pam_unix(sudo:session): session closed for user root
Feb 01 08:34:25 np0005604215.localdomain sudo[90022]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:34:25 np0005604215.localdomain sudo[90022]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:34:26 np0005604215.localdomain sudo[90022]: pam_unix(sudo:session): session closed for user root
Feb 01 08:34:27 np0005604215.localdomain sudo[90070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:34:27 np0005604215.localdomain sudo[90070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:34:27 np0005604215.localdomain sudo[90070]: pam_unix(sudo:session): session closed for user root
Feb 01 08:34:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:34:31 np0005604215.localdomain podman[90086]: 2026-02-01 08:34:31.921211086 +0000 UTC m=+0.124052877 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, release=1766032510, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, 
vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true)
Feb 01 08:34:31 np0005604215.localdomain podman[90086]: 2026-02-01 08:34:31.956569079 +0000 UTC m=+0.159410850 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step3, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:34:31 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:34:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:34:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:34:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:34:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:34:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:34:33 np0005604215.localdomain systemd[1]: tmp-crun.h3lRVi.mount: Deactivated successfully.
Feb 01 08:34:33 np0005604215.localdomain podman[90109]: 2026-02-01 08:34:33.869334318 +0000 UTC m=+0.081667277 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:34:33 np0005604215.localdomain podman[90107]: 2026-02-01 08:34:33.913761421 +0000 UTC m=+0.130535436 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, release=1766032510, com.redhat.component=openstack-cron-container, vcs-type=git, name=rhosp-rhel9/openstack-cron, distribution-scope=public, version=17.1.13, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com)
Feb 01 08:34:33 np0005604215.localdomain podman[90108]: 2026-02-01 08:34:33.92373823 +0000 UTC m=+0.134455848 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:34:33 np0005604215.localdomain podman[90107]: 2026-02-01 08:34:33.932655525 +0000 UTC m=+0.149429550 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510, vcs-type=git, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=)
Feb 01 08:34:33 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:34:33 np0005604215.localdomain podman[90108]: 2026-02-01 08:34:33.953599203 +0000 UTC m=+0.164316811 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, release=1766032510, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:34:33 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:34:33 np0005604215.localdomain podman[90109]: 2026-02-01 08:34:33.985332424 +0000 UTC m=+0.197665433 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., version=17.1.13, config_id=tripleo_step3, io.openshift.expose-services=, vcs-type=git)
Feb 01 08:34:33 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:34:34 np0005604215.localdomain podman[90110]: 2026-02-01 08:34:33.885737684 +0000 UTC m=+0.091065766 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, distribution-scope=public, release=1766032510, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, version=17.1.13, container_name=ceilometer_agent_compute, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:34:34 np0005604215.localdomain podman[90110]: 2026-02-01 08:34:34.06603306 +0000 UTC m=+0.271361092 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4)
Feb 01 08:34:34 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:34:34 np0005604215.localdomain podman[90111]: 2026-02-01 08:34:34.074373687 +0000 UTC m=+0.281146464 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, architecture=x86_64, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 01 08:34:34 np0005604215.localdomain podman[90111]: 2026-02-01 08:34:34.095724337 +0000 UTC m=+0.302497124 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Feb 01 08:34:34 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:34:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:34:37 np0005604215.localdomain podman[90225]: 2026-02-01 08:34:37.854651256 +0000 UTC m=+0.067863159 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:34:38 np0005604215.localdomain podman[90225]: 2026-02-01 08:34:38.253829198 +0000 UTC m=+0.467041131 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.13, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, container_name=nova_migration_target, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:34:38 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:34:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:34:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:34:39 np0005604215.localdomain podman[90249]: 2026-02-01 08:34:39.872742041 +0000 UTC m=+0.081916304 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:34:39 np0005604215.localdomain podman[90249]: 2026-02-01 08:34:39.926904186 +0000 UTC m=+0.136078439 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, 
distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:34:39 np0005604215.localdomain systemd[1]: tmp-crun.tXxdHd.mount: Deactivated successfully.
Feb 01 08:34:39 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:34:39 np0005604215.localdomain podman[90250]: 2026-02-01 08:34:39.954075816 +0000 UTC m=+0.156715046 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:36:40Z, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller)
Feb 01 08:34:39 np0005604215.localdomain podman[90250]: 2026-02-01 08:34:39.978546452 +0000 UTC m=+0.181185672 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, url=https://www.redhat.com, container_name=ovn_controller, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 01 08:34:39 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:34:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:34:52 np0005604215.localdomain podman[90320]: 2026-02-01 08:34:52.875677812 +0000 UTC m=+0.087619920 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Feb 01 08:34:53 np0005604215.localdomain podman[90320]: 2026-02-01 08:34:53.095733426 +0000 UTC m=+0.307675524 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git)
Feb 01 08:34:53 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:35:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:35:02 np0005604215.localdomain systemd[1]: tmp-crun.Ygy6M1.mount: Deactivated successfully.
Feb 01 08:35:02 np0005604215.localdomain podman[90349]: 2026-02-01 08:35:02.875388203 +0000 UTC m=+0.091375997 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, config_id=tripleo_step3, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:35:02 np0005604215.localdomain podman[90349]: 2026-02-01 08:35:02.918614199 +0000 UTC m=+0.134602053 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, version=17.1.13, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5)
Feb 01 08:35:02 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:35:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:35:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:35:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:35:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:35:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:35:04 np0005604215.localdomain podman[90369]: 2026-02-01 08:35:04.88805563 +0000 UTC m=+0.099467026 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, release=1766032510, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:35:04 np0005604215.localdomain podman[90369]: 2026-02-01 08:35:04.895195061 +0000 UTC m=+0.106606417 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.buildah.version=1.41.5, vcs-type=git, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, com.redhat.component=openstack-cron-container, io.openshift.expose-services=)
Feb 01 08:35:04 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:35:04 np0005604215.localdomain podman[90372]: 2026-02-01 08:35:04.991375144 +0000 UTC m=+0.191192022 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, distribution-scope=public, release=1766032510, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Feb 01 08:35:05 np0005604215.localdomain podman[90383]: 2026-02-01 08:35:04.955030521 +0000 UTC m=+0.148863944 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 01 08:35:05 np0005604215.localdomain podman[90372]: 2026-02-01 08:35:05.027687207 +0000 UTC m=+0.227504065 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:35:05 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:35:05 np0005604215.localdomain podman[90371]: 2026-02-01 08:35:05.043599019 +0000 UTC m=+0.244868722 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, build-date=2026-01-12T22:34:43Z, release=1766032510, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:35:05 np0005604215.localdomain podman[90371]: 2026-02-01 08:35:05.07726463 +0000 UTC m=+0.278534303 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, url=https://www.redhat.com, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, config_id=tripleo_step3, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, build-date=2026-01-12T22:34:43Z)
Feb 01 08:35:05 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:35:05 np0005604215.localdomain podman[90383]: 2026-02-01 08:35:05.092937774 +0000 UTC m=+0.286771187 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public)
Feb 01 08:35:05 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:35:05 np0005604215.localdomain podman[90370]: 2026-02-01 08:35:05.094909765 +0000 UTC m=+0.302425911 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.expose-services=, vcs-type=git, container_name=nova_compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:35:05 np0005604215.localdomain podman[90370]: 2026-02-01 08:35:05.176132937 +0000 UTC m=+0.383649053 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1766032510)
Feb 01 08:35:05 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:35:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:35:08 np0005604215.localdomain systemd[1]: tmp-crun.oBGu6J.mount: Deactivated successfully.
Feb 01 08:35:08 np0005604215.localdomain podman[90488]: 2026-02-01 08:35:08.879352803 +0000 UTC m=+0.095737801 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, architecture=x86_64, vcs-type=git, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1)
Feb 01 08:35:09 np0005604215.localdomain podman[90488]: 2026-02-01 08:35:09.248066683 +0000 UTC m=+0.464451741 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 
17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, tcib_managed=true, config_id=tripleo_step4, release=1766032510, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:35:09 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:35:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:35:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:35:10 np0005604215.localdomain podman[90512]: 2026-02-01 08:35:10.882789325 +0000 UTC m=+0.084547925 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, version=17.1.13, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, tcib_managed=true, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible)
Feb 01 08:35:10 np0005604215.localdomain podman[90512]: 2026-02-01 08:35:10.916620581 +0000 UTC m=+0.118379171 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, config_id=tripleo_step4, container_name=ovn_controller)
Feb 01 08:35:10 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:35:10 np0005604215.localdomain systemd[1]: tmp-crun.csxpE1.mount: Deactivated successfully.
Feb 01 08:35:10 np0005604215.localdomain podman[90511]: 2026-02-01 08:35:10.956561715 +0000 UTC m=+0.160867214 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, release=1766032510, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, config_id=tripleo_step4, tcib_managed=true)
Feb 01 08:35:10 np0005604215.localdomain podman[90511]: 2026-02-01 08:35:10.993650593 +0000 UTC m=+0.197956092 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=ovn_metadata_agent, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:35:11 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:35:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:35:23 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:35:23 np0005604215.localdomain recover_tripleo_nova_virtqemud[90565]: 62016
Feb 01 08:35:23 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:35:23 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:35:23 np0005604215.localdomain systemd[1]: tmp-crun.ZaWbrn.mount: Deactivated successfully.
Feb 01 08:35:23 np0005604215.localdomain podman[90558]: 2026-02-01 08:35:23.890337642 +0000 UTC m=+0.097710742 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., version=17.1.13, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible)
Feb 01 08:35:24 np0005604215.localdomain podman[90558]: 2026-02-01 08:35:24.102230424 +0000 UTC m=+0.309603474 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, managed_by=tripleo_ansible)
Feb 01 08:35:24 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:35:27 np0005604215.localdomain sudo[90588]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:35:27 np0005604215.localdomain sudo[90588]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:35:27 np0005604215.localdomain sudo[90588]: pam_unix(sudo:session): session closed for user root
Feb 01 08:35:27 np0005604215.localdomain sudo[90603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:35:27 np0005604215.localdomain sudo[90603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:35:28 np0005604215.localdomain sudo[90603]: pam_unix(sudo:session): session closed for user root
Feb 01 08:35:28 np0005604215.localdomain sudo[90650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:35:28 np0005604215.localdomain sudo[90650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:35:28 np0005604215.localdomain sudo[90650]: pam_unix(sudo:session): session closed for user root
Feb 01 08:35:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:35:33 np0005604215.localdomain podman[90665]: 2026-02-01 08:35:33.912437664 +0000 UTC m=+0.124442989 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, tcib_managed=true, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:35:33 np0005604215.localdomain podman[90665]: 2026-02-01 08:35:33.94594949 +0000 UTC m=+0.157954845 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, vcs-type=git, container_name=collectd, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:35:33 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:35:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:35:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:35:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:35:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:35:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:35:35 np0005604215.localdomain systemd[1]: tmp-crun.kyow5R.mount: Deactivated successfully.
Feb 01 08:35:35 np0005604215.localdomain podman[90688]: 2026-02-01 08:35:35.852880629 +0000 UTC m=+0.070882834 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, config_id=tripleo_step3, version=17.1.13, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, batch=17.1_20260112.1)
Feb 01 08:35:35 np0005604215.localdomain podman[90689]: 2026-02-01 08:35:35.870782992 +0000 UTC m=+0.083649758 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, vcs-type=git, io.openshift.expose-services=, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510)
Feb 01 08:35:35 np0005604215.localdomain podman[90689]: 2026-02-01 08:35:35.916587008 +0000 UTC m=+0.129453814 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:35:35 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:35:35 np0005604215.localdomain podman[90690]: 2026-02-01 08:35:35.928851597 +0000 UTC m=+0.136144650 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1766032510, distribution-scope=public, 
architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 01 08:35:35 np0005604215.localdomain podman[90688]: 2026-02-01 08:35:35.935330547 +0000 UTC m=+0.153332772 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, config_id=tripleo_step3, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container)
Feb 01 08:35:35 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:35:35 np0005604215.localdomain podman[90690]: 2026-02-01 08:35:35.946267746 +0000 UTC m=+0.153560799 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13)
Feb 01 08:35:35 np0005604215.localdomain podman[90687]: 2026-02-01 08:35:35.906175546 +0000 UTC m=+0.120919410 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, tcib_managed=true, distribution-scope=public, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1)
Feb 01 08:35:35 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:35:36 np0005604215.localdomain podman[90686]: 2026-02-01 08:35:36.005400394 +0000 UTC m=+0.223131400 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, version=17.1.13, architecture=x86_64, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:35:36 np0005604215.localdomain podman[90686]: 2026-02-01 08:35:36.033564125 +0000 UTC m=+0.251295141 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, container_name=logrotate_crond, distribution-scope=public)
Feb 01 08:35:36 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:35:36 np0005604215.localdomain podman[90687]: 2026-02-01 08:35:36.041777879 +0000 UTC m=+0.256521733 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, config_id=tripleo_step5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 01 08:35:36 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:35:36 np0005604215.localdomain systemd[1]: tmp-crun.NUXvnC.mount: Deactivated successfully.
Feb 01 08:35:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:35:39 np0005604215.localdomain podman[90802]: 2026-02-01 08:35:39.859586009 +0000 UTC m=+0.079000993 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:35:40 np0005604215.localdomain podman[90802]: 2026-02-01 08:35:40.204616756 +0000 UTC m=+0.424031700 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 01 08:35:40 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:35:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:35:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:35:41 np0005604215.localdomain podman[90824]: 2026-02-01 08:35:41.870234555 +0000 UTC m=+0.085682101 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=17.1.13, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Feb 01 08:35:41 np0005604215.localdomain podman[90825]: 2026-02-01 08:35:41.922357226 +0000 UTC m=+0.135612194 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, container_name=ovn_controller, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, 
cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller)
Feb 01 08:35:41 np0005604215.localdomain podman[90825]: 2026-02-01 08:35:41.945677056 +0000 UTC m=+0.158932064 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 01 08:35:41 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:35:42 np0005604215.localdomain podman[90824]: 2026-02-01 08:35:41.999092919 +0000 UTC m=+0.214540455 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 01 08:35:42 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:35:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:35:54 np0005604215.localdomain podman[90872]: 2026-02-01 08:35:54.876520841 +0000 UTC m=+0.091079717 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:35:55 np0005604215.localdomain podman[90872]: 2026-02-01 08:35:55.099279718 +0000 UTC m=+0.313838574 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible)
Feb 01 08:35:55 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:36:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:36:04 np0005604215.localdomain podman[90901]: 2026-02-01 08:36:04.862759966 +0000 UTC m=+0.080652954 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, distribution-scope=public, 
managed_by=tripleo_ansible, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 01 08:36:04 np0005604215.localdomain podman[90901]: 2026-02-01 08:36:04.872399144 +0000 UTC m=+0.090292112 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, architecture=x86_64, distribution-scope=public, com.redhat.component=openstack-collectd-container, version=17.1.13, vendor=Red Hat, Inc., 
managed_by=tripleo_ansible, vcs-type=git, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, io.openshift.expose-services=, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1)
Feb 01 08:36:04 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:36:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:36:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:36:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:36:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:36:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:36:06 np0005604215.localdomain systemd[1]: tmp-crun.jihlsg.mount: Deactivated successfully.
Feb 01 08:36:06 np0005604215.localdomain podman[90926]: 2026-02-01 08:36:06.864391153 +0000 UTC m=+0.076344452 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, managed_by=tripleo_ansible, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510)
Feb 01 08:36:06 np0005604215.localdomain podman[90926]: 2026-02-01 08:36:06.872340678 +0000 UTC m=+0.084293977 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13)
Feb 01 08:36:06 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:36:06 np0005604215.localdomain podman[90922]: 2026-02-01 08:36:06.881371218 +0000 UTC m=+0.091975545 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.buildah.version=1.41.5, container_name=nova_compute, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, release=1766032510, io.openshift.expose-services=)
Feb 01 08:36:06 np0005604215.localdomain podman[90930]: 2026-02-01 08:36:06.933060676 +0000 UTC m=+0.135350386 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, config_id=tripleo_step4, tcib_managed=true, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team)
Feb 01 08:36:06 np0005604215.localdomain podman[90929]: 2026-02-01 08:36:06.989426829 +0000 UTC m=+0.195663981 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, release=1766032510, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1)
Feb 01 08:36:07 np0005604215.localdomain podman[90922]: 2026-02-01 08:36:07.006875458 +0000 UTC m=+0.217479835 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:36:07 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:36:07 np0005604215.localdomain podman[90929]: 2026-02-01 08:36:07.042870251 +0000 UTC m=+0.249107403 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, version=17.1.13, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510)
Feb 01 08:36:07 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:36:07 np0005604215.localdomain podman[90921]: 2026-02-01 08:36:06.954398236 +0000 UTC m=+0.173901238 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, vcs-type=git, release=1766032510, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:36:07 np0005604215.localdomain podman[90930]: 2026-02-01 08:36:07.062860339 +0000 UTC m=+0.265150009 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:36:07 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:36:07 np0005604215.localdomain podman[90921]: 2026-02-01 08:36:07.090179824 +0000 UTC m=+0.309682786 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, release=1766032510)
Feb 01 08:36:07 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:36:07 np0005604215.localdomain systemd[1]: tmp-crun.lSVAfm.mount: Deactivated successfully.
Feb 01 08:36:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:36:10 np0005604215.localdomain podman[91032]: 2026-02-01 08:36:10.865069617 +0000 UTC m=+0.072803492 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:36:11 np0005604215.localdomain podman[91032]: 2026-02-01 08:36:11.241132724 +0000 UTC m=+0.448866539 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_migration_target, config_id=tripleo_step4, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:36:11 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:36:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:36:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:36:12 np0005604215.localdomain podman[91055]: 2026-02-01 08:36:12.873109 +0000 UTC m=+0.085250377 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, vcs-type=git, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 01 08:36:12 np0005604215.localdomain podman[91055]: 2026-02-01 08:36:12.91967221 +0000 UTC m=+0.131813587 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:36:12 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:36:12 np0005604215.localdomain podman[91056]: 2026-02-01 08:36:12.940790263 +0000 UTC m=+0.149403971 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:36:12 np0005604215.localdomain podman[91056]: 2026-02-01 08:36:12.992181682 +0000 UTC m=+0.200795330 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:36:13 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:36:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:36:25 np0005604215.localdomain podman[91104]: 2026-02-01 08:36:25.865494318 +0000 UTC m=+0.080930224 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, release=1766032510, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:36:26 np0005604215.localdomain podman[91104]: 2026-02-01 08:36:26.061591091 +0000 UTC m=+0.277026937 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, container_name=metrics_qdr, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:36:26 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:36:29 np0005604215.localdomain sudo[91133]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:36:29 np0005604215.localdomain sudo[91133]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:36:29 np0005604215.localdomain sudo[91133]: pam_unix(sudo:session): session closed for user root
Feb 01 08:36:29 np0005604215.localdomain sudo[91148]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:36:29 np0005604215.localdomain sudo[91148]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:36:29 np0005604215.localdomain sudo[91148]: pam_unix(sudo:session): session closed for user root
Feb 01 08:36:30 np0005604215.localdomain sudo[91194]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:36:30 np0005604215.localdomain sudo[91194]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:36:30 np0005604215.localdomain sudo[91194]: pam_unix(sudo:session): session closed for user root
Feb 01 08:36:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:36:35 np0005604215.localdomain podman[91209]: 2026-02-01 08:36:35.859540003 +0000 UTC m=+0.075923998 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, architecture=x86_64, version=17.1.13, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com)
Feb 01 08:36:35 np0005604215.localdomain podman[91209]: 2026-02-01 08:36:35.899801857 +0000 UTC m=+0.116185862 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., config_id=tripleo_step3)
Feb 01 08:36:35 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:36:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:36:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:36:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:36:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:36:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:36:37 np0005604215.localdomain podman[91238]: 2026-02-01 08:36:37.87480216 +0000 UTC m=+0.075715411 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 01 08:36:37 np0005604215.localdomain systemd[1]: tmp-crun.D51DK6.mount: Deactivated successfully.
Feb 01 08:36:37 np0005604215.localdomain podman[91238]: 2026-02-01 08:36:37.929622436 +0000 UTC m=+0.130535677 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:36:37 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:36:37 np0005604215.localdomain podman[91230]: 2026-02-01 08:36:37.974274876 +0000 UTC m=+0.183616148 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-type=git, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, container_name=nova_compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:36:37 np0005604215.localdomain podman[91229]: 2026-02-01 08:36:37.931091361 +0000 UTC m=+0.143336203 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:36:38 np0005604215.localdomain podman[91232]: 2026-02-01 08:36:38.046449977 +0000 UTC m=+0.251242808 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, version=17.1.13, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:36:38 np0005604215.localdomain podman[91230]: 2026-02-01 08:36:38.055651802 +0000 UTC m=+0.264993154 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:36:38 np0005604215.localdomain podman[91229]: 2026-02-01 08:36:38.065535867 +0000 UTC m=+0.277780699 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-type=git, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4)
Feb 01 08:36:38 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:36:38 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:36:38 np0005604215.localdomain podman[91232]: 2026-02-01 08:36:38.078645193 +0000 UTC m=+0.283438024 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:36:38 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:36:38 np0005604215.localdomain podman[91231]: 2026-02-01 08:36:38.136537323 +0000 UTC m=+0.343766600 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, version=17.1.13, distribution-scope=public, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com)
Feb 01 08:36:38 np0005604215.localdomain podman[91231]: 2026-02-01 08:36:38.169199472 +0000 UTC m=+0.376428769 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, 
build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-type=git, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64)
Feb 01 08:36:38 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:36:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:36:41 np0005604215.localdomain podman[91350]: 2026-02-01 08:36:41.860596992 +0000 UTC m=+0.080444228 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, release=1766032510, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.buildah.version=1.41.5, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, maintainer=OpenStack TripleO Team)
Feb 01 08:36:42 np0005604215.localdomain podman[91350]: 2026-02-01 08:36:42.185855009 +0000 UTC m=+0.405702265 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:36:42 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:36:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:36:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:36:43 np0005604215.localdomain podman[91374]: 2026-02-01 08:36:43.882495635 +0000 UTC m=+0.082797091 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:36:43 np0005604215.localdomain systemd[1]: tmp-crun.QqzJfx.mount: Deactivated successfully.
Feb 01 08:36:43 np0005604215.localdomain podman[91373]: 2026-02-01 08:36:43.942380926 +0000 UTC m=+0.145484089 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, tcib_managed=true, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 01 08:36:43 np0005604215.localdomain podman[91374]: 2026-02-01 08:36:43.958466924 +0000 UTC m=+0.158768350 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step4, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:36:43 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:36:43 np0005604215.localdomain podman[91373]: 2026-02-01 08:36:43.98872399 +0000 UTC m=+0.191827483 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, build-date=2026-01-12T22:56:19Z, vcs-type=git, version=17.1.13, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, batch=17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:36:44 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:36:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:36:56 np0005604215.localdomain podman[91421]: 2026-02-01 08:36:56.863848603 +0000 UTC m=+0.070677207 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 
17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., config_id=tripleo_step1, release=1766032510, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:36:57 np0005604215.localdomain podman[91421]: 2026-02-01 08:36:57.065765286 +0000 UTC m=+0.272593890 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1)
Feb 01 08:36:57 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:37:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:37:06 np0005604215.localdomain podman[91450]: 2026-02-01 08:37:06.893197919 +0000 UTC m=+0.106423191 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, release=1766032510, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.openshift.expose-services=)
Feb 01 08:37:06 np0005604215.localdomain podman[91450]: 2026-02-01 08:37:06.9022704 +0000 UTC m=+0.115495702 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step3, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, version=17.1.13, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:37:06 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:37:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:37:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:37:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:37:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:37:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:37:08 np0005604215.localdomain podman[91471]: 2026-02-01 08:37:08.873494897 +0000 UTC m=+0.083325808 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, distribution-scope=public, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:37:08 np0005604215.localdomain podman[91471]: 2026-02-01 08:37:08.931647854 +0000 UTC m=+0.141478695 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, managed_by=tripleo_ansible, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1)
Feb 01 08:37:08 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:37:08 np0005604215.localdomain podman[91470]: 2026-02-01 08:37:08.856581964 +0000 UTC m=+0.072601226 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, vcs-type=git, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., 
build-date=2026-01-12T22:10:15Z, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-cron-container, summary=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1)
Feb 01 08:37:08 np0005604215.localdomain podman[91475]: 2026-02-01 08:37:08.91952621 +0000 UTC m=+0.130627141 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, maintainer=OpenStack TripleO Team)
Feb 01 08:37:08 np0005604215.localdomain podman[91477]: 2026-02-01 08:37:08.97449912 +0000 UTC m=+0.176817568 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, distribution-scope=public, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 01 08:37:08 np0005604215.localdomain podman[91470]: 2026-02-01 08:37:08.989599856 +0000 UTC m=+0.205619128 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, version=17.1.13, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, architecture=x86_64, com.redhat.component=openstack-cron-container, vcs-type=git, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:37:08 np0005604215.localdomain podman[91477]: 2026-02-01 08:37:08.997239673 +0000 UTC m=+0.199558151 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, config_id=tripleo_step4, container_name=ceilometer_agent_compute)
Feb 01 08:37:08 np0005604215.localdomain podman[91483]: 2026-02-01 08:37:08.947270687 +0000 UTC m=+0.147567103 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.buildah.version=1.41.5, config_id=tripleo_step4, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1766032510)
Feb 01 08:37:08 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:37:09 np0005604215.localdomain podman[91475]: 2026-02-01 08:37:09.000675159 +0000 UTC m=+0.211776080 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:37:09 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:37:09 np0005604215.localdomain podman[91483]: 2026-02-01 08:37:09.030691097 +0000 UTC m=+0.230987473 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T23:07:30Z, release=1766032510, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, io.buildah.version=1.41.5, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:37:09 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:37:09 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:37:09 np0005604215.localdomain systemd[1]: tmp-crun.2qulod.mount: Deactivated successfully.
Feb 01 08:37:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:37:12 np0005604215.localdomain podman[91583]: 2026-02-01 08:37:12.867512902 +0000 UTC m=+0.082901963 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container)
Feb 01 08:37:13 np0005604215.localdomain podman[91583]: 2026-02-01 08:37:13.262854616 +0000 UTC m=+0.478243687 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
com.redhat.component=openstack-nova-compute-container, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target)
Feb 01 08:37:13 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:37:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:37:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:37:14 np0005604215.localdomain systemd[1]: tmp-crun.at1JrH.mount: Deactivated successfully.
Feb 01 08:37:14 np0005604215.localdomain podman[91605]: 2026-02-01 08:37:14.879560222 +0000 UTC m=+0.092624265 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:37:14 np0005604215.localdomain podman[91606]: 2026-02-01 08:37:14.931377073 +0000 UTC m=+0.141094443 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, version=17.1.13, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:37:14 np0005604215.localdomain podman[91605]: 2026-02-01 08:37:14.944219611 +0000 UTC m=+0.157283684 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:56:19Z, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:37:14 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:37:14 np0005604215.localdomain podman[91606]: 2026-02-01 08:37:14.960860505 +0000 UTC m=+0.170577915 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5)
Feb 01 08:37:14 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:37:15 np0005604215.localdomain systemd[1]: tmp-crun.3AnrNz.mount: Deactivated successfully.
Feb 01 08:37:20 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:37:20 np0005604215.localdomain recover_tripleo_nova_virtqemud[91655]: 62016
Feb 01 08:37:20 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:37:20 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:37:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:37:27 np0005604215.localdomain podman[91656]: 2026-02-01 08:37:27.866858253 +0000 UTC m=+0.080525751 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., description=Red Hat 
OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible)
Feb 01 08:37:28 np0005604215.localdomain podman[91656]: 2026-02-01 08:37:28.069639023 +0000 UTC m=+0.283306451 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, tcib_managed=true, container_name=metrics_qdr, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, release=1766032510)
Feb 01 08:37:28 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:37:30 np0005604215.localdomain sudo[91685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:37:30 np0005604215.localdomain sudo[91685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:37:30 np0005604215.localdomain sudo[91685]: pam_unix(sudo:session): session closed for user root
Feb 01 08:37:30 np0005604215.localdomain sudo[91700]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:37:30 np0005604215.localdomain sudo[91700]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:37:31 np0005604215.localdomain sudo[91700]: pam_unix(sudo:session): session closed for user root
Feb 01 08:37:32 np0005604215.localdomain sudo[91747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:37:32 np0005604215.localdomain sudo[91747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:37:32 np0005604215.localdomain sudo[91747]: pam_unix(sudo:session): session closed for user root
Feb 01 08:37:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:37:37 np0005604215.localdomain systemd[1]: tmp-crun.mLaVJv.mount: Deactivated successfully.
Feb 01 08:37:37 np0005604215.localdomain podman[91762]: 2026-02-01 08:37:37.877247425 +0000 UTC m=+0.088860859 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, architecture=x86_64, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, 
name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, distribution-scope=public)
Feb 01 08:37:37 np0005604215.localdomain podman[91762]: 2026-02-01 08:37:37.913658111 +0000 UTC m=+0.125271515 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, version=17.1.13, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:37:37 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:37:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:37:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:37:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:37:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:37:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:37:39 np0005604215.localdomain systemd[1]: tmp-crun.MOz4Y2.mount: Deactivated successfully.
Feb 01 08:37:39 np0005604215.localdomain podman[91783]: 2026-02-01 08:37:39.926015548 +0000 UTC m=+0.138830123 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:37:39 np0005604215.localdomain podman[91783]: 2026-02-01 08:37:39.952770486 +0000 UTC m=+0.165585061 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, container_name=nova_compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z)
Feb 01 08:37:39 np0005604215.localdomain podman[91784]: 2026-02-01 08:37:39.90827117 +0000 UTC m=+0.118037891 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible)
Feb 01 08:37:39 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:37:39 np0005604215.localdomain podman[91791]: 2026-02-01 08:37:39.965991605 +0000 UTC m=+0.169376358 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:37:40 np0005604215.localdomain podman[91782]: 2026-02-01 08:37:40.01436572 +0000 UTC m=+0.229654461 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, config_id=tripleo_step4, architecture=x86_64, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.buildah.version=1.41.5, version=17.1.13)
Feb 01 08:37:40 np0005604215.localdomain podman[91791]: 2026-02-01 08:37:40.065735628 +0000 UTC m=+0.269120401 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 01 08:37:40 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:37:40 np0005604215.localdomain podman[91785]: 2026-02-01 08:37:40.082129135 +0000 UTC m=+0.289672327 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, config_id=tripleo_step4, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510)
Feb 01 08:37:40 np0005604215.localdomain podman[91784]: 2026-02-01 08:37:40.091423622 +0000 UTC m=+0.301190403 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step3, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., vcs-type=git)
Feb 01 08:37:40 np0005604215.localdomain podman[91782]: 2026-02-01 08:37:40.098799991 +0000 UTC m=+0.314088712 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., tcib_managed=true, container_name=logrotate_crond, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:37:40 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:37:40 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:37:40 np0005604215.localdomain podman[91785]: 2026-02-01 08:37:40.133818233 +0000 UTC m=+0.341361405 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, distribution-scope=public, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, config_id=tripleo_step4, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z)
Feb 01 08:37:40 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:37:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:37:43 np0005604215.localdomain podman[91900]: 2026-02-01 08:37:43.866250374 +0000 UTC m=+0.084923058 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, release=1766032510, architecture=x86_64, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:37:44 np0005604215.localdomain podman[91900]: 2026-02-01 08:37:44.234076046 +0000 UTC m=+0.452748700 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, vcs-type=git, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:37:44 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:37:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:37:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:37:45 np0005604215.localdomain podman[91925]: 2026-02-01 08:37:45.87403445 +0000 UTC m=+0.087954421 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.buildah.version=1.41.5, architecture=x86_64, tcib_managed=true, version=17.1.13, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible)
Feb 01 08:37:45 np0005604215.localdomain podman[91925]: 2026-02-01 08:37:45.896234776 +0000 UTC m=+0.110154807 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 01 08:37:45 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:37:45 np0005604215.localdomain podman[91924]: 2026-02-01 08:37:45.97783665 +0000 UTC m=+0.195023641 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vendor=Red 
Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git)
Feb 01 08:37:46 np0005604215.localdomain podman[91924]: 2026-02-01 08:37:46.045888804 +0000 UTC m=+0.263075815 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, vcs-type=git, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 01 08:37:46 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:37:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:37:58 np0005604215.localdomain systemd[1]: tmp-crun.PPkcBs.mount: Deactivated successfully.
Feb 01 08:37:58 np0005604215.localdomain podman[91971]: 2026-02-01 08:37:58.864080639 +0000 UTC m=+0.081808921 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, architecture=x86_64, 
batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, container_name=metrics_qdr, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:37:59 np0005604215.localdomain podman[91971]: 2026-02-01 08:37:59.086746393 +0000 UTC m=+0.304474635 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, 
org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com)
Feb 01 08:37:59 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:38:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:38:08 np0005604215.localdomain podman[91999]: 2026-02-01 08:38:08.873154569 +0000 UTC m=+0.086162175 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-collectd-container, tcib_managed=true, build-date=2026-01-12T22:10:15Z, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:38:08 np0005604215.localdomain podman[91999]: 2026-02-01 08:38:08.906717776 +0000 UTC m=+0.119725352 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, distribution-scope=public, release=1766032510, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, vcs-type=git, container_name=collectd, config_id=tripleo_step3)
Feb 01 08:38:08 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:38:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:38:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:38:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:38:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:38:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:38:10 np0005604215.localdomain systemd[1]: tmp-crun.EDmgxT.mount: Deactivated successfully.
Feb 01 08:38:10 np0005604215.localdomain podman[92019]: 2026-02-01 08:38:10.875173507 +0000 UTC m=+0.089505349 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, version=17.1.13, vcs-type=git, name=rhosp-rhel9/openstack-cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, release=1766032510, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:38:10 np0005604215.localdomain podman[92029]: 2026-02-01 08:38:10.884534786 +0000 UTC m=+0.077926520 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, vcs-type=git, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, io.buildah.version=1.41.5, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 01 08:38:10 np0005604215.localdomain podman[92022]: 2026-02-01 08:38:10.936709409 +0000 UTC m=+0.139527225 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid, version=17.1.13, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, summary=Red 
Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z)
Feb 01 08:38:10 np0005604215.localdomain podman[92019]: 2026-02-01 08:38:10.965128408 +0000 UTC m=+0.179460220 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:38:10 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:38:10 np0005604215.localdomain podman[92022]: 2026-02-01 08:38:10.97490562 +0000 UTC m=+0.177723506 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, release=1766032510, container_name=iscsid, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:38:10 np0005604215.localdomain podman[92027]: 2026-02-01 08:38:10.984371323 +0000 UTC m=+0.183677221 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:38:11 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:38:11 np0005604215.localdomain podman[92029]: 2026-02-01 08:38:11.009885311 +0000 UTC m=+0.203277115 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2026-01-12T23:07:30Z, distribution-scope=public, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git)
Feb 01 08:38:11 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:38:11 np0005604215.localdomain podman[92027]: 2026-02-01 08:38:11.041639323 +0000 UTC m=+0.240945241 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, build-date=2026-01-12T23:07:47Z, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, version=17.1.13, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible)
Feb 01 08:38:11 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:38:11 np0005604215.localdomain podman[92020]: 2026-02-01 08:38:11.099697828 +0000 UTC m=+0.304143754 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.openshift.expose-services=)
Feb 01 08:38:11 np0005604215.localdomain podman[92020]: 2026-02-01 08:38:11.153008796 +0000 UTC m=+0.357454632 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible)
Feb 01 08:38:11 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:38:11 np0005604215.localdomain systemd[1]: tmp-crun.HCOEW6.mount: Deactivated successfully.
Feb 01 08:38:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:38:14 np0005604215.localdomain podman[92134]: 2026-02-01 08:38:14.865177778 +0000 UTC m=+0.078181048 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=nova_migration_target, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container)
Feb 01 08:38:15 np0005604215.localdomain podman[92134]: 2026-02-01 08:38:15.249974386 +0000 UTC m=+0.462977456 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., container_name=nova_migration_target, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']})
Feb 01 08:38:15 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:38:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:38:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:38:16 np0005604215.localdomain podman[92159]: 2026-02-01 08:38:16.876733511 +0000 UTC m=+0.086288878 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, release=1766032510, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, container_name=ovn_controller, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, io.openshift.expose-services=, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:38:16 np0005604215.localdomain podman[92159]: 2026-02-01 08:38:16.926546122 +0000 UTC m=+0.136101469 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat 
OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:38:16 np0005604215.localdomain podman[92158]: 2026-02-01 08:38:16.938390108 +0000 UTC m=+0.149831193 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, 
url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, tcib_managed=true)
Feb 01 08:38:16 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Deactivated successfully.
Feb 01 08:38:17 np0005604215.localdomain podman[92158]: 2026-02-01 08:38:17.000771227 +0000 UTC m=+0.212212272 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, architecture=x86_64, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, vcs-type=git, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Feb 01 08:38:17 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:38:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:38:29 np0005604215.localdomain systemd[1]: tmp-crun.aGNbFq.mount: Deactivated successfully.
Feb 01 08:38:29 np0005604215.localdomain podman[92206]: 2026-02-01 08:38:29.872157825 +0000 UTC m=+0.085735172 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, tcib_managed=true, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, config_id=tripleo_step1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:38:30 np0005604215.localdomain podman[92206]: 2026-02-01 08:38:30.093761416 +0000 UTC m=+0.307338763 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, config_id=tripleo_step1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 01 08:38:30 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:38:32 np0005604215.localdomain sudo[92236]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:38:32 np0005604215.localdomain sudo[92236]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:38:32 np0005604215.localdomain sudo[92236]: pam_unix(sudo:session): session closed for user root
Feb 01 08:38:32 np0005604215.localdomain sudo[92251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:38:32 np0005604215.localdomain sudo[92251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:38:32 np0005604215.localdomain sudo[92251]: pam_unix(sudo:session): session closed for user root
Feb 01 08:38:33 np0005604215.localdomain sudo[92298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:38:33 np0005604215.localdomain sudo[92298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:38:33 np0005604215.localdomain sudo[92298]: pam_unix(sudo:session): session closed for user root
Feb 01 08:38:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:38:39 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:38:39 np0005604215.localdomain recover_tripleo_nova_virtqemud[92318]: 62016
Feb 01 08:38:39 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:38:39 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:38:39 np0005604215.localdomain systemd[1]: tmp-crun.Sx6kj3.mount: Deactivated successfully.
Feb 01 08:38:39 np0005604215.localdomain podman[92313]: 2026-02-01 08:38:39.871672271 +0000 UTC m=+0.086698612 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, config_id=tripleo_step3, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, release=1766032510, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:38:39 np0005604215.localdomain podman[92313]: 2026-02-01 08:38:39.907671364 +0000 UTC m=+0.122697705 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, architecture=x86_64, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible)
Feb 01 08:38:39 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:38:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:38:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:38:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:38:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:38:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:38:41 np0005604215.localdomain podman[92337]: 2026-02-01 08:38:41.869550791 +0000 UTC m=+0.081659816 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, release=1766032510, container_name=nova_compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:38:41 np0005604215.localdomain podman[92338]: 2026-02-01 08:38:41.924660995 +0000 UTC m=+0.131746114 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, architecture=x86_64, vcs-type=git, build-date=2026-01-12T22:34:43Z, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, 
name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc.)
Feb 01 08:38:41 np0005604215.localdomain podman[92338]: 2026-02-01 08:38:41.939608047 +0000 UTC m=+0.146693206 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
batch=17.1_20260112.1, managed_by=tripleo_ansible, container_name=iscsid, tcib_managed=true, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:38:41 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:38:41 np0005604215.localdomain podman[92337]: 2026-02-01 08:38:41.977935802 +0000 UTC m=+0.190044837 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, release=1766032510, maintainer=OpenStack TripleO Team, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:38:41 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:38:41 np0005604215.localdomain podman[92336]: 2026-02-01 08:38:41.990562262 +0000 UTC m=+0.202671107 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, config_id=tripleo_step4, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, release=1766032510)
Feb 01 08:38:42 np0005604215.localdomain podman[92339]: 2026-02-01 08:38:42.032510159 +0000 UTC m=+0.238796984 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, distribution-scope=public, release=1766032510, url=https://www.redhat.com, tcib_managed=true, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_compute, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc.)
Feb 01 08:38:42 np0005604215.localdomain podman[92347]: 2026-02-01 08:38:42.088076687 +0000 UTC m=+0.289835032 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, 
description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:38:42 np0005604215.localdomain podman[92336]: 2026-02-01 08:38:42.111805141 +0000 UTC m=+0.323914066 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, name=rhosp-rhel9/openstack-cron, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, batch=17.1_20260112.1, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public)
Feb 01 08:38:42 np0005604215.localdomain podman[92347]: 2026-02-01 08:38:42.119627963 +0000 UTC m=+0.321386318 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, distribution-scope=public, release=1766032510, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13)
Feb 01 08:38:42 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:38:42 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:38:42 np0005604215.localdomain podman[92339]: 2026-02-01 08:38:42.168220456 +0000 UTC m=+0.374507291 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, container_name=ceilometer_agent_compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 01 08:38:42 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:38:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:38:45 np0005604215.localdomain podman[92451]: 2026-02-01 08:38:45.872814745 +0000 UTC m=+0.089215350 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, tcib_managed=true, batch=17.1_20260112.1, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T23:32:04Z)
Feb 01 08:38:46 np0005604215.localdomain podman[92451]: 2026-02-01 08:38:46.248509891 +0000 UTC m=+0.464910486 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack 
osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=)
Feb 01 08:38:46 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:38:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:38:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:38:47 np0005604215.localdomain podman[92474]: 2026-02-01 08:38:47.880417375 +0000 UTC m=+0.090725607 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510)
Feb 01 08:38:47 np0005604215.localdomain podman[92475]: 2026-02-01 08:38:47.934898809 +0000 UTC m=+0.143183688 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, version=17.1.13, tcib_managed=true)
Feb 01 08:38:47 np0005604215.localdomain podman[92475]: 2026-02-01 08:38:47.958502938 +0000 UTC m=+0.166787877 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com)
Feb 01 08:38:47 np0005604215.localdomain podman[92475]: unhealthy
Feb 01 08:38:47 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:38:47 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:38:47 np0005604215.localdomain podman[92474]: 2026-02-01 08:38:47.986959519 +0000 UTC m=+0.197267761 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, distribution-scope=public, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., release=1766032510, url=https://www.redhat.com, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:38:48 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:39:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:39:00 np0005604215.localdomain systemd[1]: tmp-crun.X7uIUq.mount: Deactivated successfully.
Feb 01 08:39:00 np0005604215.localdomain podman[92525]: 2026-02-01 08:39:00.882521542 +0000 UTC m=+0.100335773 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, release=1766032510, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:39:01 np0005604215.localdomain podman[92525]: 2026-02-01 08:39:01.067095009 +0000 UTC m=+0.284909210 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, tcib_managed=true, container_name=metrics_qdr, build-date=2026-01-12T22:10:14Z, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:39:01 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:39:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:39:10 np0005604215.localdomain podman[92554]: 2026-02-01 08:39:10.858430847 +0000 UTC m=+0.077017163 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, architecture=x86_64, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20260112.1, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.openshift.expose-services=)
Feb 01 08:39:10 np0005604215.localdomain podman[92554]: 2026-02-01 08:39:10.896585457 +0000 UTC m=+0.115171753 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:10:15Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, release=1766032510, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:39:10 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:39:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:39:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:39:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:39:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:39:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:39:12 np0005604215.localdomain systemd[1]: tmp-crun.2QQV8I.mount: Deactivated successfully.
Feb 01 08:39:12 np0005604215.localdomain systemd[1]: tmp-crun.FCpVQq.mount: Deactivated successfully.
Feb 01 08:39:12 np0005604215.localdomain podman[92574]: 2026-02-01 08:39:12.926138725 +0000 UTC m=+0.139796303 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, architecture=x86_64, com.redhat.component=openstack-cron-container, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4)
Feb 01 08:39:12 np0005604215.localdomain podman[92579]: 2026-02-01 08:39:12.901604097 +0000 UTC m=+0.103351467 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 01 08:39:12 np0005604215.localdomain podman[92574]: 2026-02-01 08:39:12.958212768 +0000 UTC m=+0.171870306 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, 
config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, release=1766032510)
Feb 01 08:39:12 np0005604215.localdomain podman[92575]: 2026-02-01 08:39:12.96642119 +0000 UTC m=+0.180348696 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, release=1766032510, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.expose-services=, container_name=nova_compute, architecture=x86_64, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:39:12 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:39:12 np0005604215.localdomain podman[92576]: 2026-02-01 08:39:12.97253689 +0000 UTC m=+0.180488602 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step3, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team)
Feb 01 08:39:12 np0005604215.localdomain podman[92576]: 2026-02-01 08:39:12.981545369 +0000 UTC m=+0.189497101 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, release=1766032510, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step3, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team)
Feb 01 08:39:12 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:39:13 np0005604215.localdomain podman[92588]: 2026-02-01 08:39:13.021337339 +0000 UTC m=+0.223002317 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible)
Feb 01 08:39:13 np0005604215.localdomain podman[92588]: 2026-02-01 08:39:13.070708076 +0000 UTC m=+0.272373054 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, distribution-scope=public, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T23:07:30Z)
Feb 01 08:39:13 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:39:13 np0005604215.localdomain podman[92579]: 2026-02-01 08:39:13.084734679 +0000 UTC m=+0.286482059 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, tcib_managed=true, managed_by=tripleo_ansible, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, vcs-type=git, build-date=2026-01-12T23:07:47Z, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:39:13 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:39:13 np0005604215.localdomain podman[92575]: 2026-02-01 08:39:13.123805097 +0000 UTC m=+0.337732543 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, url=https://www.redhat.com, distribution-scope=public, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, io.buildah.version=1.41.5, architecture=x86_64, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible)
Feb 01 08:39:13 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:39:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:39:16 np0005604215.localdomain podman[92691]: 2026-02-01 08:39:16.848113736 +0000 UTC m=+0.067438256 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1)
Feb 01 08:39:17 np0005604215.localdomain podman[92691]: 2026-02-01 08:39:17.229011503 +0000 UTC m=+0.448335963 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, managed_by=tripleo_ansible, 
tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 01 08:39:17 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:39:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:39:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:39:18 np0005604215.localdomain systemd[1]: tmp-crun.jPjrHv.mount: Deactivated successfully.
Feb 01 08:39:18 np0005604215.localdomain podman[92714]: 2026-02-01 08:39:18.881495835 +0000 UTC m=+0.094268676 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', 
'/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1)
Feb 01 08:39:18 np0005604215.localdomain podman[92714]: 2026-02-01 08:39:18.932733639 +0000 UTC m=+0.145506470 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, architecture=x86_64, release=1766032510, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, config_id=tripleo_step4, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 01 08:39:18 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Deactivated successfully.
Feb 01 08:39:18 np0005604215.localdomain podman[92715]: 2026-02-01 08:39:18.937837807 +0000 UTC m=+0.148260365 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:39:19 np0005604215.localdomain podman[92715]: 2026-02-01 08:39:19.019509261 +0000 UTC m=+0.229931789 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
release=1766032510, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:39:19 np0005604215.localdomain podman[92715]: unhealthy
Feb 01 08:39:19 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:39:19 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:39:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:39:31 np0005604215.localdomain podman[92764]: 2026-02-01 08:39:31.861402267 +0000 UTC m=+0.080252681 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:39:32 np0005604215.localdomain podman[92764]: 2026-02-01 08:39:32.10856086 +0000 UTC m=+0.327411244 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, distribution-scope=public, tcib_managed=true, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:39:32 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:39:33 np0005604215.localdomain sudo[92793]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:39:33 np0005604215.localdomain sudo[92793]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:39:33 np0005604215.localdomain sudo[92793]: pam_unix(sudo:session): session closed for user root
Feb 01 08:39:33 np0005604215.localdomain sudo[92808]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:39:33 np0005604215.localdomain sudo[92808]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:39:34 np0005604215.localdomain sudo[92808]: pam_unix(sudo:session): session closed for user root
Feb 01 08:39:38 np0005604215.localdomain sudo[92853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:39:38 np0005604215.localdomain sudo[92853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:39:38 np0005604215.localdomain sudo[92853]: pam_unix(sudo:session): session closed for user root
Feb 01 08:39:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:39:41 np0005604215.localdomain podman[92868]: 2026-02-01 08:39:41.877554247 +0000 UTC m=+0.095358359 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, distribution-scope=public, architecture=x86_64, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:39:41 np0005604215.localdomain podman[92868]: 2026-02-01 08:39:41.890734344 +0000 UTC m=+0.108538536 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, release=1766032510, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:39:41 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:39:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:39:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:39:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:39:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:39:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:39:43 np0005604215.localdomain podman[92890]: 2026-02-01 08:39:43.886377826 +0000 UTC m=+0.090805389 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible)
Feb 01 08:39:43 np0005604215.localdomain systemd[1]: tmp-crun.i7xMxJ.mount: Deactivated successfully.
Feb 01 08:39:43 np0005604215.localdomain podman[92889]: 2026-02-01 08:39:43.934450312 +0000 UTC m=+0.142927650 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, distribution-scope=public, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true)
Feb 01 08:39:43 np0005604215.localdomain podman[92891]: 2026-02-01 08:39:43.956676009 +0000 UTC m=+0.155881591 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 01 08:39:43 np0005604215.localdomain podman[92889]: 2026-02-01 08:39:43.961522129 +0000 UTC m=+0.169999457 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, container_name=nova_compute, managed_by=tripleo_ansible, tcib_managed=true, architecture=x86_64, config_id=tripleo_step5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510)
Feb 01 08:39:43 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:39:43 np0005604215.localdomain podman[92890]: 2026-02-01 08:39:43.974948803 +0000 UTC m=+0.179376446 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:39:43 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:39:44 np0005604215.localdomain podman[92891]: 2026-02-01 08:39:44.009574845 +0000 UTC m=+0.208780357 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, architecture=x86_64, release=1766032510, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, vcs-type=git, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:39:44 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:39:44 np0005604215.localdomain podman[92910]: 2026-02-01 08:39:44.104783828 +0000 UTC m=+0.298602863 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, container_name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:39:44 np0005604215.localdomain podman[92888]: 2026-02-01 08:39:44.135612601 +0000 UTC m=+0.344238674 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., tcib_managed=true, io.openshift.expose-services=, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:39:44 np0005604215.localdomain podman[92910]: 2026-02-01 08:39:44.153247466 +0000 UTC m=+0.347066481 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4)
Feb 01 08:39:44 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:39:44 np0005604215.localdomain podman[92888]: 2026-02-01 08:39:44.167169857 +0000 UTC m=+0.375795910 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=logrotate_crond, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, 
summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, version=17.1.13)
Feb 01 08:39:44 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:39:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:39:47 np0005604215.localdomain systemd[1]: tmp-crun.IrZHT9.mount: Deactivated successfully.
Feb 01 08:39:47 np0005604215.localdomain podman[93001]: 2026-02-01 08:39:47.871641311 +0000 UTC m=+0.084033439 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, version=17.1.13, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, managed_by=tripleo_ansible)
Feb 01 08:39:48 np0005604215.localdomain podman[93001]: 2026-02-01 08:39:48.241850768 +0000 UTC m=+0.454242876 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, 
config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., distribution-scope=public, version=17.1.13, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git)
Feb 01 08:39:48 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:39:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:39:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:39:49 np0005604215.localdomain systemd[1]: tmp-crun.XmtLre.mount: Deactivated successfully.
Feb 01 08:39:49 np0005604215.localdomain podman[93024]: 2026-02-01 08:39:49.875379704 +0000 UTC m=+0.094870704 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, config_id=tripleo_step4, version=17.1.13, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z)
Feb 01 08:39:49 np0005604215.localdomain podman[93025]: 2026-02-01 08:39:49.922670536 +0000 UTC m=+0.139629288 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=healthy, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, 
io.buildah.version=1.41.5, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 01 08:39:49 np0005604215.localdomain podman[93025]: 2026-02-01 08:39:49.933532161 +0000 UTC m=+0.150490933 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, managed_by=tripleo_ansible, url=https://www.redhat.com, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:39:49 np0005604215.localdomain podman[93025]: unhealthy
Feb 01 08:39:49 np0005604215.localdomain podman[93024]: 2026-02-01 08:39:49.941148977 +0000 UTC m=+0.160639977 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent)
Feb 01 08:39:49 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:39:49 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:39:49 np0005604215.localdomain podman[93024]: unhealthy
Feb 01 08:39:49 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:39:49 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:40:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:40:02 np0005604215.localdomain podman[93063]: 2026-02-01 08:40:02.858796697 +0000 UTC m=+0.076158769 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:40:03 np0005604215.localdomain podman[93063]: 2026-02-01 08:40:03.069945564 +0000 UTC m=+0.287307596 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:40:03 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:40:10 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:40:10 np0005604215.localdomain recover_tripleo_nova_virtqemud[93095]: 62016
Feb 01 08:40:10 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:40:10 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:40:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:40:12 np0005604215.localdomain podman[93096]: 2026-02-01 08:40:12.863140501 +0000 UTC m=+0.081455896 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:40:12 np0005604215.localdomain podman[93096]: 2026-02-01 08:40:12.901632833 +0000 UTC m=+0.119948238 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, tcib_managed=true, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, release=1766032510, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13)
Feb 01 08:40:12 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:40:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:40:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:40:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:40:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:40:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:40:14 np0005604215.localdomain systemd[1]: tmp-crun.6n6nPQ.mount: Deactivated successfully.
Feb 01 08:40:14 np0005604215.localdomain podman[93116]: 2026-02-01 08:40:14.8835971 +0000 UTC m=+0.095096642 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z)
Feb 01 08:40:14 np0005604215.localdomain podman[93117]: 2026-02-01 08:40:14.938147514 +0000 UTC m=+0.145875968 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5)
Feb 01 08:40:14 np0005604215.localdomain podman[93119]: 2026-02-01 08:40:14.90665157 +0000 UTC m=+0.107489418 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, 
org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:40:14 np0005604215.localdomain podman[93119]: 2026-02-01 08:40:14.98637347 +0000 UTC m=+0.187211338 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, distribution-scope=public, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, 
tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Feb 01 08:40:14 np0005604215.localdomain podman[93125]: 2026-02-01 08:40:14.993364329 +0000 UTC m=+0.194472007 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1)
Feb 01 08:40:14 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:40:15 np0005604215.localdomain podman[93118]: 2026-02-01 08:40:15.030402526 +0000 UTC m=+0.234400594 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, config_id=tripleo_step3, io.buildah.version=1.41.5, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., version=17.1.13, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:40:15 np0005604215.localdomain podman[93118]: 2026-02-01 08:40:15.037963842 +0000 UTC m=+0.241962000 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, tcib_managed=true, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13)
Feb 01 08:40:15 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:40:15 np0005604215.localdomain podman[93116]: 2026-02-01 08:40:15.063782259 +0000 UTC m=+0.275281801 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:40:15 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:40:15 np0005604215.localdomain podman[93125]: 2026-02-01 08:40:15.091786824 +0000 UTC m=+0.292894502 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible)
Feb 01 08:40:15 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:40:15 np0005604215.localdomain podman[93117]: 2026-02-01 08:40:15.115034 +0000 UTC m=+0.322762464 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, tcib_managed=true, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:40:15 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:40:15 np0005604215.localdomain systemd[1]: tmp-crun.Xhpduj.mount: Deactivated successfully.
Feb 01 08:40:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:40:18 np0005604215.localdomain podman[93233]: 2026-02-01 08:40:18.847964716 +0000 UTC m=+0.066614471 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
nova-compute, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:40:19 np0005604215.localdomain podman[93233]: 2026-02-01 08:40:19.215748806 +0000 UTC m=+0.434398551 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc.)
Feb 01 08:40:19 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:40:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:40:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:40:20 np0005604215.localdomain systemd[1]: tmp-crun.qxzhzI.mount: Deactivated successfully.
Feb 01 08:40:20 np0005604215.localdomain podman[93258]: 2026-02-01 08:40:20.876426605 +0000 UTC m=+0.090363494 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, container_name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 
'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510)
Feb 01 08:40:20 np0005604215.localdomain podman[93259]: 2026-02-01 08:40:20.919053246 +0000 UTC m=+0.129812196 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
architecture=x86_64, build-date=2026-01-12T22:36:40Z, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:40:20 np0005604215.localdomain podman[93259]: 2026-02-01 08:40:20.93261072 +0000 UTC m=+0.143369660 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, release=1766032510)
Feb 01 08:40:20 np0005604215.localdomain podman[93259]: unhealthy
Feb 01 08:40:20 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:40:20 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:40:20 np0005604215.localdomain podman[93258]: 2026-02-01 08:40:20.969890844 +0000 UTC m=+0.183827733 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, maintainer=OpenStack TripleO Team, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, managed_by=tripleo_ansible, config_id=tripleo_step4, summary=Red Hat OpenStack 
Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:40:20 np0005604215.localdomain podman[93258]: unhealthy
Feb 01 08:40:20 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:40:20 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:40:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:40:33 np0005604215.localdomain podman[93298]: 2026-02-01 08:40:33.88592374 +0000 UTC m=+0.079973419 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:40:34 np0005604215.localdomain podman[93298]: 2026-02-01 08:40:34.076605017 +0000 UTC m=+0.270654686 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, release=1766032510, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, managed_by=tripleo_ansible)
Feb 01 08:40:34 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:40:38 np0005604215.localdomain sudo[93327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:40:38 np0005604215.localdomain sudo[93327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:40:39 np0005604215.localdomain sudo[93327]: pam_unix(sudo:session): session closed for user root
Feb 01 08:40:39 np0005604215.localdomain sudo[93342]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:40:39 np0005604215.localdomain sudo[93342]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:40:39 np0005604215.localdomain sudo[93342]: pam_unix(sudo:session): session closed for user root
Feb 01 08:40:40 np0005604215.localdomain sudo[93388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:40:40 np0005604215.localdomain sudo[93388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:40:40 np0005604215.localdomain sudo[93388]: pam_unix(sudo:session): session closed for user root
Feb 01 08:40:40 np0005604215.localdomain sudo[93403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 08:40:40 np0005604215.localdomain sudo[93403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:40:40 np0005604215.localdomain sudo[93403]: pam_unix(sudo:session): session closed for user root
Feb 01 08:40:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:40:43 np0005604215.localdomain systemd[1]: tmp-crun.X5JWRr.mount: Deactivated successfully.
Feb 01 08:40:43 np0005604215.localdomain podman[93437]: 2026-02-01 08:40:43.885509516 +0000 UTC m=+0.099562381 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, version=17.1.13, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:40:43 np0005604215.localdomain podman[93437]: 2026-02-01 08:40:43.89620875 +0000 UTC m=+0.110261575 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, 
batch=17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:40:43 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:40:45 np0005604215.localdomain sudo[93458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:40:45 np0005604215.localdomain sudo[93458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:40:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:40:45 np0005604215.localdomain sudo[93458]: pam_unix(sudo:session): session closed for user root
Feb 01 08:40:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:40:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:40:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:40:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:40:45 np0005604215.localdomain systemd[1]: tmp-crun.uIHHeO.mount: Deactivated successfully.
Feb 01 08:40:45 np0005604215.localdomain podman[93483]: 2026-02-01 08:40:45.593155732 +0000 UTC m=+0.087623158 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 01 08:40:45 np0005604215.localdomain podman[93474]: 2026-02-01 08:40:45.629355513 +0000 UTC m=+0.135461583 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute)
Feb 01 08:40:45 np0005604215.localdomain podman[93474]: 2026-02-01 08:40:45.686721535 +0000 UTC m=+0.192827555 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:40:45 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:40:45 np0005604215.localdomain podman[93483]: 2026-02-01 08:40:45.703830949 +0000 UTC m=+0.198298295 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5)
Feb 01 08:40:45 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:40:45 np0005604215.localdomain podman[93476]: 2026-02-01 08:40:45.687475539 +0000 UTC m=+0.186067524 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, architecture=x86_64, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, version=17.1.13, container_name=ceilometer_agent_compute, url=https://www.redhat.com, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public)
Feb 01 08:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 5433 writes, 23K keys, 5433 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5433 writes, 751 syncs, 7.23 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 08:40:45 np0005604215.localdomain podman[93473]: 2026-02-01 08:40:45.791071905 +0000 UTC m=+0.299465296 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond)
Feb 01 08:40:45 np0005604215.localdomain podman[93473]: 2026-02-01 08:40:45.803741091 +0000 UTC m=+0.312134542 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:40:45 np0005604215.localdomain podman[93476]: 2026-02-01 08:40:45.818024637 +0000 UTC m=+0.316616572 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, build-date=2026-01-12T23:07:47Z, architecture=x86_64, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, config_id=tripleo_step4, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:40:45 np0005604215.localdomain podman[93475]: 2026-02-01 08:40:45.841455519 +0000 UTC m=+0.342939565 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, version=17.1.13, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, container_name=iscsid, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid)
Feb 01 08:40:45 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:40:45 np0005604215.localdomain podman[93475]: 2026-02-01 08:40:45.875499853 +0000 UTC m=+0.376983899 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, 
release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, distribution-scope=public, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, tcib_managed=true, vcs-type=git)
Feb 01 08:40:45 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:40:45 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:40:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:40:49 np0005604215.localdomain systemd[1]: tmp-crun.LAy5qt.mount: Deactivated successfully.
Feb 01 08:40:49 np0005604215.localdomain podman[93588]: 2026-02-01 08:40:49.854872608 +0000 UTC m=+0.073035803 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, architecture=x86_64, container_name=nova_migration_target, config_id=tripleo_step4, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, distribution-scope=public, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:40:50 np0005604215.localdomain podman[93588]: 2026-02-01 08:40:50.247563665 +0000 UTC m=+0.465726840 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, container_name=nova_migration_target, batch=17.1_20260112.1, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true)
Feb 01 08:40:50 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 3600.1 total, 600.0 interval
                                                          Cumulative writes: 5223 writes, 23K keys, 5223 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.01 MB/s
                                                          Cumulative WAL: 5223 writes, 658 syncs, 7.94 writes per sync, written: 0.02 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 08:40:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:40:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:40:51 np0005604215.localdomain podman[93613]: 2026-02-01 08:40:51.861849786 +0000 UTC m=+0.079747933 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 
17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, vendor=Red Hat, Inc., release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent)
Feb 01 08:40:51 np0005604215.localdomain podman[93614]: 2026-02-01 08:40:51.880764726 +0000 UTC m=+0.093076239 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, container_name=ovn_controller, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_id=tripleo_step4, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:40:51 np0005604215.localdomain podman[93614]: 2026-02-01 08:40:51.891548563 +0000 UTC m=+0.103860096 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, release=1766032510, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64)
Feb 01 08:40:51 np0005604215.localdomain podman[93614]: unhealthy
Feb 01 08:40:51 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:40:51 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:40:51 np0005604215.localdomain podman[93613]: 2026-02-01 08:40:51.91162504 +0000 UTC m=+0.129523117 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 
'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 01 08:40:51 np0005604215.localdomain podman[93613]: unhealthy
Feb 01 08:40:51 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:40:51 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:41:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:41:04 np0005604215.localdomain podman[93654]: 2026-02-01 08:41:04.871766323 +0000 UTC m=+0.086807643 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, release=1766032510, io.openshift.expose-services=, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, version=17.1.13)
Feb 01 08:41:05 np0005604215.localdomain podman[93654]: 2026-02-01 08:41:05.108847189 +0000 UTC m=+0.323888519 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:41:05 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:41:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:41:14 np0005604215.localdomain systemd[1]: tmp-crun.jE53M8.mount: Deactivated successfully.
Feb 01 08:41:14 np0005604215.localdomain podman[93683]: 2026-02-01 08:41:14.874817017 +0000 UTC m=+0.086731481 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, version=17.1.13, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, container_name=collectd, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 01 08:41:14 np0005604215.localdomain podman[93683]: 2026-02-01 08:41:14.886675598 +0000 UTC m=+0.098590032 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, config_id=tripleo_step3, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git)
Feb 01 08:41:14 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:41:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:41:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:41:15 np0005604215.localdomain podman[93702]: 2026-02-01 08:41:15.855028348 +0000 UTC m=+0.070938917 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z)
Feb 01 08:41:15 np0005604215.localdomain systemd[1]: tmp-crun.RHcz9n.mount: Deactivated successfully.
Feb 01 08:41:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:41:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:41:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:41:15 np0005604215.localdomain podman[93702]: 2026-02-01 08:41:15.91238324 +0000 UTC m=+0.128293879 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, architecture=x86_64, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, tcib_managed=true, build-date=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:41:15 np0005604215.localdomain systemd[1]: tmp-crun.CkNJ70.mount: Deactivated successfully.
Feb 01 08:41:15 np0005604215.localdomain podman[93701]: 2026-02-01 08:41:15.928759732 +0000 UTC m=+0.148434588 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=nova_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, version=17.1.13, io.buildah.version=1.41.5, release=1766032510)
Feb 01 08:41:15 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:41:15 np0005604215.localdomain podman[93742]: 2026-02-01 08:41:15.982765229 +0000 UTC m=+0.068580864 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, container_name=iscsid, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 
17.1 iscsid, version=17.1.13, release=1766032510, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, architecture=x86_64)
Feb 01 08:41:15 np0005604215.localdomain podman[93701]: 2026-02-01 08:41:15.985532385 +0000 UTC m=+0.205207211 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', 
'/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, url=https://www.redhat.com, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:41:15 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:41:16 np0005604215.localdomain podman[93746]: 2026-02-01 08:41:16.06699978 +0000 UTC m=+0.130494308 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-12T23:07:47Z)
Feb 01 08:41:16 np0005604215.localdomain podman[93746]: 2026-02-01 08:41:16.088532623 +0000 UTC m=+0.152027121 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, 
config_id=tripleo_step4, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:41:16 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:41:16 np0005604215.localdomain podman[93742]: 2026-02-01 08:41:16.120480131 +0000 UTC m=+0.206295806 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, com.redhat.component=openstack-iscsid-container)
Feb 01 08:41:16 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:41:16 np0005604215.localdomain podman[93741]: 2026-02-01 08:41:16.041246266 +0000 UTC m=+0.133328747 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:41:16 np0005604215.localdomain podman[93741]: 2026-02-01 08:41:16.171007899 +0000 UTC m=+0.263090400 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.buildah.version=1.41.5, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20260112.1, url=https://www.redhat.com)
Feb 01 08:41:16 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:41:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:41:20 np0005604215.localdomain systemd[1]: tmp-crun.C4WmfC.mount: Deactivated successfully.
Feb 01 08:41:20 np0005604215.localdomain podman[93818]: 2026-02-01 08:41:20.870044887 +0000 UTC m=+0.083025105 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, version=17.1.13, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:41:21 np0005604215.localdomain podman[93818]: 2026-02-01 08:41:21.223337074 +0000 UTC m=+0.436317332 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, url=https://www.redhat.com, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, architecture=x86_64, 
org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, container_name=nova_migration_target, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, config_id=tripleo_step4)
Feb 01 08:41:21 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:41:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:41:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:41:22 np0005604215.localdomain podman[93839]: 2026-02-01 08:41:22.864133431 +0000 UTC m=+0.080020420 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true)
Feb 01 08:41:22 np0005604215.localdomain podman[93839]: 2026-02-01 08:41:22.883707433 +0000 UTC m=+0.099594482 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, batch=17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, managed_by=tripleo_ansible)
Feb 01 08:41:22 np0005604215.localdomain podman[93839]: unhealthy
Feb 01 08:41:22 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:41:22 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:41:22 np0005604215.localdomain podman[93840]: 2026-02-01 08:41:22.972764905 +0000 UTC m=+0.185526736 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, distribution-scope=public, tcib_managed=true, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:41:23 np0005604215.localdomain podman[93840]: 2026-02-01 08:41:23.016814922 +0000 UTC m=+0.229576803 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=ovn_controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-type=git, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., config_id=tripleo_step4)
Feb 01 08:41:23 np0005604215.localdomain podman[93840]: unhealthy
Feb 01 08:41:23 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:41:23 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:41:35 np0005604215.localdomain sshd[93877]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 08:41:35 np0005604215.localdomain sshd[93877]: error: kex_exchange_identification: banner line contains invalid characters
Feb 01 08:41:35 np0005604215.localdomain sshd[93877]: banner exchange: Connection from 160.25.6.19 port 39846: invalid format
Feb 01 08:41:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:41:35 np0005604215.localdomain podman[93878]: 2026-02-01 08:41:35.880767859 +0000 UTC m=+0.088847657 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, architecture=x86_64, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=metrics_qdr, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, version=17.1.13)
Feb 01 08:41:36 np0005604215.localdomain podman[93878]: 2026-02-01 08:41:36.092667098 +0000 UTC m=+0.300746926 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:41:36 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:41:45 np0005604215.localdomain sudo[93908]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:41:45 np0005604215.localdomain sudo[93908]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:41:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:41:45 np0005604215.localdomain sudo[93908]: pam_unix(sudo:session): session closed for user root
Feb 01 08:41:45 np0005604215.localdomain sudo[93929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:41:45 np0005604215.localdomain sudo[93929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:41:45 np0005604215.localdomain podman[93922]: 2026-02-01 08:41:45.707161003 +0000 UTC m=+0.094778252 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, architecture=x86_64, vcs-type=git, url=https://www.redhat.com, container_name=collectd, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd)
Feb 01 08:41:45 np0005604215.localdomain podman[93922]: 2026-02-01 08:41:45.71667275 +0000 UTC m=+0.104290019 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, version=17.1.13)
Feb 01 08:41:45 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:41:46 np0005604215.localdomain sudo[93929]: pam_unix(sudo:session): session closed for user root
Feb 01 08:41:46 np0005604215.localdomain sudo[93990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:41:46 np0005604215.localdomain sudo[93990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:41:46 np0005604215.localdomain sudo[93990]: pam_unix(sudo:session): session closed for user root
Feb 01 08:41:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:41:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:41:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:41:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:41:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:41:46 np0005604215.localdomain sudo[94005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e -- inventory --format=json-pretty --filter-for-batch
Feb 01 08:41:46 np0005604215.localdomain sudo[94005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:41:46 np0005604215.localdomain systemd[1]: tmp-crun.uxtUhY.mount: Deactivated successfully.
Feb 01 08:41:46 np0005604215.localdomain podman[94017]: 2026-02-01 08:41:46.721679237 +0000 UTC m=+0.083552902 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible)
Feb 01 08:41:46 np0005604215.localdomain podman[94017]: 2026-02-01 08:41:46.808618273 +0000 UTC m=+0.170491938 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, architecture=x86_64, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid)
Feb 01 08:41:46 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:41:46 np0005604215.localdomain podman[94032]: 2026-02-01 08:41:46.772815864 +0000 UTC m=+0.132011595 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:41:46 np0005604215.localdomain podman[94022]: 2026-02-01 08:41:46.752340455 +0000 UTC m=+0.111789114 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:41:46 np0005604215.localdomain podman[94032]: 2026-02-01 08:41:46.856617902 +0000 UTC m=+0.215813593 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, 
tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, url=https://www.redhat.com)
Feb 01 08:41:46 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:41:46 np0005604215.localdomain podman[94022]: 2026-02-01 08:41:46.88569026 +0000 UTC m=+0.245138869 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, distribution-scope=public, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container)
Feb 01 08:41:46 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:41:46 np0005604215.localdomain podman[94014]: 2026-02-01 08:41:46.705497271 +0000 UTC m=+0.077484472 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, tcib_managed=true, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5)
Feb 01 08:41:46 np0005604215.localdomain podman[94014]: 2026-02-01 08:41:46.936324801 +0000 UTC m=+0.308311992 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, io.openshift.expose-services=, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:41:46 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:41:46 np0005604215.localdomain podman[94011]: 2026-02-01 08:41:46.979505011 +0000 UTC m=+0.350393417 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, vcs-type=git, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, tcib_managed=true, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:41:46 np0005604215.localdomain podman[94011]: 2026-02-01 08:41:46.985019543 +0000 UTC m=+0.355907969 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, 
name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-cron-container, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:41:46 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:41:47 np0005604215.localdomain podman[94177]: 
Feb 01 08:41:47 np0005604215.localdomain podman[94177]: 2026-02-01 08:41:47.282741014 +0000 UTC m=+0.071355051 container create a99ccce78b246bc26f1d4f4c59557a761a541ce8ada40c57a8ef8a7b8e181942 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_proskuriakova, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.expose-services=, release=1764794109, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 08:41:47 np0005604215.localdomain systemd[1]: Started libpod-conmon-a99ccce78b246bc26f1d4f4c59557a761a541ce8ada40c57a8ef8a7b8e181942.scope.
Feb 01 08:41:47 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:41:47 np0005604215.localdomain podman[94177]: 2026-02-01 08:41:47.25509553 +0000 UTC m=+0.043709597 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 08:41:47 np0005604215.localdomain podman[94177]: 2026-02-01 08:41:47.354985241 +0000 UTC m=+0.143599278 container init a99ccce78b246bc26f1d4f4c59557a761a541ce8ada40c57a8ef8a7b8e181942 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_proskuriakova, ceph=True, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:41:47 np0005604215.localdomain podman[94177]: 2026-02-01 08:41:47.363695443 +0000 UTC m=+0.152309460 container start a99ccce78b246bc26f1d4f4c59557a761a541ce8ada40c57a8ef8a7b8e181942 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_proskuriakova, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, architecture=x86_64, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, release=1764794109, io.buildah.version=1.41.4, version=7, CEPH_POINT_RELEASE=)
Feb 01 08:41:47 np0005604215.localdomain podman[94177]: 2026-02-01 08:41:47.363872658 +0000 UTC m=+0.152486705 container attach a99ccce78b246bc26f1d4f4c59557a761a541ce8ada40c57a8ef8a7b8e181942 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_proskuriakova, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, build-date=2025-12-08T17:28:53Z, vcs-type=git, ceph=True, com.redhat.component=rhceph-container, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 08:41:47 np0005604215.localdomain kind_proskuriakova[94192]: 167 167
Feb 01 08:41:47 np0005604215.localdomain systemd[1]: libpod-a99ccce78b246bc26f1d4f4c59557a761a541ce8ada40c57a8ef8a7b8e181942.scope: Deactivated successfully.
Feb 01 08:41:47 np0005604215.localdomain podman[94177]: 2026-02-01 08:41:47.368651908 +0000 UTC m=+0.157265975 container died a99ccce78b246bc26f1d4f4c59557a761a541ce8ada40c57a8ef8a7b8e181942 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_proskuriakova, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, vcs-type=git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 08:41:47 np0005604215.localdomain podman[94198]: 2026-02-01 08:41:47.462332984 +0000 UTC m=+0.080681272 container remove a99ccce78b246bc26f1d4f4c59557a761a541ce8ada40c57a8ef8a7b8e181942 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_proskuriakova, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_CLEAN=True, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, version=7)
Feb 01 08:41:47 np0005604215.localdomain systemd[1]: libpod-conmon-a99ccce78b246bc26f1d4f4c59557a761a541ce8ada40c57a8ef8a7b8e181942.scope: Deactivated successfully.
Feb 01 08:41:47 np0005604215.localdomain podman[94221]: 
Feb 01 08:41:47 np0005604215.localdomain podman[94221]: 2026-02-01 08:41:47.660524716 +0000 UTC m=+0.076349386 container create eaa9ed93ad359bd12a9b2a7ee2014bc26cca34fa0c383688db1c86b5649c2447 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_buck, vcs-type=git, com.redhat.component=rhceph-container, version=7, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, release=1764794109, io.openshift.expose-services=)
Feb 01 08:41:47 np0005604215.localdomain systemd[1]: Started libpod-conmon-eaa9ed93ad359bd12a9b2a7ee2014bc26cca34fa0c383688db1c86b5649c2447.scope.
Feb 01 08:41:47 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 08:41:47 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/975ac4e8be7df32fbfc4d681407780e12093dcf7eec7d7f9a44d56f550b7855f/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 08:41:47 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/975ac4e8be7df32fbfc4d681407780e12093dcf7eec7d7f9a44d56f550b7855f/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 08:41:47 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/975ac4e8be7df32fbfc4d681407780e12093dcf7eec7d7f9a44d56f550b7855f/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 08:41:47 np0005604215.localdomain podman[94221]: 2026-02-01 08:41:47.720969623 +0000 UTC m=+0.136794293 container init eaa9ed93ad359bd12a9b2a7ee2014bc26cca34fa0c383688db1c86b5649c2447 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_buck, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, release=1764794109, io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 08:41:47 np0005604215.localdomain podman[94221]: 2026-02-01 08:41:47.629733824 +0000 UTC m=+0.045558524 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 08:41:47 np0005604215.localdomain podman[94221]: 2026-02-01 08:41:47.733100923 +0000 UTC m=+0.148925583 container start eaa9ed93ad359bd12a9b2a7ee2014bc26cca34fa0c383688db1c86b5649c2447 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_buck, release=1764794109, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 08:41:47 np0005604215.localdomain podman[94221]: 2026-02-01 08:41:47.733542907 +0000 UTC m=+0.149367617 container attach eaa9ed93ad359bd12a9b2a7ee2014bc26cca34fa0c383688db1c86b5649c2447 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_buck, vcs-type=git, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, ceph=True, release=1764794109, build-date=2025-12-08T17:28:53Z, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_CLEAN=True, version=7)
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]: [
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:     {
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:         "available": false,
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:         "ceph_device": false,
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:         "lsm_data": {},
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:         "lvs": [],
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:         "path": "/dev/sr0",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:         "rejected_reasons": [
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "Has a FileSystem",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "Insufficient space (<5GB)"
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:         ],
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:         "sys_api": {
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "actuators": null,
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "device_nodes": "sr0",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "human_readable_size": "482.00 KB",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "id_bus": "ata",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "model": "QEMU DVD-ROM",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "nr_requests": "2",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "partitions": {},
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "path": "/dev/sr0",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "removable": "1",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "rev": "2.5+",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "ro": "0",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "rotational": "1",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "sas_address": "",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "sas_device_handle": "",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "scheduler_mode": "mq-deadline",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "sectors": 0,
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "sectorsize": "2048",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "size": 493568.0,
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "support_discard": "0",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "type": "disk",
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:             "vendor": "QEMU"
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:         }
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]:     }
Feb 01 08:41:48 np0005604215.localdomain festive_buck[94235]: ]
Feb 01 08:41:48 np0005604215.localdomain systemd[1]: libpod-eaa9ed93ad359bd12a9b2a7ee2014bc26cca34fa0c383688db1c86b5649c2447.scope: Deactivated successfully.
Feb 01 08:41:48 np0005604215.localdomain podman[94221]: 2026-02-01 08:41:48.611666809 +0000 UTC m=+1.027491479 container died eaa9ed93ad359bd12a9b2a7ee2014bc26cca34fa0c383688db1c86b5649c2447 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_buck, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, release=1764794109, ceph=True, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-type=git, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 08:41:48 np0005604215.localdomain podman[96012]: 2026-02-01 08:41:48.677928339 +0000 UTC m=+0.060809361 container remove eaa9ed93ad359bd12a9b2a7ee2014bc26cca34fa0c383688db1c86b5649c2447 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_buck, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, name=rhceph, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, ceph=True, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=)
Feb 01 08:41:48 np0005604215.localdomain systemd[1]: libpod-conmon-eaa9ed93ad359bd12a9b2a7ee2014bc26cca34fa0c383688db1c86b5649c2447.scope: Deactivated successfully.
Feb 01 08:41:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-975ac4e8be7df32fbfc4d681407780e12093dcf7eec7d7f9a44d56f550b7855f-merged.mount: Deactivated successfully.
Feb 01 08:41:48 np0005604215.localdomain sudo[94005]: pam_unix(sudo:session): session closed for user root
Feb 01 08:41:49 np0005604215.localdomain sudo[96026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:41:49 np0005604215.localdomain sudo[96026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:41:49 np0005604215.localdomain sudo[96026]: pam_unix(sudo:session): session closed for user root
Feb 01 08:41:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:41:51 np0005604215.localdomain systemd[1]: tmp-crun.jICVzs.mount: Deactivated successfully.
Feb 01 08:41:51 np0005604215.localdomain podman[96041]: 2026-02-01 08:41:51.875739527 +0000 UTC m=+0.086210133 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:41:52 np0005604215.localdomain podman[96041]: 2026-02-01 08:41:52.24870874 +0000 UTC m=+0.459179346 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, batch=17.1_20260112.1, architecture=x86_64, container_name=nova_migration_target, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, version=17.1.13)
Feb 01 08:41:52 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:41:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:41:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:41:53 np0005604215.localdomain podman[96065]: 2026-02-01 08:41:53.854221945 +0000 UTC m=+0.067220971 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step4, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:41:53 np0005604215.localdomain podman[96065]: 2026-02-01 08:41:53.866719695 +0000 UTC m=+0.079718691 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com)
Feb 01 08:41:53 np0005604215.localdomain podman[96065]: unhealthy
Feb 01 08:41:53 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:41:53 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:41:53 np0005604215.localdomain podman[96064]: 2026-02-01 08:41:53.899064716 +0000 UTC m=+0.112043281 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vcs-type=git, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:41:53 np0005604215.localdomain podman[96064]: 2026-02-01 08:41:53.909598025 +0000 UTC m=+0.122576590 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vcs-type=git, batch=17.1_20260112.1, release=1766032510, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:41:53 np0005604215.localdomain podman[96064]: unhealthy
Feb 01 08:41:53 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:41:53 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:42:00 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:42:00 np0005604215.localdomain recover_tripleo_nova_virtqemud[96106]: 62016
Feb 01 08:42:00 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:42:00 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:42:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:42:06 np0005604215.localdomain podman[96107]: 2026-02-01 08:42:06.865182555 +0000 UTC m=+0.082356134 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, config_id=tripleo_step1, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:42:07 np0005604215.localdomain podman[96107]: 2026-02-01 08:42:07.08590983 +0000 UTC m=+0.303083369 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, version=17.1.13, url=https://www.redhat.com, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64)
Feb 01 08:42:07 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:42:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:42:15 np0005604215.localdomain podman[96136]: 2026-02-01 08:42:15.86572596 +0000 UTC m=+0.084161980 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, distribution-scope=public, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, version=17.1.13, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, 
config_id=tripleo_step3, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:42:15 np0005604215.localdomain podman[96136]: 2026-02-01 08:42:15.880800221 +0000 UTC m=+0.099236231 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, distribution-scope=public, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 01 08:42:15 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:42:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:42:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:42:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:42:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:42:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:42:17 np0005604215.localdomain systemd[1]: tmp-crun.gOrlFY.mount: Deactivated successfully.
Feb 01 08:42:17 np0005604215.localdomain podman[96165]: 2026-02-01 08:42:17.876371273 +0000 UTC m=+0.078823364 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, 
cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 01 08:42:17 np0005604215.localdomain podman[96158]: 2026-02-01 08:42:17.88845139 +0000 UTC m=+0.094647938 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step3, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:42:17 np0005604215.localdomain podman[96165]: 2026-02-01 08:42:17.90222074 +0000 UTC m=+0.104672811 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:42:17 np0005604215.localdomain podman[96158]: 2026-02-01 08:42:17.92461923 +0000 UTC m=+0.130815788 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
io.buildah.version=1.41.5, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, managed_by=tripleo_ansible)
Feb 01 08:42:17 np0005604215.localdomain podman[96157]: 2026-02-01 08:42:17.938786082 +0000 UTC m=+0.145801056 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, config_id=tripleo_step5, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13)
Feb 01 08:42:17 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:42:17 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:42:17 np0005604215.localdomain podman[96160]: 2026-02-01 08:42:17.998779246 +0000 UTC m=+0.199721470 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:47Z, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git)
Feb 01 08:42:18 np0005604215.localdomain podman[96157]: 2026-02-01 08:42:18.021668272 +0000 UTC m=+0.228683216 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, version=17.1.13)
Feb 01 08:42:18 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:42:18 np0005604215.localdomain podman[96160]: 2026-02-01 08:42:18.059857504 +0000 UTC m=+0.260799698 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, distribution-scope=public, vcs-type=git)
Feb 01 08:42:18 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:42:18 np0005604215.localdomain podman[96156]: 2026-02-01 08:42:18.103845429 +0000 UTC m=+0.316475558 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., vcs-type=git, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=)
Feb 01 08:42:18 np0005604215.localdomain podman[96156]: 2026-02-01 08:42:18.11091935 +0000 UTC m=+0.323549519 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, 
name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, architecture=x86_64, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, container_name=logrotate_crond)
Feb 01 08:42:18 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:42:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:42:22 np0005604215.localdomain systemd[1]: tmp-crun.p5pZV2.mount: Deactivated successfully.
Feb 01 08:42:22 np0005604215.localdomain podman[96274]: 2026-02-01 08:42:22.862361464 +0000 UTC m=+0.081412594 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_id=tripleo_step4, io.buildah.version=1.41.5, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, vendor=Red Hat, Inc., container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:42:23 np0005604215.localdomain podman[96274]: 2026-02-01 08:42:23.227039917 +0000 UTC m=+0.446091037 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T23:32:04Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, 
cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:42:23 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:42:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:42:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:42:24 np0005604215.localdomain podman[96299]: 2026-02-01 08:42:24.870084315 +0000 UTC m=+0.081071314 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, 
name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, url=https://www.redhat.com, io.openshift.expose-services=)
Feb 01 08:42:24 np0005604215.localdomain podman[96299]: 2026-02-01 08:42:24.889751269 +0000 UTC m=+0.100738298 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, 
build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team)
Feb 01 08:42:24 np0005604215.localdomain podman[96299]: unhealthy
Feb 01 08:42:24 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:42:24 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:42:24 np0005604215.localdomain podman[96298]: 2026-02-01 08:42:24.97713071 +0000 UTC m=+0.188765969 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, release=1766032510, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:42:24 np0005604215.localdomain podman[96298]: 2026-02-01 08:42:24.995764071 +0000 UTC m=+0.207399330 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, distribution-scope=public, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5)
Feb 01 08:42:25 np0005604215.localdomain podman[96298]: unhealthy
Feb 01 08:42:25 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:42:25 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:42:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:42:37 np0005604215.localdomain podman[96335]: 2026-02-01 08:42:37.87510009 +0000 UTC m=+0.090824679 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, url=https://www.redhat.com, io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']})
Feb 01 08:42:38 np0005604215.localdomain podman[96335]: 2026-02-01 08:42:38.07485503 +0000 UTC m=+0.290579619 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, vendor=Red Hat, Inc., tcib_managed=true, distribution-scope=public, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:42:38 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:42:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:42:46 np0005604215.localdomain podman[96365]: 2026-02-01 08:42:46.866954164 +0000 UTC m=+0.080901378 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, architecture=x86_64, vcs-type=git, release=1766032510, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, batch=17.1_20260112.1, tcib_managed=true)
Feb 01 08:42:46 np0005604215.localdomain podman[96365]: 2026-02-01 08:42:46.879817736 +0000 UTC m=+0.093764960 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, vcs-type=git, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, 
managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, tcib_managed=true, container_name=collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:42:46 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:42:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:42:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:42:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:42:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:42:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:42:48 np0005604215.localdomain systemd[1]: tmp-crun.yKkTFu.mount: Deactivated successfully.
Feb 01 08:42:48 np0005604215.localdomain systemd[1]: tmp-crun.wMp4Qf.mount: Deactivated successfully.
Feb 01 08:42:48 np0005604215.localdomain podman[96388]: 2026-02-01 08:42:48.883574382 +0000 UTC m=+0.084904283 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:42:48 np0005604215.localdomain podman[96394]: 2026-02-01 08:42:48.951776584 +0000 UTC m=+0.146434707 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, build-date=2026-01-12T23:07:30Z, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, architecture=x86_64)
Feb 01 08:42:48 np0005604215.localdomain podman[96388]: 2026-02-01 08:42:48.968797845 +0000 UTC m=+0.170127766 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, version=17.1.13, tcib_managed=true, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:42:48 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:42:48 np0005604215.localdomain podman[96385]: 2026-02-01 08:42:48.986372594 +0000 UTC m=+0.196163149 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, distribution-scope=public, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true)
Feb 01 08:42:49 np0005604215.localdomain podman[96385]: 2026-02-01 08:42:49.023779682 +0000 UTC m=+0.233570237 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.expose-services=, batch=17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vendor=Red Hat, Inc., container_name=logrotate_crond, tcib_managed=true, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:42:49 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:42:49 np0005604215.localdomain podman[96386]: 2026-02-01 08:42:49.050649202 +0000 UTC m=+0.258553808 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, release=1766032510, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, distribution-scope=public)
Feb 01 08:42:49 np0005604215.localdomain podman[96394]: 2026-02-01 08:42:49.061717938 +0000 UTC m=+0.256376081 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:42:49 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:42:49 np0005604215.localdomain podman[96386]: 2026-02-01 08:42:49.084694396 +0000 UTC m=+0.292599082 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vendor=Red Hat, Inc., managed_by=tripleo_ansible, release=1766032510, version=17.1.13, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team)
Feb 01 08:42:49 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:42:49 np0005604215.localdomain podman[96387]: 2026-02-01 08:42:48.91356576 +0000 UTC m=+0.117102810 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:42:49 np0005604215.localdomain podman[96387]: 2026-02-01 08:42:49.143812432 +0000 UTC m=+0.347349572 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:42:49 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:42:49 np0005604215.localdomain sudo[96501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:42:49 np0005604215.localdomain sudo[96501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:42:49 np0005604215.localdomain sudo[96501]: pam_unix(sudo:session): session closed for user root
Feb 01 08:42:49 np0005604215.localdomain sudo[96516]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 08:42:49 np0005604215.localdomain sudo[96516]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:42:50 np0005604215.localdomain sudo[96516]: pam_unix(sudo:session): session closed for user root
Feb 01 08:42:50 np0005604215.localdomain sudo[96552]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:42:50 np0005604215.localdomain sudo[96552]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:42:50 np0005604215.localdomain sudo[96552]: pam_unix(sudo:session): session closed for user root
Feb 01 08:42:50 np0005604215.localdomain sudo[96567]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:42:50 np0005604215.localdomain sudo[96567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:42:51 np0005604215.localdomain sudo[96567]: pam_unix(sudo:session): session closed for user root
Feb 01 08:42:51 np0005604215.localdomain sudo[96614]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:42:51 np0005604215.localdomain sudo[96614]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:42:51 np0005604215.localdomain sudo[96614]: pam_unix(sudo:session): session closed for user root
Feb 01 08:42:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:42:53 np0005604215.localdomain podman[96629]: 2026-02-01 08:42:53.962407484 +0000 UTC m=+0.084359847 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, container_name=nova_migration_target, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, architecture=x86_64, release=1766032510, vcs-type=git)
Feb 01 08:42:54 np0005604215.localdomain podman[96629]: 2026-02-01 08:42:54.328740328 +0000 UTC m=+0.450692671 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, tcib_managed=true, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com)
Feb 01 08:42:54 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:42:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:42:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:42:55 np0005604215.localdomain podman[96653]: 2026-02-01 08:42:55.870253714 +0000 UTC m=+0.083324343 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, vcs-type=git, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, container_name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:42:55 np0005604215.localdomain podman[96654]: 2026-02-01 08:42:55.919037929 +0000 UTC m=+0.128552297 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, architecture=x86_64, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, 
cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true)
Feb 01 08:42:55 np0005604215.localdomain podman[96653]: 2026-02-01 08:42:55.938572769 +0000 UTC m=+0.151643428 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, container_name=ovn_metadata_agent, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:42:55 np0005604215.localdomain podman[96653]: unhealthy
Feb 01 08:42:55 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:42:55 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:42:55 np0005604215.localdomain podman[96654]: 2026-02-01 08:42:55.95976043 +0000 UTC m=+0.169274718 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, architecture=x86_64, url=https://www.redhat.com, distribution-scope=public, managed_by=tripleo_ansible, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1)
Feb 01 08:42:55 np0005604215.localdomain podman[96654]: unhealthy
Feb 01 08:42:55 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:42:55 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:43:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:43:08 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:43:08 np0005604215.localdomain recover_tripleo_nova_virtqemud[96700]: 62016
Feb 01 08:43:08 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:43:08 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:43:08 np0005604215.localdomain podman[96693]: 2026-02-01 08:43:08.874562566 +0000 UTC m=+0.089790816 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, vcs-type=git, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:43:09 np0005604215.localdomain podman[96693]: 2026-02-01 08:43:09.09876337 +0000 UTC m=+0.313991550 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team)
Feb 01 08:43:09 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:43:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:43:17 np0005604215.localdomain podman[96724]: 2026-02-01 08:43:17.866513073 +0000 UTC m=+0.082305332 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, name=rhosp-rhel9/openstack-collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, tcib_managed=true, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, release=1766032510, batch=17.1_20260112.1, url=https://www.redhat.com, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:43:17 np0005604215.localdomain podman[96724]: 2026-02-01 08:43:17.881686718 +0000 UTC m=+0.097478947 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, version=17.1.13, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510)
Feb 01 08:43:17 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:43:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:43:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:43:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:43:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:43:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:43:19 np0005604215.localdomain systemd[1]: tmp-crun.MPmNJh.mount: Deactivated successfully.
Feb 01 08:43:19 np0005604215.localdomain podman[96744]: 2026-02-01 08:43:19.881869353 +0000 UTC m=+0.098967343 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, tcib_managed=true, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, vcs-type=git, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, container_name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']})
Feb 01 08:43:19 np0005604215.localdomain podman[96744]: 2026-02-01 08:43:19.911737185 +0000 UTC m=+0.128835145 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, tcib_managed=true, io.buildah.version=1.41.5, container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Feb 01 08:43:19 np0005604215.localdomain podman[96745]: 2026-02-01 08:43:19.923398209 +0000 UTC m=+0.137774784 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, container_name=nova_compute, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, release=1766032510, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:43:19 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:43:19 np0005604215.localdomain podman[96746]: 2026-02-01 08:43:19.980308728 +0000 UTC m=+0.190936815 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, build-date=2026-01-12T22:34:43Z, config_id=tripleo_step3, architecture=x86_64, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, io.buildah.version=1.41.5, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, konflux.additional-tags=17.1.13 
17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:43:19 np0005604215.localdomain podman[96746]: 2026-02-01 08:43:19.988448272 +0000 UTC m=+0.199076359 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat 
OpenStack Platform 17.1 iscsid, architecture=x86_64, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20260112.1, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13)
Feb 01 08:43:19 np0005604215.localdomain podman[96745]: 2026-02-01 08:43:19.997920948 +0000 UTC m=+0.212297583 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, architecture=x86_64, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, tcib_managed=true)
Feb 01 08:43:20 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:43:20 np0005604215.localdomain podman[96747]: 2026-02-01 08:43:20.046552657 +0000 UTC m=+0.252291223 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, managed_by=tripleo_ansible, vcs-type=git, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:43:20 np0005604215.localdomain podman[96753]: 2026-02-01 08:43:19.899416231 +0000 UTC m=+0.102061040 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, release=1766032510)
Feb 01 08:43:20 np0005604215.localdomain podman[96753]: 2026-02-01 08:43:20.086526686 +0000 UTC m=+0.289171455 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, distribution-scope=public, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team)
Feb 01 08:43:20 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:43:20 np0005604215.localdomain podman[96747]: 2026-02-01 08:43:20.107590124 +0000 UTC m=+0.313328660 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, build-date=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 01 08:43:20 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:43:20 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:43:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:43:24 np0005604215.localdomain podman[96861]: 2026-02-01 08:43:24.873454089 +0000 UTC m=+0.083921563 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target)
Feb 01 08:43:25 np0005604215.localdomain podman[96861]: 2026-02-01 08:43:25.265694772 +0000 UTC m=+0.476162286 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_migration_target, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public)
Feb 01 08:43:25 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:43:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:43:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:43:26 np0005604215.localdomain podman[96883]: 2026-02-01 08:43:26.870476375 +0000 UTC m=+0.084015685 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, vcs-type=git, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, container_name=ovn_metadata_agent)
Feb 01 08:43:26 np0005604215.localdomain podman[96883]: 2026-02-01 08:43:26.890731478 +0000 UTC m=+0.104270788 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.buildah.version=1.41.5, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1766032510, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public)
Feb 01 08:43:26 np0005604215.localdomain podman[96883]: unhealthy
Feb 01 08:43:26 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:43:26 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:43:26 np0005604215.localdomain podman[96884]: 2026-02-01 08:43:26.976803107 +0000 UTC m=+0.187863820 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step4, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, io.buildah.version=1.41.5, 
container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team)
Feb 01 08:43:27 np0005604215.localdomain podman[96884]: 2026-02-01 08:43:27.020013947 +0000 UTC m=+0.231074630 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-type=git, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, 
konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:43:27 np0005604215.localdomain podman[96884]: unhealthy
Feb 01 08:43:27 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:43:27 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:43:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:43:39 np0005604215.localdomain systemd[1]: tmp-crun.wr5lnu.mount: Deactivated successfully.
Feb 01 08:43:39 np0005604215.localdomain podman[96924]: 2026-02-01 08:43:39.87613745 +0000 UTC m=+0.092151780 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, 
cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.openshift.expose-services=, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, release=1766032510, version=17.1.13)
Feb 01 08:43:40 np0005604215.localdomain podman[96924]: 2026-02-01 08:43:40.069632735 +0000 UTC m=+0.285647075 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, url=https://www.redhat.com, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64)
Feb 01 08:43:40 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:43:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:43:48 np0005604215.localdomain podman[96953]: 2026-02-01 08:43:48.870606448 +0000 UTC m=+0.085247545 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, io.openshift.expose-services=, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, version=17.1.13, architecture=x86_64, config_id=tripleo_step3, tcib_managed=true)
Feb 01 08:43:48 np0005604215.localdomain podman[96953]: 2026-02-01 08:43:48.904383463 +0000 UTC m=+0.119024570 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step3, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, architecture=x86_64, com.redhat.component=openstack-collectd-container, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, vcs-type=git, url=https://www.redhat.com)
Feb 01 08:43:48 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:43:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:43:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:43:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:43:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:43:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:43:50 np0005604215.localdomain systemd[1]: tmp-crun.GySAk2.mount: Deactivated successfully.
Feb 01 08:43:50 np0005604215.localdomain podman[96975]: 2026-02-01 08:43:50.891312954 +0000 UTC m=+0.096622790 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, build-date=2026-01-12T22:34:43Z, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git)
Feb 01 08:43:50 np0005604215.localdomain podman[96975]: 2026-02-01 08:43:50.925518052 +0000 UTC m=+0.130827818 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, vcs-type=git, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, container_name=iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com)
Feb 01 08:43:50 np0005604215.localdomain podman[96976]: 2026-02-01 08:43:50.934549364 +0000 UTC m=+0.138769225 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, architecture=x86_64, io.buildah.version=1.41.5, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vcs-type=git, url=https://www.redhat.com)
Feb 01 08:43:50 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:43:50 np0005604215.localdomain podman[96974]: 2026-02-01 08:43:50.983636018 +0000 UTC m=+0.193717792 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, config_id=tripleo_step5, tcib_managed=true, build-date=2026-01-12T23:32:04Z)
Feb 01 08:43:50 np0005604215.localdomain podman[96976]: 2026-02-01 08:43:50.988778299 +0000 UTC m=+0.192998190 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 
17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, release=1766032510, vcs-type=git, version=17.1.13, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:43:51 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:43:51 np0005604215.localdomain podman[96974]: 2026-02-01 08:43:51.036813919 +0000 UTC m=+0.246895723 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, release=1766032510, url=https://www.redhat.com, vcs-type=git, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:43:51 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:43:51 np0005604215.localdomain podman[96984]: 2026-02-01 08:43:50.91167724 +0000 UTC m=+0.109468301 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:43:51 np0005604215.localdomain podman[96973]: 2026-02-01 08:43:51.041042621 +0000 UTC m=+0.252703925 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, container_name=logrotate_crond, tcib_managed=true, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, 
io.openshift.expose-services=, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:43:51 np0005604215.localdomain podman[96973]: 2026-02-01 08:43:51.120625778 +0000 UTC m=+0.332287152 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, release=1766032510, architecture=x86_64, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=logrotate_crond)
Feb 01 08:43:51 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:43:51 np0005604215.localdomain podman[96984]: 2026-02-01 08:43:51.144913326 +0000 UTC m=+0.342704417 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, 
konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, url=https://www.redhat.com, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, io.buildah.version=1.41.5, distribution-scope=public, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:43:51 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:43:51 np0005604215.localdomain sudo[97094]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:43:51 np0005604215.localdomain sudo[97094]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:43:51 np0005604215.localdomain sudo[97094]: pam_unix(sudo:session): session closed for user root
Feb 01 08:43:51 np0005604215.localdomain sudo[97109]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:43:51 np0005604215.localdomain sudo[97109]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:43:52 np0005604215.localdomain sudo[97109]: pam_unix(sudo:session): session closed for user root
Feb 01 08:43:53 np0005604215.localdomain sudo[97155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:43:53 np0005604215.localdomain sudo[97155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:43:53 np0005604215.localdomain sudo[97155]: pam_unix(sudo:session): session closed for user root
Feb 01 08:43:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:43:55 np0005604215.localdomain podman[97170]: 2026-02-01 08:43:55.862191183 +0000 UTC m=+0.078913165 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, version=17.1.13, release=1766032510, container_name=nova_migration_target, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=)
Feb 01 08:43:56 np0005604215.localdomain podman[97170]: 2026-02-01 08:43:56.259844416 +0000 UTC m=+0.476566428 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, version=17.1.13, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public)
Feb 01 08:43:56 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:43:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:43:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:43:57 np0005604215.localdomain podman[97196]: 2026-02-01 08:43:57.862270235 +0000 UTC m=+0.080404272 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, io.openshift.expose-services=, container_name=ovn_controller, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, version=17.1.13, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:36:40Z)
Feb 01 08:43:57 np0005604215.localdomain podman[97196]: 2026-02-01 08:43:57.907749106 +0000 UTC m=+0.125883133 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, 
container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:43:57 np0005604215.localdomain systemd[1]: tmp-crun.Gn1MZu.mount: Deactivated successfully.
Feb 01 08:43:57 np0005604215.localdomain podman[97196]: unhealthy
Feb 01 08:43:57 np0005604215.localdomain podman[97195]: 2026-02-01 08:43:57.925314835 +0000 UTC m=+0.144820995 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 01 08:43:57 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:43:57 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:43:57 np0005604215.localdomain podman[97195]: 2026-02-01 08:43:57.968742381 +0000 UTC m=+0.188248521 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 01 08:43:57 np0005604215.localdomain podman[97195]: unhealthy
Feb 01 08:43:57 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:43:57 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:44:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:44:10 np0005604215.localdomain podman[97235]: 2026-02-01 08:44:10.871069348 +0000 UTC m=+0.086723270 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 
17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, release=1766032510, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:44:11 np0005604215.localdomain podman[97235]: 2026-02-01 08:44:11.08878119 +0000 UTC m=+0.304435092 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, release=1766032510, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 
17.1_20260112.1, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, io.openshift.expose-services=)
Feb 01 08:44:11 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:44:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:44:19 np0005604215.localdomain systemd[1]: tmp-crun.U5tYOR.mount: Deactivated successfully.
Feb 01 08:44:19 np0005604215.localdomain podman[97263]: 2026-02-01 08:44:19.880646085 +0000 UTC m=+0.094473632 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, tcib_managed=true, 
io.openshift.expose-services=, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, architecture=x86_64, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:44:19 np0005604215.localdomain podman[97263]: 2026-02-01 08:44:19.916801955 +0000 UTC m=+0.130629462 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, version=17.1.13, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vendor=Red Hat, Inc., container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', 
'/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 01 08:44:19 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:44:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:44:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:44:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:44:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:44:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:44:21 np0005604215.localdomain systemd[1]: tmp-crun.S8yrxP.mount: Deactivated successfully.
Feb 01 08:44:21 np0005604215.localdomain podman[97285]: 2026-02-01 08:44:21.903263482 +0000 UTC m=+0.119309129 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=nova_compute, architecture=x86_64, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, distribution-scope=public, tcib_managed=true)
Feb 01 08:44:21 np0005604215.localdomain podman[97286]: 2026-02-01 08:44:21.866426551 +0000 UTC m=+0.080897038 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, container_name=iscsid, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 01 08:44:21 np0005604215.localdomain podman[97298]: 2026-02-01 08:44:21.92656893 +0000 UTC m=+0.132466509 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:44:21 np0005604215.localdomain podman[97285]: 2026-02-01 08:44:21.96275819 +0000 UTC m=+0.178803847 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, release=1766032510, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, container_name=nova_compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:44:21 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:44:21 np0005604215.localdomain podman[97298]: 2026-02-01 08:44:21.977433948 +0000 UTC m=+0.183331487 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, url=https://www.redhat.com, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:44:21 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:44:22 np0005604215.localdomain podman[97286]: 2026-02-01 08:44:21.999940543 +0000 UTC m=+0.214411020 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:34:43Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, container_name=iscsid, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3)
Feb 01 08:44:22 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:44:22 np0005604215.localdomain podman[97284]: 2026-02-01 08:44:21.970793471 +0000 UTC m=+0.188217380 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, release=1766032510, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron)
Feb 01 08:44:22 np0005604215.localdomain podman[97284]: 2026-02-01 08:44:22.051343318 +0000 UTC m=+0.268767267 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, distribution-scope=public, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:44:22 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:44:22 np0005604215.localdomain podman[97287]: 2026-02-01 08:44:22.136674703 +0000 UTC m=+0.342300514 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, distribution-scope=public, tcib_managed=true, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container)
Feb 01 08:44:22 np0005604215.localdomain podman[97287]: 2026-02-01 08:44:22.170660455 +0000 UTC m=+0.376286296 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5)
Feb 01 08:44:22 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:44:22 np0005604215.localdomain systemd[1]: tmp-crun.oMel2h.mount: Deactivated successfully.
Feb 01 08:44:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:44:26 np0005604215.localdomain systemd[1]: tmp-crun.ocmfza.mount: Deactivated successfully.
Feb 01 08:44:26 np0005604215.localdomain podman[97399]: 2026-02-01 08:44:26.865574223 +0000 UTC m=+0.081625260 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, version=17.1.13)
Feb 01 08:44:27 np0005604215.localdomain podman[97399]: 2026-02-01 08:44:27.253497893 +0000 UTC m=+0.469549000 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:44:27 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:44:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:44:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:44:28 np0005604215.localdomain podman[97421]: 2026-02-01 08:44:28.850183813 +0000 UTC m=+0.064457815 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.expose-services=)
Feb 01 08:44:28 np0005604215.localdomain podman[97421]: 2026-02-01 08:44:28.86607812 +0000 UTC m=+0.080352132 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, architecture=x86_64, url=https://www.redhat.com, config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13)
Feb 01 08:44:28 np0005604215.localdomain podman[97421]: unhealthy
Feb 01 08:44:28 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:44:28 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:44:28 np0005604215.localdomain systemd[1]: tmp-crun.ZkF78R.mount: Deactivated successfully.
Feb 01 08:44:28 np0005604215.localdomain podman[97422]: 2026-02-01 08:44:28.920553851 +0000 UTC m=+0.132854041 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, config_id=tripleo_step4, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:44:28 np0005604215.localdomain podman[97422]: 2026-02-01 08:44:28.955373328 +0000 UTC m=+0.167673478 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, version=17.1.13, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc.)
Feb 01 08:44:28 np0005604215.localdomain podman[97422]: unhealthy
Feb 01 08:44:28 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:44:28 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:44:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:44:41 np0005604215.localdomain podman[97459]: 2026-02-01 08:44:41.86879646 +0000 UTC m=+0.081393363 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, io.openshift.expose-services=, tcib_managed=true, config_id=tripleo_step1, release=1766032510, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 01 08:44:42 np0005604215.localdomain podman[97459]: 2026-02-01 08:44:42.101954254 +0000 UTC m=+0.314551197 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.buildah.version=1.41.5, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team)
Feb 01 08:44:42 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:44:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:44:50 np0005604215.localdomain podman[97488]: 2026-02-01 08:44:50.882756545 +0000 UTC m=+0.098816979 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, architecture=x86_64, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-collectd-container)
Feb 01 08:44:50 np0005604215.localdomain podman[97488]: 2026-02-01 08:44:50.911250225 +0000 UTC m=+0.127310629 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, architecture=x86_64, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:44:50 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:44:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:44:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:44:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:44:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:44:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:44:52 np0005604215.localdomain podman[97519]: 2026-02-01 08:44:52.852366755 +0000 UTC m=+0.064596049 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 01 08:44:52 np0005604215.localdomain systemd[1]: tmp-crun.zlUhAA.mount: Deactivated successfully.
Feb 01 08:44:52 np0005604215.localdomain podman[97510]: 2026-02-01 08:44:52.904220084 +0000 UTC m=+0.118597806 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.buildah.version=1.41.5, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid)
Feb 01 08:44:52 np0005604215.localdomain podman[97508]: 2026-02-01 08:44:52.956230969 +0000 UTC m=+0.174575484 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, url=https://www.redhat.com, managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron)
Feb 01 08:44:52 np0005604215.localdomain podman[97508]: 2026-02-01 08:44:52.964076785 +0000 UTC m=+0.182421340 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, release=1766032510, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 01 08:44:52 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:44:52 np0005604215.localdomain podman[97519]: 2026-02-01 08:44:52.98027745 +0000 UTC m=+0.192506754 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, distribution-scope=public, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:44:52 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:44:53 np0005604215.localdomain podman[97525]: 2026-02-01 08:44:53.06187416 +0000 UTC m=+0.269876543 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, release=1766032510, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, tcib_managed=true, io.openshift.expose-services=)
Feb 01 08:44:53 np0005604215.localdomain podman[97510]: 2026-02-01 08:44:53.087058037 +0000 UTC m=+0.301435779 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container)
Feb 01 08:44:53 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:44:53 np0005604215.localdomain podman[97509]: 2026-02-01 08:44:53.102009463 +0000 UTC m=+0.320506734 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, batch=17.1_20260112.1, release=1766032510, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, managed_by=tripleo_ansible, tcib_managed=true, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:44:53 np0005604215.localdomain podman[97525]: 2026-02-01 08:44:53.111099227 +0000 UTC m=+0.319101560 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, url=https://www.redhat.com, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true)
Feb 01 08:44:53 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:44:53 np0005604215.localdomain podman[97509]: 2026-02-01 08:44:53.128643085 +0000 UTC m=+0.347140376 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, architecture=x86_64, vcs-type=git, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 01 08:44:53 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:44:53 np0005604215.localdomain sudo[97627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:44:53 np0005604215.localdomain sudo[97627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:44:53 np0005604215.localdomain sudo[97627]: pam_unix(sudo:session): session closed for user root
Feb 01 08:44:53 np0005604215.localdomain sudo[97642]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 08:44:53 np0005604215.localdomain sudo[97642]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:44:54 np0005604215.localdomain systemd[1]: tmp-crun.y3oUci.mount: Deactivated successfully.
Feb 01 08:44:54 np0005604215.localdomain podman[97727]: 2026-02-01 08:44:54.302233268 +0000 UTC m=+0.078728731 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public, name=rhceph, vendor=Red Hat, Inc., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 08:44:54 np0005604215.localdomain podman[97727]: 2026-02-01 08:44:54.399084884 +0000 UTC m=+0.175580387 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, version=7)
Feb 01 08:44:54 np0005604215.localdomain sudo[97642]: pam_unix(sudo:session): session closed for user root
Feb 01 08:44:54 np0005604215.localdomain sudo[97796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:44:54 np0005604215.localdomain sudo[97796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:44:54 np0005604215.localdomain sudo[97796]: pam_unix(sudo:session): session closed for user root
Feb 01 08:44:54 np0005604215.localdomain sudo[97811]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:44:54 np0005604215.localdomain sudo[97811]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:44:55 np0005604215.localdomain sudo[97811]: pam_unix(sudo:session): session closed for user root
Feb 01 08:44:56 np0005604215.localdomain sudo[97857]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:44:56 np0005604215.localdomain sudo[97857]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:44:56 np0005604215.localdomain sudo[97857]: pam_unix(sudo:session): session closed for user root
Feb 01 08:44:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:44:57 np0005604215.localdomain podman[97872]: 2026-02-01 08:44:57.875224438 +0000 UTC m=+0.082384935 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vendor=Red Hat, Inc., managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com)
Feb 01 08:44:58 np0005604215.localdomain podman[97872]: 2026-02-01 08:44:58.252275696 +0000 UTC m=+0.459436183 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, 
io.buildah.version=1.41.5, config_id=tripleo_step4, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:44:58 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:44:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:44:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:44:59 np0005604215.localdomain podman[97897]: 2026-02-01 08:44:59.878841001 +0000 UTC m=+0.087840105 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, distribution-scope=public, config_id=tripleo_step4, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=)
Feb 01 08:44:59 np0005604215.localdomain podman[97897]: 2026-02-01 08:44:59.919547832 +0000 UTC m=+0.128546906 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=ovn_controller, 
com.redhat.component=openstack-ovn-controller-container, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64)
Feb 01 08:44:59 np0005604215.localdomain podman[97897]: unhealthy
Feb 01 08:44:59 np0005604215.localdomain podman[97896]: 2026-02-01 08:44:59.929963137 +0000 UTC m=+0.140981274 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, tcib_managed=true, io.openshift.expose-services=, url=https://www.redhat.com, config_id=tripleo_step4, vcs-type=git, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team)
Feb 01 08:44:59 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:44:59 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:44:59 np0005604215.localdomain podman[97896]: 2026-02-01 08:44:59.946787053 +0000 UTC m=+0.157805190 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 01 08:44:59 np0005604215.localdomain podman[97896]: unhealthy
Feb 01 08:44:59 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:44:59 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:45:10 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:45:10 np0005604215.localdomain recover_tripleo_nova_virtqemud[97937]: 62016
Feb 01 08:45:10 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:45:10 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:45:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:45:12 np0005604215.localdomain systemd[1]: tmp-crun.cFEbZb.mount: Deactivated successfully.
Feb 01 08:45:12 np0005604215.localdomain podman[97938]: 2026-02-01 08:45:12.860008461 +0000 UTC m=+0.077362348 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, container_name=metrics_qdr, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., 
vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, version=17.1.13, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, distribution-scope=public)
Feb 01 08:45:13 np0005604215.localdomain podman[97938]: 2026-02-01 08:45:13.048179789 +0000 UTC m=+0.265533606 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vendor=Red Hat, Inc., container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 01 08:45:13 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:45:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:45:21 np0005604215.localdomain podman[97967]: 2026-02-01 08:45:21.865840563 +0000 UTC m=+0.082142837 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_id=tripleo_step3, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, tcib_managed=true, distribution-scope=public, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git)
Feb 01 08:45:21 np0005604215.localdomain podman[97967]: 2026-02-01 08:45:21.877648572 +0000 UTC m=+0.093950836 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, distribution-scope=public, managed_by=tripleo_ansible, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z)
Feb 01 08:45:21 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:45:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:45:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:45:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:45:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:45:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:45:23 np0005604215.localdomain podman[97992]: 2026-02-01 08:45:23.878012963 +0000 UTC m=+0.085932576 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, build-date=2026-01-12T23:07:47Z, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, 
release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:45:23 np0005604215.localdomain podman[97992]: 2026-02-01 08:45:23.908712682 +0000 UTC m=+0.116632295 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, managed_by=tripleo_ansible, architecture=x86_64, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute)
Feb 01 08:45:23 np0005604215.localdomain podman[97989]: 2026-02-01 08:45:23.924810985 +0000 UTC m=+0.141799911 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, tcib_managed=true, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-type=git, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, release=1766032510, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64)
Feb 01 08:45:23 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:45:23 np0005604215.localdomain podman[97989]: 2026-02-01 08:45:23.934531078 +0000 UTC m=+0.151519984 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vcs-type=git, com.redhat.component=openstack-cron-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, release=1766032510, distribution-scope=public, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 01 08:45:23 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:45:23 np0005604215.localdomain podman[97990]: 2026-02-01 08:45:23.975134566 +0000 UTC m=+0.189104218 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', 
'/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute)
Feb 01 08:45:24 np0005604215.localdomain podman[97990]: 2026-02-01 08:45:24.00052397 +0000 UTC m=+0.214493632 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, maintainer=OpenStack TripleO Team, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, distribution-scope=public, tcib_managed=true, vcs-type=git, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:45:24 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:45:24 np0005604215.localdomain podman[97999]: 2026-02-01 08:45:24.076804723 +0000 UTC m=+0.281904928 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, container_name=ceilometer_agent_ipmi, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:45:24 np0005604215.localdomain podman[97999]: 2026-02-01 08:45:24.100459452 +0000 UTC m=+0.305559587 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:45:24 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:45:24 np0005604215.localdomain podman[97991]: 2026-02-01 08:45:24.186387486 +0000 UTC m=+0.396033382 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, config_id=tripleo_step3, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:45:24 np0005604215.localdomain podman[97991]: 2026-02-01 08:45:24.199510327 +0000 UTC m=+0.409156223 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, release=1766032510, build-date=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, 
version=17.1.13, name=rhosp-rhel9/openstack-iscsid, vcs-type=git, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:45:24 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:45:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:45:28 np0005604215.localdomain podman[98107]: 2026-02-01 08:45:28.86160956 +0000 UTC m=+0.078237205 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.expose-services=, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:45:29 np0005604215.localdomain podman[98107]: 2026-02-01 08:45:29.239627409 +0000 UTC m=+0.456255104 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, io.openshift.expose-services=, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:45:29 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:45:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:45:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:45:30 np0005604215.localdomain systemd[1]: tmp-crun.3faqrz.mount: Deactivated successfully.
Feb 01 08:45:30 np0005604215.localdomain podman[98131]: 2026-02-01 08:45:30.883652919 +0000 UTC m=+0.098218289 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, tcib_managed=true, io.buildah.version=1.41.5, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.openshift.expose-services=, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-type=git, config_id=tripleo_step4)
Feb 01 08:45:30 np0005604215.localdomain systemd[1]: tmp-crun.M6Lcy7.mount: Deactivated successfully.
Feb 01 08:45:30 np0005604215.localdomain podman[98132]: 2026-02-01 08:45:30.929205512 +0000 UTC m=+0.141484562 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, distribution-scope=public, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510)
Feb 01 08:45:30 np0005604215.localdomain podman[98131]: 2026-02-01 08:45:30.950717744 +0000 UTC m=+0.165283084 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:45:30 np0005604215.localdomain podman[98131]: unhealthy
Feb 01 08:45:30 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:45:30 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:45:30 np0005604215.localdomain podman[98132]: 2026-02-01 08:45:30.982678602 +0000 UTC m=+0.194957582 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 01 08:45:30 np0005604215.localdomain podman[98132]: unhealthy
Feb 01 08:45:31 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:45:31 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:45:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:45:43 np0005604215.localdomain podman[98171]: 2026-02-01 08:45:43.859816631 +0000 UTC m=+0.075896882 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:45:44 np0005604215.localdomain podman[98171]: 2026-02-01 08:45:44.054856024 +0000 UTC m=+0.270936245 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, release=1766032510, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., container_name=metrics_qdr, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:45:44 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:45:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:45:53 np0005604215.localdomain podman[98200]: 2026-02-01 08:45:53.547876583 +0000 UTC m=+0.081269670 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, architecture=x86_64, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, url=https://www.redhat.com, container_name=collectd, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, config_id=tripleo_step3, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:45:53 np0005604215.localdomain podman[98200]: 2026-02-01 08:45:53.562644344 +0000 UTC m=+0.096037431 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, vendor=Red Hat, Inc., version=17.1.13, container_name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', 
'/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, release=1766032510, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3)
Feb 01 08:45:53 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:45:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:45:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:45:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:45:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:45:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:45:54 np0005604215.localdomain podman[98222]: 2026-02-01 08:45:54.877639385 +0000 UTC m=+0.090311873 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, version=17.1.13, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:45:54 np0005604215.localdomain podman[98221]: 2026-02-01 08:45:54.928736991 +0000 UTC m=+0.144362321 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, version=17.1.13, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, 
vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:45:54 np0005604215.localdomain podman[98226]: 2026-02-01 08:45:54.995392913 +0000 UTC m=+0.199963568 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:45:55 np0005604215.localdomain podman[98226]: 2026-02-01 08:45:55.022560651 +0000 UTC m=+0.227131346 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, distribution-scope=public, tcib_managed=true, release=1766032510, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, url=https://www.redhat.com)
Feb 01 08:45:55 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:45:55 np0005604215.localdomain podman[98235]: 2026-02-01 08:45:55.040794622 +0000 UTC m=+0.243155178 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_ipmi, release=1766032510, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Feb 01 08:45:55 np0005604215.localdomain podman[98223]: 2026-02-01 08:45:55.090231655 +0000 UTC m=+0.299266229 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, maintainer=OpenStack TripleO Team, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vcs-type=git, container_name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:45:55 np0005604215.localdomain podman[98223]: 2026-02-01 08:45:55.103699246 +0000 UTC m=+0.312733870 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, architecture=x86_64, build-date=2026-01-12T22:34:43Z, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:45:55 np0005604215.localdomain podman[98222]: 2026-02-01 08:45:55.110032225 +0000 UTC m=+0.322704653 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, version=17.1.13, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z)
Feb 01 08:45:55 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:45:55 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:45:55 np0005604215.localdomain podman[98235]: 2026-02-01 08:45:55.118557561 +0000 UTC m=+0.320918037 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.buildah.version=1.41.5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:30Z, architecture=x86_64, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi)
Feb 01 08:45:55 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:45:55 np0005604215.localdomain podman[98221]: 2026-02-01 08:45:55.163184645 +0000 UTC m=+0.378809955 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, distribution-scope=public, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.openshift.expose-services=, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:45:55 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:45:55 np0005604215.localdomain systemd[1]: tmp-crun.yA5CZs.mount: Deactivated successfully.
Feb 01 08:45:56 np0005604215.localdomain sudo[98343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:45:56 np0005604215.localdomain sudo[98343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:45:56 np0005604215.localdomain sudo[98343]: pam_unix(sudo:session): session closed for user root
Feb 01 08:45:56 np0005604215.localdomain sudo[98358]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:45:56 np0005604215.localdomain sudo[98358]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:45:57 np0005604215.localdomain sudo[98358]: pam_unix(sudo:session): session closed for user root
Feb 01 08:45:57 np0005604215.localdomain sudo[98405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:45:57 np0005604215.localdomain sudo[98405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:45:57 np0005604215.localdomain sudo[98405]: pam_unix(sudo:session): session closed for user root
Feb 01 08:45:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:45:59 np0005604215.localdomain podman[98420]: 2026-02-01 08:45:59.875010561 +0000 UTC m=+0.090052704 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step4, url=https://www.redhat.com, io.buildah.version=1.41.5)
Feb 01 08:46:00 np0005604215.localdomain podman[98420]: 2026-02-01 08:46:00.257972595 +0000 UTC m=+0.473014708 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.buildah.version=1.41.5, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:46:00 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:46:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:46:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:46:01 np0005604215.localdomain systemd[1]: tmp-crun.4wW9QA.mount: Deactivated successfully.
Feb 01 08:46:01 np0005604215.localdomain podman[98445]: 2026-02-01 08:46:01.889065959 +0000 UTC m=+0.100495269 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, managed_by=tripleo_ansible, distribution-scope=public, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, container_name=ovn_controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:46:01 np0005604215.localdomain podman[98445]: 2026-02-01 08:46:01.932363923 +0000 UTC m=+0.143793183 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vendor=Red Hat, Inc., managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, architecture=x86_64, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, config_id=tripleo_step4, com.redhat.component=openstack-ovn-controller-container)
Feb 01 08:46:01 np0005604215.localdomain podman[98445]: unhealthy
Feb 01 08:46:01 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:46:01 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:46:01 np0005604215.localdomain podman[98444]: 2026-02-01 08:46:01.936616755 +0000 UTC m=+0.150172852 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Feb 01 08:46:02 np0005604215.localdomain podman[98444]: 2026-02-01 08:46:02.019741642 +0000 UTC m=+0.233297749 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., batch=17.1_20260112.1, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:46:02 np0005604215.localdomain podman[98444]: unhealthy
Feb 01 08:46:02 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:46:02 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:46:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:46:14 np0005604215.localdomain podman[98482]: 2026-02-01 08:46:14.867461832 +0000 UTC m=+0.083126916 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, version=17.1.13, container_name=metrics_qdr, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, tcib_managed=true, io.openshift.expose-services=, vendor=Red Hat, Inc., batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:46:15 np0005604215.localdomain podman[98482]: 2026-02-01 08:46:15.058883643 +0000 UTC m=+0.274548717 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, 
com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64)
Feb 01 08:46:15 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:46:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:46:23 np0005604215.localdomain systemd[1]: tmp-crun.DNtOEY.mount: Deactivated successfully.
Feb 01 08:46:23 np0005604215.localdomain podman[98511]: 2026-02-01 08:46:23.867542853 +0000 UTC m=+0.084068567 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, container_name=collectd, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:46:23 np0005604215.localdomain podman[98511]: 2026-02-01 08:46:23.876755951 +0000 UTC m=+0.093281595 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, batch=17.1_20260112.1, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', 
'/sys/fs/cgroup:/sys/fs/cgroup:ro']}, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.component=openstack-collectd-container, distribution-scope=public, tcib_managed=true, container_name=collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:46:23 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:46:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:46:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:46:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:46:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:46:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:46:25 np0005604215.localdomain podman[98533]: 2026-02-01 08:46:25.880022993 +0000 UTC m=+0.091868209 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:46:25 np0005604215.localdomain podman[98536]: 2026-02-01 08:46:25.860854445 +0000 UTC m=+0.068058437 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
release=1766032510, url=https://www.redhat.com, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_compute, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, tcib_managed=true)
Feb 01 08:46:25 np0005604215.localdomain podman[98533]: 2026-02-01 08:46:25.918553487 +0000 UTC m=+0.130398673 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, release=1766032510)
Feb 01 08:46:25 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:46:25 np0005604215.localdomain podman[98542]: 2026-02-01 08:46:25.912383094 +0000 UTC m=+0.114828727 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:46:26 np0005604215.localdomain podman[98534]: 2026-02-01 08:46:25.966342319 +0000 UTC m=+0.175624416 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, container_name=nova_compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:46:26 np0005604215.localdomain podman[98535]: 2026-02-01 08:46:26.024688902 +0000 UTC m=+0.230548962 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, vendor=Red Hat, Inc., vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, version=17.1.13)
Feb 01 08:46:26 np0005604215.localdomain podman[98534]: 2026-02-01 08:46:26.044825331 +0000 UTC m=+0.254107408 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, config_id=tripleo_step5, version=17.1.13, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc.)
Feb 01 08:46:26 np0005604215.localdomain podman[98542]: 2026-02-01 08:46:26.04702193 +0000 UTC m=+0.249467583 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 
17.1 ceilometer-ipmi, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, tcib_managed=true, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:46:26 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:46:26 np0005604215.localdomain podman[98535]: 2026-02-01 08:46:26.061587215 +0000 UTC m=+0.267447205 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, distribution-scope=public, release=1766032510, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, vendor=Red Hat, Inc., batch=17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:46:26 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:46:26 np0005604215.localdomain podman[98536]: 2026-02-01 08:46:26.095974419 +0000 UTC m=+0.303178431 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, tcib_managed=true, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, distribution-scope=public)
Feb 01 08:46:26 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:46:26 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:46:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:46:30 np0005604215.localdomain podman[98645]: 2026-02-01 08:46:30.848985433 +0000 UTC m=+0.068727567 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:46:31 np0005604215.localdomain podman[98645]: 2026-02-01 08:46:31.242798406 +0000 UTC m=+0.462540570 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, io.buildah.version=1.41.5, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team)
Feb 01 08:46:31 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:46:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:46:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:46:32 np0005604215.localdomain podman[98668]: 2026-02-01 08:46:32.860856344 +0000 UTC m=+0.075901902 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, managed_by=tripleo_ansible, config_id=tripleo_step4, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vendor=Red Hat, Inc., io.buildah.version=1.41.5, release=1766032510, distribution-scope=public, container_name=ovn_metadata_agent, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:46:32 np0005604215.localdomain podman[98668]: 2026-02-01 08:46:32.879861078 +0000 UTC m=+0.094906636 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, vendor=Red Hat, Inc., config_id=tripleo_step4, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, managed_by=tripleo_ansible, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:46:32 np0005604215.localdomain podman[98668]: unhealthy
Feb 01 08:46:32 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:46:32 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:46:32 np0005604215.localdomain systemd[1]: tmp-crun.WLwa8I.mount: Deactivated successfully.
Feb 01 08:46:32 np0005604215.localdomain podman[98669]: 2026-02-01 08:46:32.974098042 +0000 UTC m=+0.185645991 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, tcib_managed=true, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, build-date=2026-01-12T22:36:40Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller)
Feb 01 08:46:33 np0005604215.localdomain podman[98669]: 2026-02-01 08:46:33.010373365 +0000 UTC m=+0.221921314 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, managed_by=tripleo_ansible, config_id=tripleo_step4, tcib_managed=true, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, vcs-type=git, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 
17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller)
Feb 01 08:46:33 np0005604215.localdomain podman[98669]: unhealthy
Feb 01 08:46:33 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:46:33 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:46:40 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:46:40 np0005604215.localdomain recover_tripleo_nova_virtqemud[98709]: 62016
Feb 01 08:46:40 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:46:40 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:46:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:46:45 np0005604215.localdomain systemd[1]: tmp-crun.nom87f.mount: Deactivated successfully.
Feb 01 08:46:45 np0005604215.localdomain podman[98710]: 2026-02-01 08:46:45.874511178 +0000 UTC m=+0.092955665 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_id=tripleo_step1, managed_by=tripleo_ansible, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd)
Feb 01 08:46:46 np0005604215.localdomain podman[98710]: 2026-02-01 08:46:46.059502557 +0000 UTC m=+0.277947044 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, release=1766032510, 
summary=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, vcs-type=git, maintainer=OpenStack TripleO Team, tcib_managed=true, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_id=tripleo_step1)
Feb 01 08:46:46 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:46:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:46:55 np0005604215.localdomain podman[98739]: 2026-02-01 08:46:55.29847636 +0000 UTC m=+0.514137873 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, architecture=x86_64, release=1766032510, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5)
Feb 01 08:46:55 np0005604215.localdomain podman[98739]: 2026-02-01 08:46:55.310718022 +0000 UTC m=+0.526379515 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, batch=17.1_20260112.1, container_name=collectd, 
distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Feb 01 08:46:55 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:46:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:46:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:46:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:46:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:46:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:46:56 np0005604215.localdomain podman[98760]: 2026-02-01 08:46:56.898083561 +0000 UTC m=+0.068856642 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_id=tripleo_step5, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', 
'/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:46:56 np0005604215.localdomain podman[98771]: 2026-02-01 08:46:56.955541356 +0000 UTC m=+0.124648895 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, url=https://www.redhat.com, com.redhat.component=openstack-iscsid-container, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, container_name=iscsid, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., distribution-scope=public)
Feb 01 08:46:56 np0005604215.localdomain podman[98771]: 2026-02-01 08:46:56.966481258 +0000 UTC m=+0.135588807 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', 
'/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., distribution-scope=public, tcib_managed=true)
Feb 01 08:46:56 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:46:56 np0005604215.localdomain podman[98759]: 2026-02-01 08:46:56.878477079 +0000 UTC m=+0.091671615 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.buildah.version=1.41.5, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com)
Feb 01 08:46:56 np0005604215.localdomain podman[98773]: 2026-02-01 08:46:56.936219512 +0000 UTC m=+0.097585900 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, release=1766032510)
Feb 01 08:46:57 np0005604215.localdomain podman[98759]: 2026-02-01 08:46:57.011550635 +0000 UTC m=+0.224745171 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, url=https://www.redhat.com, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
summary=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible)
Feb 01 08:46:57 np0005604215.localdomain podman[98773]: 2026-02-01 08:46:57.014507838 +0000 UTC m=+0.175874246 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., release=1766032510, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 01 08:46:57 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:46:57 np0005604215.localdomain podman[98760]: 2026-02-01 08:46:57.031330153 +0000 UTC m=+0.202103264 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, tcib_managed=true, version=17.1.13, vcs-type=git, container_name=nova_compute, release=1766032510, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:46:57 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:46:57 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:46:57 np0005604215.localdomain podman[98772]: 2026-02-01 08:46:57.11890517 +0000 UTC m=+0.283667383 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc.)
Feb 01 08:46:57 np0005604215.localdomain podman[98772]: 2026-02-01 08:46:57.177363935 +0000 UTC m=+0.342126148 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, 
io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, release=1766032510, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:46:57 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:46:57 np0005604215.localdomain sudo[98880]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:46:57 np0005604215.localdomain sudo[98880]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:46:57 np0005604215.localdomain sudo[98880]: pam_unix(sudo:session): session closed for user root
Feb 01 08:46:58 np0005604215.localdomain sudo[98895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:46:58 np0005604215.localdomain sudo[98895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:46:58 np0005604215.localdomain sudo[98895]: pam_unix(sudo:session): session closed for user root
Feb 01 08:46:59 np0005604215.localdomain sudo[98943]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:46:59 np0005604215.localdomain sudo[98943]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:46:59 np0005604215.localdomain sudo[98943]: pam_unix(sudo:session): session closed for user root
Feb 01 08:47:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:47:01 np0005604215.localdomain podman[98958]: 2026-02-01 08:47:01.899283118 +0000 UTC m=+0.112907008 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:47:02 np0005604215.localdomain podman[98958]: 2026-02-01 08:47:02.269270466 +0000 UTC m=+0.482894366 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, container_name=nova_migration_target, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z)
Feb 01 08:47:02 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:47:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:47:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:47:03 np0005604215.localdomain systemd[1]: tmp-crun.SmyWWB.mount: Deactivated successfully.
Feb 01 08:47:03 np0005604215.localdomain podman[98981]: 2026-02-01 08:47:03.862777188 +0000 UTC m=+0.077284026 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, distribution-scope=public, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, architecture=x86_64, io.buildah.version=1.41.5, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, container_name=ovn_controller, tcib_managed=true, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:47:03 np0005604215.localdomain podman[98981]: 2026-02-01 08:47:03.899891747 +0000 UTC m=+0.114398595 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, build-date=2026-01-12T22:36:40Z, version=17.1.13, io.openshift.expose-services=, config_id=tripleo_step4, batch=17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, release=1766032510, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team)
Feb 01 08:47:03 np0005604215.localdomain podman[98981]: unhealthy
Feb 01 08:47:03 np0005604215.localdomain systemd[1]: tmp-crun.63DDDJ.mount: Deactivated successfully.
Feb 01 08:47:03 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:47:03 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:47:03 np0005604215.localdomain podman[98980]: 2026-02-01 08:47:03.916395312 +0000 UTC m=+0.130649872 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, architecture=x86_64, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, container_name=ovn_metadata_agent, 
summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, vcs-type=git)
Feb 01 08:47:03 np0005604215.localdomain podman[98980]: 2026-02-01 08:47:03.953248714 +0000 UTC m=+0.167503234 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510)
Feb 01 08:47:03 np0005604215.localdomain podman[98980]: unhealthy
Feb 01 08:47:03 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:47:03 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:47:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:47:16 np0005604215.localdomain podman[99019]: 2026-02-01 08:47:16.867840003 +0000 UTC m=+0.083956504 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc.)
Feb 01 08:47:17 np0005604215.localdomain podman[99019]: 2026-02-01 08:47:17.069665588 +0000 UTC m=+0.285782129 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, io.buildah.version=1.41.5, distribution-scope=public, vcs-type=git, build-date=2026-01-12T22:10:14Z, container_name=metrics_qdr, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:47:17 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:47:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:47:25 np0005604215.localdomain podman[99049]: 2026-02-01 08:47:25.856007451 +0000 UTC m=+0.072814016 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, version=17.1.13, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1766032510, vendor=Red Hat, Inc., vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:47:25 np0005604215.localdomain podman[99049]: 2026-02-01 08:47:25.867629274 +0000 UTC m=+0.084435869 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:47:25 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:47:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:47:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:47:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:47:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:47:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:47:27 np0005604215.localdomain podman[99070]: 2026-02-01 08:47:27.866472797 +0000 UTC m=+0.079958108 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, architecture=x86_64, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:47:27 np0005604215.localdomain systemd[1]: tmp-crun.V52r65.mount: Deactivated successfully.
Feb 01 08:47:27 np0005604215.localdomain podman[99069]: 2026-02-01 08:47:27.935362099 +0000 UTC m=+0.151458952 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc., architecture=x86_64, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 01 08:47:27 np0005604215.localdomain podman[99070]: 2026-02-01 08:47:27.950630656 +0000 UTC m=+0.164115917 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:47:27 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:47:27 np0005604215.localdomain podman[99069]: 2026-02-01 08:47:27.967771201 +0000 UTC m=+0.183868064 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, release=1766032510, container_name=logrotate_crond, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:47:27 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:47:28 np0005604215.localdomain podman[99072]: 2026-02-01 08:47:28.04166697 +0000 UTC m=+0.248473943 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, tcib_managed=true, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:47:28 np0005604215.localdomain podman[99072]: 2026-02-01 08:47:28.072691669 +0000 UTC m=+0.279498712 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, url=https://www.redhat.com, 
config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, tcib_managed=true, version=17.1.13)
Feb 01 08:47:28 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:47:28 np0005604215.localdomain podman[99071]: 2026-02-01 08:47:28.09065714 +0000 UTC m=+0.298687421 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, container_name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, architecture=x86_64, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:47:28 np0005604215.localdomain podman[99071]: 2026-02-01 08:47:28.101620323 +0000 UTC m=+0.309650594 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, container_name=iscsid, version=17.1.13, vcs-type=git, tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid)
Feb 01 08:47:28 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:47:28 np0005604215.localdomain podman[99078]: 2026-02-01 08:47:27.906991703 +0000 UTC m=+0.112028910 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20260112.1, 
build-date=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, maintainer=OpenStack TripleO Team)
Feb 01 08:47:28 np0005604215.localdomain podman[99078]: 2026-02-01 08:47:28.189669284 +0000 UTC m=+0.394706511 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, tcib_managed=true, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, batch=17.1_20260112.1)
Feb 01 08:47:28 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:47:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:47:32 np0005604215.localdomain podman[99190]: 2026-02-01 08:47:32.861825332 +0000 UTC m=+0.077898715 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, 
com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, config_id=tripleo_step4, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:47:33 np0005604215.localdomain podman[99190]: 2026-02-01 08:47:33.223238702 +0000 UTC m=+0.439312085 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, tcib_managed=true, io.buildah.version=1.41.5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:47:33 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:47:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:47:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:47:34 np0005604215.localdomain podman[99214]: 2026-02-01 08:47:34.857506176 +0000 UTC m=+0.066756047 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, vendor=Red Hat, Inc., container_name=ovn_controller, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.buildah.version=1.41.5, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, release=1766032510)
Feb 01 08:47:34 np0005604215.localdomain podman[99214]: 2026-02-01 08:47:34.872517375 +0000 UTC m=+0.081767236 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, 
container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 01 08:47:34 np0005604215.localdomain podman[99214]: unhealthy
Feb 01 08:47:34 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:47:34 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:47:34 np0005604215.localdomain podman[99213]: 2026-02-01 08:47:34.956772197 +0000 UTC m=+0.168069152 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1)
Feb 01 08:47:34 np0005604215.localdomain podman[99213]: 2026-02-01 08:47:34.974606514 +0000 UTC m=+0.185903459 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., vcs-type=git, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, architecture=x86_64, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com)
Feb 01 08:47:34 np0005604215.localdomain podman[99213]: unhealthy
Feb 01 08:47:34 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:47:34 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:47:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:47:47 np0005604215.localdomain podman[99253]: 2026-02-01 08:47:47.864459321 +0000 UTC m=+0.080386543 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, version=17.1.13, build-date=2026-01-12T22:10:14Z, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step1, vendor=Red Hat, Inc., batch=17.1_20260112.1)
Feb 01 08:47:48 np0005604215.localdomain podman[99253]: 2026-02-01 08:47:48.046678633 +0000 UTC m=+0.262605885 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container)
Feb 01 08:47:48 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:47:48 np0005604215.localdomain sshd[35780]: Received disconnect from 192.168.122.100 port 33956:11: disconnected by user
Feb 01 08:47:48 np0005604215.localdomain sshd[35780]: Disconnected from user tripleo-admin 192.168.122.100 port 33956
Feb 01 08:47:48 np0005604215.localdomain sshd[35759]: pam_unix(sshd:session): session closed for user tripleo-admin
Feb 01 08:47:48 np0005604215.localdomain systemd[1]: session-28.scope: Deactivated successfully.
Feb 01 08:47:48 np0005604215.localdomain systemd[1]: session-28.scope: Consumed 7min 3.530s CPU time.
Feb 01 08:47:48 np0005604215.localdomain systemd-logind[761]: Session 28 logged out. Waiting for processes to exit.
Feb 01 08:47:48 np0005604215.localdomain systemd-logind[761]: Removed session 28.
Feb 01 08:47:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:47:56 np0005604215.localdomain podman[99283]: 2026-02-01 08:47:56.87313753 +0000 UTC m=+0.088252439 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, batch=17.1_20260112.1, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.component=openstack-collectd-container, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, release=1766032510, url=https://www.redhat.com, 
build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc.)
Feb 01 08:47:56 np0005604215.localdomain podman[99283]: 2026-02-01 08:47:56.912674045 +0000 UTC m=+0.127788994 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, architecture=x86_64, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, vcs-type=git, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.buildah.version=1.41.5, managed_by=tripleo_ansible, version=17.1.13)
Feb 01 08:47:56 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: Stopping User Manager for UID 1003...
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Activating special unit Exit the Session...
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Removed slice User Background Tasks Slice.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Stopped target Main User Target.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Stopped target Basic System.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Stopped target Paths.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Stopped target Sockets.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Stopped target Timers.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Closed D-Bus User Message Bus Socket.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Stopped Create User's Volatile Files and Directories.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Removed slice User Application Slice.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Reached target Shutdown.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Finished Exit the Session.
Feb 01 08:47:58 np0005604215.localdomain systemd[35763]: Reached target Exit the Session.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: Stopped User Manager for UID 1003.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: user@1003.service: Consumed 4.621s CPU time.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: user-1003.slice: Consumed 7min 8.178s CPU time.
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: tmp-crun.Uc5l0y.mount: Deactivated successfully.
Feb 01 08:47:58 np0005604215.localdomain podman[99305]: 2026-02-01 08:47:58.653525488 +0000 UTC m=+0.104202666 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, io.openshift.expose-services=, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, com.redhat.component=openstack-iscsid-container, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, tcib_managed=true, distribution-scope=public, vcs-type=git, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:47:58 np0005604215.localdomain podman[99305]: 2026-02-01 08:47:58.695730226 +0000 UTC m=+0.146407434 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, version=17.1.13, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', 
'/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, url=https://www.redhat.com, io.openshift.expose-services=, name=rhosp-rhel9/openstack-iscsid)
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:47:58 np0005604215.localdomain podman[99303]: 2026-02-01 08:47:58.744124369 +0000 UTC m=+0.200802324 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., container_name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, 
batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, vcs-type=git, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4)
Feb 01 08:47:58 np0005604215.localdomain podman[99303]: 2026-02-01 08:47:58.751723326 +0000 UTC m=+0.208401311 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, architecture=x86_64, release=1766032510, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:47:58 np0005604215.localdomain podman[99306]: 2026-02-01 08:47:58.799494718 +0000 UTC m=+0.248012708 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, build-date=2026-01-12T23:07:47Z, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, release=1766032510, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:47:58 np0005604215.localdomain podman[99304]: 2026-02-01 08:47:58.701973612 +0000 UTC m=+0.154893090 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, release=1766032510, container_name=nova_compute, config_id=tripleo_step5, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 
'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 01 08:47:58 np0005604215.localdomain podman[99312]: 2026-02-01 08:47:58.670391825 +0000 UTC m=+0.115964953 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, version=17.1.13, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, 
build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:47:58 np0005604215.localdomain podman[99306]: 2026-02-01 08:47:58.828332049 +0000 UTC m=+0.276850050 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4)
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:47:58 np0005604215.localdomain podman[99312]: 2026-02-01 08:47:58.859677128 +0000 UTC m=+0.305250306 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, batch=17.1_20260112.1, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:47:58 np0005604215.localdomain podman[99304]: 2026-02-01 08:47:58.913390556 +0000 UTC m=+0.366310014 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, config_id=tripleo_step5, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vendor=Red Hat, Inc., managed_by=tripleo_ansible, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:47:58 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:47:59 np0005604215.localdomain systemd[1]: tmp-crun.c9arQz.mount: Deactivated successfully.
Feb 01 08:47:59 np0005604215.localdomain sudo[99413]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:47:59 np0005604215.localdomain sudo[99413]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:47:59 np0005604215.localdomain sudo[99413]: pam_unix(sudo:session): session closed for user root
Feb 01 08:47:59 np0005604215.localdomain sudo[99428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:47:59 np0005604215.localdomain sudo[99428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:48:00 np0005604215.localdomain sudo[99428]: pam_unix(sudo:session): session closed for user root
Feb 01 08:48:01 np0005604215.localdomain sudo[99475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:48:01 np0005604215.localdomain sudo[99475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:48:01 np0005604215.localdomain sudo[99475]: pam_unix(sudo:session): session closed for user root
Feb 01 08:48:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:48:03 np0005604215.localdomain podman[99490]: 2026-02-01 08:48:03.86327652 +0000 UTC m=+0.080284679 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_id=tripleo_step4, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, version=17.1.13, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true)
Feb 01 08:48:04 np0005604215.localdomain podman[99490]: 2026-02-01 08:48:04.237764129 +0000 UTC m=+0.454772308 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, container_name=nova_migration_target, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 
nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13)
Feb 01 08:48:04 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:48:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:48:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:48:05 np0005604215.localdomain systemd[1]: tmp-crun.wlV0N1.mount: Deactivated successfully.
Feb 01 08:48:05 np0005604215.localdomain podman[99514]: 2026-02-01 08:48:05.876323369 +0000 UTC m=+0.083438788 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, version=17.1.13, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., build-date=2026-01-12T22:36:40Z, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ovn_controller, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5)
Feb 01 08:48:05 np0005604215.localdomain podman[99514]: 2026-02-01 08:48:05.895630271 +0000 UTC m=+0.102745650 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.buildah.version=1.41.5, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, description=Red Hat 
OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Feb 01 08:48:05 np0005604215.localdomain podman[99514]: unhealthy
Feb 01 08:48:05 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:48:05 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:48:05 np0005604215.localdomain podman[99513]: 2026-02-01 08:48:05.973334959 +0000 UTC m=+0.184471094 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, config_id=tripleo_step4, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, vendor=Red Hat, Inc., release=1766032510, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, distribution-scope=public, tcib_managed=true, version=17.1.13, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:48:06 np0005604215.localdomain podman[99513]: 2026-02-01 08:48:06.015184266 +0000 UTC m=+0.226320411 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, architecture=x86_64, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step4, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com)
Feb 01 08:48:06 np0005604215.localdomain podman[99513]: unhealthy
Feb 01 08:48:06 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:48:06 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:48:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:48:18 np0005604215.localdomain podman[99552]: 2026-02-01 08:48:18.868345247 +0000 UTC m=+0.083898482 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z)
Feb 01 08:48:19 np0005604215.localdomain podman[99552]: 2026-02-01 08:48:19.144064661 +0000 UTC m=+0.359617906 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, 
cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:48:19 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:48:20 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:48:20 np0005604215.localdomain recover_tripleo_nova_virtqemud[99582]: 62016
Feb 01 08:48:20 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:48:20 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:48:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:48:27 np0005604215.localdomain podman[99583]: 2026-02-01 08:48:27.859129507 +0000 UTC m=+0.077585675 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 collectd, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, io.buildah.version=1.41.5, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510)
Feb 01 08:48:27 np0005604215.localdomain podman[99583]: 2026-02-01 08:48:27.871647647 +0000 UTC m=+0.090103795 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, vcs-type=git, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, distribution-scope=public, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:48:27 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:48:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:48:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:48:28 np0005604215.localdomain podman[99603]: 2026-02-01 08:48:28.867237339 +0000 UTC m=+0.081185156 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z)
Feb 01 08:48:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:48:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:48:28 np0005604215.localdomain podman[99603]: 2026-02-01 08:48:28.903972718 +0000 UTC m=+0.117920495 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, build-date=2026-01-12T22:34:43Z, container_name=iscsid, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, managed_by=tripleo_ansible, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 01 08:48:28 np0005604215.localdomain systemd[1]: tmp-crun.TsRp13.mount: Deactivated successfully.
Feb 01 08:48:28 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:48:28 np0005604215.localdomain podman[99604]: 2026-02-01 08:48:28.929045631 +0000 UTC m=+0.139168688 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, url=https://www.redhat.com, release=1766032510, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron, 
build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron)
Feb 01 08:48:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:48:28 np0005604215.localdomain podman[99604]: 2026-02-01 08:48:28.967596265 +0000 UTC m=+0.177719322 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_id=tripleo_step4, managed_by=tripleo_ansible, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com)
Feb 01 08:48:28 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:48:29 np0005604215.localdomain podman[99634]: 2026-02-01 08:48:28.991142901 +0000 UTC m=+0.096347632 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., build-date=2026-01-12T23:07:30Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, batch=17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, architecture=x86_64)
Feb 01 08:48:29 np0005604215.localdomain podman[99634]: 2026-02-01 08:48:29.077705545 +0000 UTC m=+0.182910276 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, batch=17.1_20260112.1, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:30Z, version=17.1.13, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 01 08:48:29 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:48:29 np0005604215.localdomain podman[99663]: 2026-02-01 08:48:29.092637211 +0000 UTC m=+0.138373823 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.openshift.expose-services=, release=1766032510, container_name=nova_compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, vcs-type=git, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:48:29 np0005604215.localdomain podman[99633]: 2026-02-01 08:48:29.056865814 +0000 UTC m=+0.167878096 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, tcib_managed=true, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container)
Feb 01 08:48:29 np0005604215.localdomain podman[99663]: 2026-02-01 08:48:29.12110309 +0000 UTC m=+0.166839752 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., container_name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, version=17.1.13, config_id=tripleo_step5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:48:29 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:48:29 np0005604215.localdomain podman[99633]: 2026-02-01 08:48:29.142742807 +0000 UTC m=+0.253755119 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true)
Feb 01 08:48:29 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:48:29 np0005604215.localdomain systemd[1]: tmp-crun.KmrEQi.mount: Deactivated successfully.
Feb 01 08:48:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:48:34 np0005604215.localdomain podman[99722]: 2026-02-01 08:48:34.858976259 +0000 UTC m=+0.075474786 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, 
url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:48:35 np0005604215.localdomain podman[99722]: 2026-02-01 08:48:35.215710369 +0000 UTC m=+0.432208976 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, architecture=x86_64, distribution-scope=public, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13)
Feb 01 08:48:35 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:48:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:48:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:48:36 np0005604215.localdomain podman[99745]: 2026-02-01 08:48:36.838636421 +0000 UTC m=+0.062498960 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_id=tripleo_step4, release=1766032510, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:48:36 np0005604215.localdomain podman[99746]: 2026-02-01 08:48:36.84852524 +0000 UTC m=+0.068516659 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, architecture=x86_64, distribution-scope=public, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., summary=Red Hat OpenStack 
Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, io.buildah.version=1.41.5, tcib_managed=true, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=)
Feb 01 08:48:36 np0005604215.localdomain podman[99745]: 2026-02-01 08:48:36.858313635 +0000 UTC m=+0.082176144 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, tcib_managed=true, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container)
Feb 01 08:48:36 np0005604215.localdomain podman[99745]: unhealthy
Feb 01 08:48:36 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:48:36 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:48:36 np0005604215.localdomain podman[99746]: 2026-02-01 08:48:36.914376114 +0000 UTC m=+0.134367573 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z)
Feb 01 08:48:36 np0005604215.localdomain podman[99746]: unhealthy
Feb 01 08:48:36 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:48:36 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:48:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:48:49 np0005604215.localdomain podman[99787]: 2026-02-01 08:48:49.875348126 +0000 UTC m=+0.091438644 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, tcib_managed=true, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, 
release=1766032510, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, io.buildah.version=1.41.5)
Feb 01 08:48:50 np0005604215.localdomain podman[99787]: 2026-02-01 08:48:50.063001831 +0000 UTC m=+0.279092339 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., batch=17.1_20260112.1, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step1, io.buildah.version=1.41.5, 
build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, distribution-scope=public, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=)
Feb 01 08:48:50 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:48:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:48:58 np0005604215.localdomain podman[99816]: 2026-02-01 08:48:58.863459801 +0000 UTC m=+0.079984316 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, summary=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-collectd, build-date=2026-01-12T22:10:15Z, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, version=17.1.13, io.buildah.version=1.41.5, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=)
Feb 01 08:48:58 np0005604215.localdomain podman[99816]: 2026-02-01 08:48:58.898682681 +0000 UTC m=+0.115207156 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vcs-type=git)
Feb 01 08:48:58 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:48:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:48:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:48:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:48:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:48:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:48:59 np0005604215.localdomain podman[99845]: 2026-02-01 08:48:59.88014009 +0000 UTC m=+0.085075625 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.buildah.version=1.41.5, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, tcib_managed=true, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, build-date=2026-01-12T23:07:30Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:48:59 np0005604215.localdomain podman[99845]: 2026-02-01 08:48:59.931634626 +0000 UTC m=+0.136570131 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, release=1766032510, io.openshift.expose-services=, architecture=x86_64, vcs-type=git, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:48:59 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:48:59 np0005604215.localdomain podman[99837]: 2026-02-01 08:48:59.934370542 +0000 UTC m=+0.148147313 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, tcib_managed=true, build-date=2026-01-12T23:32:04Z, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, version=17.1.13, maintainer=OpenStack TripleO Team, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:49:00 np0005604215.localdomain podman[99836]: 2026-02-01 08:48:59.985956502 +0000 UTC m=+0.203482860 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, url=https://www.redhat.com, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, tcib_managed=true, version=17.1.13)
Feb 01 08:49:00 np0005604215.localdomain podman[99838]: 2026-02-01 08:49:00.041534206 +0000 UTC m=+0.251837878 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3)
Feb 01 08:49:00 np0005604215.localdomain podman[99838]: 2026-02-01 08:49:00.079666225 +0000 UTC m=+0.289969897 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-12T22:34:43Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, container_name=iscsid, batch=17.1_20260112.1, config_id=tripleo_step3, version=17.1.13)
Feb 01 08:49:00 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:49:00 np0005604215.localdomain podman[99839]: 2026-02-01 08:49:00.091229176 +0000 UTC m=+0.300064813 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, url=https://www.redhat.com, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, 
io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git)
Feb 01 08:49:00 np0005604215.localdomain podman[99837]: 2026-02-01 08:49:00.115007188 +0000 UTC m=+0.328783959 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.13, architecture=x86_64, build-date=2026-01-12T23:32:04Z, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, url=https://www.redhat.com, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:49:00 np0005604215.localdomain podman[99836]: 2026-02-01 08:49:00.117747643 +0000 UTC m=+0.335274031 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, build-date=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-cron-container, vcs-type=git, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, distribution-scope=public, container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 01 08:49:00 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:49:00 np0005604215.localdomain podman[99839]: 2026-02-01 08:49:00.17278187 +0000 UTC m=+0.381617457 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, vcs-type=git, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.k8s.description=Red 
Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64)
Feb 01 08:49:00 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:49:00 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:49:01 np0005604215.localdomain sudo[99954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:49:01 np0005604215.localdomain sudo[99954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:49:01 np0005604215.localdomain sudo[99954]: pam_unix(sudo:session): session closed for user root
Feb 01 08:49:01 np0005604215.localdomain sudo[99969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:49:01 np0005604215.localdomain sudo[99969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:49:01 np0005604215.localdomain sudo[99969]: pam_unix(sudo:session): session closed for user root
Feb 01 08:49:02 np0005604215.localdomain sudo[100016]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:49:02 np0005604215.localdomain sudo[100016]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:49:02 np0005604215.localdomain sudo[100016]: pam_unix(sudo:session): session closed for user root
Feb 01 08:49:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:49:05 np0005604215.localdomain systemd[1]: tmp-crun.WrXfh1.mount: Deactivated successfully.
Feb 01 08:49:05 np0005604215.localdomain podman[100031]: 2026-02-01 08:49:05.871014636 +0000 UTC m=+0.088587506 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4)
Feb 01 08:49:06 np0005604215.localdomain podman[100031]: 2026-02-01 08:49:06.264103869 +0000 UTC m=+0.481676739 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, container_name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., distribution-scope=public, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, 
vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team)
Feb 01 08:49:06 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:49:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:49:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:49:07 np0005604215.localdomain podman[100055]: 2026-02-01 08:49:07.872149598 +0000 UTC m=+0.081623268 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, tcib_managed=true, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, distribution-scope=public, io.buildah.version=1.41.5, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, build-date=2026-01-12T22:36:40Z, release=1766032510, vendor=Red Hat, Inc., managed_by=tripleo_ansible, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:49:07 np0005604215.localdomain podman[100055]: 2026-02-01 08:49:07.888174878 +0000 UTC m=+0.097648588 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, batch=17.1_20260112.1, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, distribution-scope=public, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, 
name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:49:07 np0005604215.localdomain podman[100055]: unhealthy
Feb 01 08:49:07 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:49:07 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:49:07 np0005604215.localdomain systemd[1]: tmp-crun.iutctQ.mount: Deactivated successfully.
Feb 01 08:49:07 np0005604215.localdomain podman[100054]: 2026-02-01 08:49:07.935664289 +0000 UTC m=+0.145740198 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, build-date=2026-01-12T22:56:19Z, batch=17.1_20260112.1, container_name=ovn_metadata_agent, release=1766032510, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:49:07 np0005604215.localdomain podman[100054]: 2026-02-01 08:49:07.951792422 +0000 UTC m=+0.161868421 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, build-date=2026-01-12T22:56:19Z, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, batch=17.1_20260112.1, version=17.1.13, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:49:07 np0005604215.localdomain podman[100054]: unhealthy
Feb 01 08:49:07 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:49:07 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:49:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:49:20 np0005604215.localdomain podman[100093]: 2026-02-01 08:49:20.872518907 +0000 UTC m=+0.085057805 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 01 08:49:21 np0005604215.localdomain podman[100093]: 2026-02-01 08:49:21.089705143 +0000 UTC m=+0.302244021 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=metrics_qdr, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, 
build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, managed_by=tripleo_ansible, config_id=tripleo_step1, io.buildah.version=1.41.5, vcs-type=git)
Feb 01 08:49:21 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:49:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:49:29 np0005604215.localdomain systemd[1]: tmp-crun.AVlp9m.mount: Deactivated successfully.
Feb 01 08:49:29 np0005604215.localdomain podman[100121]: 2026-02-01 08:49:29.882260708 +0000 UTC m=+0.094156249 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, config_id=tripleo_step3, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13)
Feb 01 08:49:29 np0005604215.localdomain podman[100121]: 2026-02-01 08:49:29.89162745 +0000 UTC m=+0.103522951 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, name=rhosp-rhel9/openstack-collectd, release=1766032510, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:49:29 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:49:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:49:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:49:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:49:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:49:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:49:30 np0005604215.localdomain systemd[1]: tmp-crun.q2aDMt.mount: Deactivated successfully.
Feb 01 08:49:30 np0005604215.localdomain podman[100143]: 2026-02-01 08:49:30.889275534 +0000 UTC m=+0.098781282 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, vcs-type=git, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:49:30 np0005604215.localdomain podman[100143]: 2026-02-01 08:49:30.894486707 +0000 UTC m=+0.103992455 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, architecture=x86_64, container_name=logrotate_crond, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, url=https://www.redhat.com)
Feb 01 08:49:30 np0005604215.localdomain systemd[1]: tmp-crun.Skafwu.mount: Deactivated successfully.
Feb 01 08:49:30 np0005604215.localdomain podman[100157]: 2026-02-01 08:49:30.907157323 +0000 UTC m=+0.098651450 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vendor=Red Hat, Inc., url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, config_id=tripleo_step4, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, version=17.1.13, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, vcs-type=git, tcib_managed=true)
Feb 01 08:49:30 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:49:30 np0005604215.localdomain podman[100157]: 2026-02-01 08:49:30.935508697 +0000 UTC m=+0.127002814 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, distribution-scope=public, batch=17.1_20260112.1, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:49:30 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:49:30 np0005604215.localdomain podman[100145]: 2026-02-01 08:49:30.939597744 +0000 UTC m=+0.140581316 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-iscsid-container, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, 
name=rhosp-rhel9/openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, vcs-type=git, container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com)
Feb 01 08:49:30 np0005604215.localdomain podman[100144]: 2026-02-01 08:49:30.995931882 +0000 UTC m=+0.202089416 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, config_id=tripleo_step5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, architecture=x86_64, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:49:31 np0005604215.localdomain podman[100151]: 2026-02-01 08:49:31.043364412 +0000 UTC m=+0.240742382 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true)
Feb 01 08:49:31 np0005604215.localdomain podman[100145]: 2026-02-01 08:49:31.068939749 +0000 UTC m=+0.269923331 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, tcib_managed=true, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., container_name=iscsid, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container)
Feb 01 08:49:31 np0005604215.localdomain podman[100144]: 2026-02-01 08:49:31.075264857 +0000 UTC m=+0.281422391 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, distribution-scope=public, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., container_name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_id=tripleo_step5, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true)
Feb 01 08:49:31 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:49:31 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:49:31 np0005604215.localdomain podman[100151]: 2026-02-01 08:49:31.096886131 +0000 UTC m=+0.294264101 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vendor=Red Hat, Inc., io.buildah.version=1.41.5, distribution-scope=public, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:47Z, io.openshift.expose-services=, release=1766032510, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:49:31 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:49:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:49:36 np0005604215.localdomain podman[100257]: 2026-02-01 08:49:36.863519821 +0000 UTC m=+0.078050716 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_migration_target, config_id=tripleo_step4, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, 
com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, release=1766032510)
Feb 01 08:49:37 np0005604215.localdomain podman[100257]: 2026-02-01 08:49:37.233878086 +0000 UTC m=+0.448409021 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:49:37 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:49:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:49:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:49:38 np0005604215.localdomain podman[100280]: 2026-02-01 08:49:38.879247548 +0000 UTC m=+0.081037499 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, url=https://www.redhat.com, container_name=ovn_controller, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4)
Feb 01 08:49:38 np0005604215.localdomain podman[100280]: 2026-02-01 08:49:38.926077079 +0000 UTC m=+0.127867020 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, managed_by=tripleo_ansible, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, container_name=ovn_controller, io.openshift.expose-services=, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., tcib_managed=true, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.buildah.version=1.41.5, config_id=tripleo_step4)
Feb 01 08:49:38 np0005604215.localdomain podman[100280]: unhealthy
Feb 01 08:49:38 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:49:38 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:49:38 np0005604215.localdomain podman[100279]: 2026-02-01 08:49:38.92897959 +0000 UTC m=+0.132367361 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, tcib_managed=true, 
container_name=ovn_metadata_agent, distribution-scope=public, config_id=tripleo_step4, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, io.openshift.expose-services=)
Feb 01 08:49:39 np0005604215.localdomain podman[100279]: 2026-02-01 08:49:39.013767315 +0000 UTC m=+0.217155056 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Feb 01 08:49:39 np0005604215.localdomain podman[100279]: unhealthy
Feb 01 08:49:39 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:49:39 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:49:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:49:51 np0005604215.localdomain podman[100321]: 2026-02-01 08:49:51.856412865 +0000 UTC m=+0.074870037 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, container_name=metrics_qdr, config_id=tripleo_step1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:49:52 np0005604215.localdomain podman[100321]: 2026-02-01 08:49:52.069577226 +0000 UTC m=+0.288034398 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, architecture=x86_64, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, vendor=Red Hat, Inc., container_name=metrics_qdr, config_id=tripleo_step1, io.openshift.expose-services=, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:49:52 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:50:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:50:00 np0005604215.localdomain systemd[1]: tmp-crun.YYVVlp.mount: Deactivated successfully.
Feb 01 08:50:00 np0005604215.localdomain podman[100351]: 2026-02-01 08:50:00.884454686 +0000 UTC m=+0.091701111 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.13, com.redhat.component=openstack-collectd-container, distribution-scope=public, vendor=Red Hat, Inc., tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:50:00 np0005604215.localdomain podman[100351]: 2026-02-01 08:50:00.89641076 +0000 UTC m=+0.103657195 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, release=1766032510, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, distribution-scope=public, config_id=tripleo_step3, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, container_name=collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git)
Feb 01 08:50:00 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:50:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:50:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:50:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:50:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:50:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:50:01 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:50:01 np0005604215.localdomain recover_tripleo_nova_virtqemud[100404]: 62016
Feb 01 08:50:01 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:50:01 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:50:01 np0005604215.localdomain systemd[1]: tmp-crun.P3jjcv.mount: Deactivated successfully.
Feb 01 08:50:01 np0005604215.localdomain podman[100371]: 2026-02-01 08:50:01.884543807 +0000 UTC m=+0.106532515 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, version=17.1.13, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public)
Feb 01 08:50:01 np0005604215.localdomain podman[100372]: 2026-02-01 08:50:01.895716836 +0000 UTC m=+0.109820187 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5)
Feb 01 08:50:01 np0005604215.localdomain podman[100371]: 2026-02-01 08:50:01.920237801 +0000 UTC m=+0.142226479 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, tcib_managed=true, vcs-type=git, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, architecture=x86_64, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, config_id=tripleo_step4, distribution-scope=public)
Feb 01 08:50:01 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:50:01 np0005604215.localdomain podman[100373]: 2026-02-01 08:50:01.93653562 +0000 UTC m=+0.149475694 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true)
Feb 01 08:50:01 np0005604215.localdomain podman[100372]: 2026-02-01 08:50:01.957680669 +0000 UTC m=+0.171784040 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., vcs-type=git, version=17.1.13, build-date=2026-01-12T23:32:04Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 
'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, container_name=nova_compute)
Feb 01 08:50:01 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:50:01 np0005604215.localdomain podman[100375]: 2026-02-01 08:50:01.977314521 +0000 UTC m=+0.189934166 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, release=1766032510, distribution-scope=public, vcs-type=git, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:50:02 np0005604215.localdomain podman[100373]: 2026-02-01 08:50:01.996039796 +0000 UTC m=+0.208979840 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, com.redhat.component=openstack-iscsid-container, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 01 08:50:02 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:50:02 np0005604215.localdomain podman[100375]: 2026-02-01 08:50:02.055869693 +0000 UTC m=+0.268489278 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z)
Feb 01 08:50:02 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:50:02 np0005604215.localdomain podman[100380]: 2026-02-01 08:50:02.075862987 +0000 UTC m=+0.286593213 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, release=1766032510, io.openshift.expose-services=, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, build-date=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 01 08:50:02 np0005604215.localdomain podman[100380]: 2026-02-01 08:50:02.101402173 +0000 UTC m=+0.312132429 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vendor=Red Hat, Inc., version=17.1.13, distribution-scope=public, com.redhat.component=openstack-ceilometer-ipmi-container, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, container_name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, architecture=x86_64)
Feb 01 08:50:02 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:50:02 np0005604215.localdomain sudo[100489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:50:02 np0005604215.localdomain sudo[100489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:50:02 np0005604215.localdomain sudo[100489]: pam_unix(sudo:session): session closed for user root
Feb 01 08:50:02 np0005604215.localdomain systemd[1]: tmp-crun.V8kZ54.mount: Deactivated successfully.
Feb 01 08:50:02 np0005604215.localdomain sudo[100504]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:50:02 np0005604215.localdomain sudo[100504]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:50:03 np0005604215.localdomain sudo[100504]: pam_unix(sudo:session): session closed for user root
Feb 01 08:50:04 np0005604215.localdomain sudo[100551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:50:04 np0005604215.localdomain sudo[100551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:50:04 np0005604215.localdomain sudo[100551]: pam_unix(sudo:session): session closed for user root
Feb 01 08:50:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:50:07 np0005604215.localdomain podman[100566]: 2026-02-01 08:50:07.869840739 +0000 UTC m=+0.082146673 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, vcs-type=git, config_id=tripleo_step4, 
tcib_managed=true, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, container_name=nova_migration_target, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:50:08 np0005604215.localdomain podman[100566]: 2026-02-01 08:50:08.244742146 +0000 UTC m=+0.457048060 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, release=1766032510, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc.)
Feb 01 08:50:08 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:50:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:50:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:50:09 np0005604215.localdomain podman[100590]: 2026-02-01 08:50:09.885759854 +0000 UTC m=+0.095985437 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, release=1766032510, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:50:09 np0005604215.localdomain podman[100589]: 2026-02-01 08:50:09.927497725 +0000 UTC m=+0.140369500 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, version=17.1.13, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, release=1766032510, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:50:09 np0005604215.localdomain podman[100589]: 2026-02-01 08:50:09.949750469 +0000 UTC m=+0.162622274 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, tcib_managed=true, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, distribution-scope=public, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 01 08:50:09 np0005604215.localdomain podman[100589]: unhealthy
Feb 01 08:50:09 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:50:09 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:50:09 np0005604215.localdomain podman[100590]: 2026-02-01 08:50:09.979404975 +0000 UTC m=+0.189630498 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, 
url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:50:09 np0005604215.localdomain podman[100590]: unhealthy
Feb 01 08:50:09 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:50:09 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:50:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:50:22 np0005604215.localdomain podman[100629]: 2026-02-01 08:50:22.873002704 +0000 UTC m=+0.090838226 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, config_id=tripleo_step1, io.openshift.expose-services=, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, url=https://www.redhat.com)
Feb 01 08:50:23 np0005604215.localdomain podman[100629]: 2026-02-01 08:50:23.059805981 +0000 UTC m=+0.277641523 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, batch=17.1_20260112.1, 
cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_id=tripleo_step1, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, com.redhat.component=openstack-qdrouterd-container, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, io.buildah.version=1.41.5, vendor=Red Hat, Inc., container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public)
Feb 01 08:50:23 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:50:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:50:31 np0005604215.localdomain podman[100658]: 2026-02-01 08:50:31.870045297 +0000 UTC m=+0.074714922 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, com.redhat.component=openstack-collectd-container, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=collectd, io.buildah.version=1.41.5)
Feb 01 08:50:31 np0005604215.localdomain podman[100658]: 2026-02-01 08:50:31.882465455 +0000 UTC m=+0.087135070 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, container_name=collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, tcib_managed=true, com.redhat.component=openstack-collectd-container, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 01 08:50:31 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:50:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:50:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:50:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:50:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:50:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:50:32 np0005604215.localdomain systemd[1]: tmp-crun.r94fDt.mount: Deactivated successfully.
Feb 01 08:50:32 np0005604215.localdomain podman[100679]: 2026-02-01 08:50:32.8775389 +0000 UTC m=+0.087981027 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Feb 01 08:50:32 np0005604215.localdomain systemd[1]: tmp-crun.7E7S4u.mount: Deactivated successfully.
Feb 01 08:50:32 np0005604215.localdomain podman[100678]: 2026-02-01 08:50:32.927035803 +0000 UTC m=+0.140767972 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, architecture=x86_64, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.openshift.expose-services=, com.redhat.component=openstack-cron-container, container_name=logrotate_crond)
Feb 01 08:50:32 np0005604215.localdomain podman[100678]: 2026-02-01 08:50:32.936552951 +0000 UTC m=+0.150285050 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, batch=17.1_20260112.1, tcib_managed=true, release=1766032510, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:50:32 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:50:32 np0005604215.localdomain podman[100679]: 2026-02-01 08:50:32.953122738 +0000 UTC m=+0.163564835 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, managed_by=tripleo_ansible, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, tcib_managed=true, vcs-type=git, container_name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:50:32 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:50:32 np0005604215.localdomain podman[100686]: 2026-02-01 08:50:32.99069732 +0000 UTC m=+0.190510695 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, batch=17.1_20260112.1, vcs-type=git, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:50:33 np0005604215.localdomain podman[100680]: 2026-02-01 08:50:32.906424041 +0000 UTC m=+0.111047886 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-iscsid-container, distribution-scope=public)
Feb 01 08:50:33 np0005604215.localdomain podman[100680]: 2026-02-01 08:50:33.039688569 +0000 UTC m=+0.244312094 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, build-date=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=iscsid, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vendor=Red Hat, Inc., config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, release=1766032510, vcs-type=git)
Feb 01 08:50:33 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:50:33 np0005604215.localdomain podman[100692]: 2026-02-01 08:50:33.092144775 +0000 UTC m=+0.286703706 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, managed_by=tripleo_ansible, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, config_id=tripleo_step4, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:50:33 np0005604215.localdomain podman[100686]: 2026-02-01 08:50:33.098369849 +0000 UTC m=+0.298183254 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, release=1766032510, io.openshift.expose-services=, distribution-scope=public, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, build-date=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 01 08:50:33 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:50:33 np0005604215.localdomain podman[100692]: 2026-02-01 08:50:33.122580314 +0000 UTC m=+0.317139285 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, build-date=2026-01-12T23:07:30Z, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:50:33 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:50:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:50:38 np0005604215.localdomain podman[100794]: 2026-02-01 08:50:38.865047711 +0000 UTC m=+0.083251649 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, com.redhat.component=openstack-nova-compute-container, io.buildah.version=1.41.5, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute)
Feb 01 08:50:39 np0005604215.localdomain podman[100794]: 2026-02-01 08:50:39.251871799 +0000 UTC m=+0.470075677 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, com.redhat.component=openstack-nova-compute-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:50:39 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:50:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:50:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:50:40 np0005604215.localdomain podman[100817]: 2026-02-01 08:50:40.865684148 +0000 UTC m=+0.079084969 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, url=https://www.redhat.com, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.expose-services=, container_name=ovn_metadata_agent, io.buildah.version=1.41.5, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, release=1766032510)
Feb 01 08:50:40 np0005604215.localdomain podman[100817]: 2026-02-01 08:50:40.879748186 +0000 UTC m=+0.093149037 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, container_name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, tcib_managed=true, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, version=17.1.13, batch=17.1_20260112.1, vcs-type=git, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z)
Feb 01 08:50:40 np0005604215.localdomain podman[100817]: unhealthy
Feb 01 08:50:40 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:50:40 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:50:40 np0005604215.localdomain podman[100818]: 2026-02-01 08:50:40.931546553 +0000 UTC m=+0.141476706 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, url=https://www.redhat.com, vendor=Red Hat, Inc., release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, distribution-scope=public, config_id=tripleo_step4, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, 
summary=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:50:40 np0005604215.localdomain podman[100818]: 2026-02-01 08:50:40.945418756 +0000 UTC m=+0.155348899 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, architecture=x86_64, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:50:40 np0005604215.localdomain podman[100818]: unhealthy
Feb 01 08:50:40 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:50:40 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:50:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:50:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 5433 writes, 23K keys, 5433 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5433 writes, 751 syncs, 7.23 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 08:50:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 08:50:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4200.1 total, 600.0 interval
                                                          Cumulative writes: 5223 writes, 23K keys, 5223 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5223 writes, 658 syncs, 7.94 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 08:50:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:50:54 np0005604215.localdomain podman[100858]: 2026-02-01 08:50:54.339499868 +0000 UTC m=+0.083014771 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, vendor=Red Hat, Inc., container_name=metrics_qdr, managed_by=tripleo_ansible, config_id=tripleo_step1, url=https://www.redhat.com, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team)
Feb 01 08:50:54 np0005604215.localdomain podman[100858]: 2026-02-01 08:50:54.553754822 +0000 UTC m=+0.297269695 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, config_id=tripleo_step1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 qdrouterd, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, container_name=metrics_qdr, io.buildah.version=1.41.5)
Feb 01 08:50:54 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:51:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:51:02 np0005604215.localdomain systemd[1]: tmp-crun.4VEgP1.mount: Deactivated successfully.
Feb 01 08:51:02 np0005604215.localdomain podman[100887]: 2026-02-01 08:51:02.872997748 +0000 UTC m=+0.084225638 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container)
Feb 01 08:51:02 np0005604215.localdomain podman[100887]: 2026-02-01 08:51:02.911624404 +0000 UTC m=+0.122852234 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, batch=17.1_20260112.1, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-collectd-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, vcs-type=git, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:51:02 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:51:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:51:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:51:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:51:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:51:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:51:03 np0005604215.localdomain podman[100909]: 2026-02-01 08:51:03.856672517 +0000 UTC m=+0.068773586 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, container_name=iscsid, 
io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, release=1766032510, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, config_id=tripleo_step3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc.)
Feb 01 08:51:03 np0005604215.localdomain podman[100909]: 2026-02-01 08:51:03.866783253 +0000 UTC m=+0.078884312 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., distribution-scope=public, release=1766032510, summary=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, 
org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-iscsid-container, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:51:03 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:51:03 np0005604215.localdomain podman[100916]: 2026-02-01 08:51:03.908626668 +0000 UTC m=+0.117093734 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-ipmi-container, distribution-scope=public, architecture=x86_64, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, release=1766032510, version=17.1.13, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:30Z, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:51:03 np0005604215.localdomain systemd[1]: tmp-crun.4clC77.mount: Deactivated successfully.
Feb 01 08:51:03 np0005604215.localdomain podman[100908]: 2026-02-01 08:51:03.930549883 +0000 UTC m=+0.144074647 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64)
Feb 01 08:51:03 np0005604215.localdomain podman[100911]: 2026-02-01 08:51:03.993017721 +0000 UTC m=+0.198736951 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.5, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, 
summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, com.redhat.component=openstack-ceilometer-compute-container, url=https://www.redhat.com, managed_by=tripleo_ansible, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, version=17.1.13, build-date=2026-01-12T23:07:47Z, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 01 08:51:04 np0005604215.localdomain podman[100908]: 2026-02-01 08:51:04.011583751 +0000 UTC m=+0.225108565 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, name=rhosp-rhel9/openstack-nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:51:04 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:51:04 np0005604215.localdomain podman[100911]: 2026-02-01 08:51:04.023656687 +0000 UTC m=+0.229375857 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, batch=17.1_20260112.1, container_name=ceilometer_agent_compute, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.buildah.version=1.41.5, vcs-type=git, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:51:04 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:51:04 np0005604215.localdomain podman[100916]: 2026-02-01 08:51:04.061871899 +0000 UTC m=+0.270338965 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, managed_by=tripleo_ansible, 
architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., distribution-scope=public, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 01 08:51:04 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:51:04 np0005604215.localdomain podman[100907]: 2026-02-01 08:51:03.962182949 +0000 UTC m=+0.179818251 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., batch=17.1_20260112.1, url=https://www.redhat.com, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, com.redhat.component=openstack-cron-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, version=17.1.13, managed_by=tripleo_ansible, distribution-scope=public)
Feb 01 08:51:04 np0005604215.localdomain podman[100907]: 2026-02-01 08:51:04.145695124 +0000 UTC m=+0.363330486 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, com.redhat.component=openstack-cron-container, tcib_managed=true, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., version=17.1.13, 
container_name=logrotate_crond, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible)
Feb 01 08:51:04 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:51:04 np0005604215.localdomain sudo[101027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:51:04 np0005604215.localdomain sudo[101027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:51:04 np0005604215.localdomain sudo[101027]: pam_unix(sudo:session): session closed for user root
Feb 01 08:51:04 np0005604215.localdomain sudo[101042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:51:04 np0005604215.localdomain sudo[101042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:51:05 np0005604215.localdomain sudo[101042]: pam_unix(sudo:session): session closed for user root
Feb 01 08:51:07 np0005604215.localdomain sudo[101089]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:51:07 np0005604215.localdomain sudo[101089]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:51:07 np0005604215.localdomain sudo[101089]: pam_unix(sudo:session): session closed for user root
Feb 01 08:51:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:51:09 np0005604215.localdomain podman[101104]: 2026-02-01 08:51:09.872603156 +0000 UTC m=+0.086053746 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:51:10 np0005604215.localdomain podman[101104]: 2026-02-01 08:51:10.258524826 +0000 UTC m=+0.471975376 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, 
io.openshift.expose-services=, architecture=x86_64, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, config_id=tripleo_step4)
Feb 01 08:51:10 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:51:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:51:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:51:11 np0005604215.localdomain podman[101128]: 2026-02-01 08:51:11.871434697 +0000 UTC m=+0.080590115 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, 
com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, vendor=Red Hat, Inc., batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:51:11 np0005604215.localdomain podman[101128]: 2026-02-01 08:51:11.917621458 +0000 UTC m=+0.126776846 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=ovn_controller, version=17.1.13, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc.)
Feb 01 08:51:11 np0005604215.localdomain podman[101128]: unhealthy
Feb 01 08:51:11 np0005604215.localdomain systemd[1]: tmp-crun.0cg4Fs.mount: Deactivated successfully.
Feb 01 08:51:11 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:51:11 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:51:11 np0005604215.localdomain podman[101127]: 2026-02-01 08:51:11.938400006 +0000 UTC m=+0.149550096 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, tcib_managed=true, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:51:11 np0005604215.localdomain podman[101127]: 2026-02-01 08:51:11.956604785 +0000 UTC m=+0.167754875 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, build-date=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 
'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 01 08:51:11 np0005604215.localdomain podman[101127]: unhealthy
Feb 01 08:51:11 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:51:11 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:51:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:51:24 np0005604215.localdomain podman[101170]: 2026-02-01 08:51:24.874854041 +0000 UTC m=+0.089586566 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, name=rhosp-rhel9/openstack-qdrouterd, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, architecture=x86_64, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:51:25 np0005604215.localdomain podman[101170]: 2026-02-01 08:51:25.104858206 +0000 UTC m=+0.319590701 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, 
maintainer=OpenStack TripleO Team, container_name=metrics_qdr, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.openshift.expose-services=, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510)
Feb 01 08:51:25 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:51:30 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:51:30 np0005604215.localdomain recover_tripleo_nova_virtqemud[101200]: 62016
Feb 01 08:51:30 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:51:30 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:51:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:51:33 np0005604215.localdomain podman[101201]: 2026-02-01 08:51:33.866162875 +0000 UTC m=+0.080967697 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, com.redhat.component=openstack-collectd-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step3, container_name=collectd, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:51:33 np0005604215.localdomain podman[101201]: 2026-02-01 08:51:33.873926297 +0000 UTC m=+0.088731109 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, tcib_managed=true, architecture=x86_64, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.openshift.expose-services=, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, config_id=tripleo_step3, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container)
Feb 01 08:51:33 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:51:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:51:33 np0005604215.localdomain systemd[1]: tmp-crun.1InDP2.mount: Deactivated successfully.
Feb 01 08:51:33 np0005604215.localdomain podman[101220]: 2026-02-01 08:51:33.990181004 +0000 UTC m=+0.082469164 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, container_name=iscsid, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, batch=17.1_20260112.1, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, config_id=tripleo_step3, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, version=17.1.13)
Feb 01 08:51:34 np0005604215.localdomain podman[101220]: 2026-02-01 08:51:34.00095338 +0000 UTC m=+0.093241570 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, 
cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, managed_by=tripleo_ansible, container_name=iscsid, architecture=x86_64, tcib_managed=true, vcs-type=git, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13)
Feb 01 08:51:34 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:51:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:51:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:51:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:51:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:51:34 np0005604215.localdomain podman[101238]: 2026-02-01 08:51:34.87533294 +0000 UTC m=+0.088856054 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-cron, version=17.1.13, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step4, vcs-type=git, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:51:34 np0005604215.localdomain podman[101238]: 2026-02-01 08:51:34.911608171 +0000 UTC m=+0.125131195 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, distribution-scope=public, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vcs-type=git, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:51:34 np0005604215.localdomain systemd[1]: tmp-crun.n6dGij.mount: Deactivated successfully.
Feb 01 08:51:34 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:51:34 np0005604215.localdomain podman[101239]: 2026-02-01 08:51:34.924834183 +0000 UTC m=+0.134719794 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, tcib_managed=true, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, managed_by=tripleo_ansible, container_name=nova_compute, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_id=tripleo_step5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Feb 01 08:51:34 np0005604215.localdomain podman[101240]: 2026-02-01 08:51:34.96190012 +0000 UTC m=+0.168650282 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, release=1766032510, build-date=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, container_name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team)
Feb 01 08:51:34 np0005604215.localdomain podman[101239]: 2026-02-01 08:51:34.997259693 +0000 UTC m=+0.207145304 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible)
Feb 01 08:51:35 np0005604215.localdomain podman[101240]: 2026-02-01 08:51:35.004240471 +0000 UTC m=+0.210990623 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vendor=Red Hat, Inc., version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, architecture=x86_64, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=)
Feb 01 08:51:35 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:51:35 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:51:35 np0005604215.localdomain podman[101246]: 2026-02-01 08:51:35.034461694 +0000 UTC m=+0.235801378 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, 
name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, batch=17.1_20260112.1, tcib_managed=true, container_name=ceilometer_agent_ipmi, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, release=1766032510)
Feb 01 08:51:35 np0005604215.localdomain podman[101246]: 2026-02-01 08:51:35.084740523 +0000 UTC m=+0.286080237 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, release=1766032510, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, 
com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, build-date=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, config_id=tripleo_step4)
Feb 01 08:51:35 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:51:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:51:40 np0005604215.localdomain podman[101332]: 2026-02-01 08:51:40.86699688 +0000 UTC m=+0.080140562 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.5, release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13)
Feb 01 08:51:41 np0005604215.localdomain podman[101332]: 2026-02-01 08:51:41.237409276 +0000 UTC m=+0.450552928 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, 
cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, container_name=nova_migration_target)
Feb 01 08:51:41 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:51:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:51:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:51:42 np0005604215.localdomain podman[101357]: 2026-02-01 08:51:42.871912579 +0000 UTC m=+0.084995372 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, io.openshift.expose-services=, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, maintainer=OpenStack TripleO Team, version=17.1.13, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:51:42 np0005604215.localdomain podman[101357]: 2026-02-01 08:51:42.919720221 +0000 UTC m=+0.132802964 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, io.buildah.version=1.41.5, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, container_name=ovn_metadata_agent, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team)
Feb 01 08:51:42 np0005604215.localdomain podman[101357]: unhealthy
Feb 01 08:51:42 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:51:42 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:51:42 np0005604215.localdomain podman[101358]: 2026-02-01 08:51:42.926854564 +0000 UTC m=+0.136252732 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack 
Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, distribution-scope=public)
Feb 01 08:51:43 np0005604215.localdomain podman[101358]: 2026-02-01 08:51:43.009844653 +0000 UTC m=+0.219242841 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
ovn-controller, release=1766032510, com.redhat.component=openstack-ovn-controller-container, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4)
Feb 01 08:51:43 np0005604215.localdomain podman[101358]: unhealthy
Feb 01 08:51:43 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:51:43 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:51:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:51:55 np0005604215.localdomain podman[101397]: 2026-02-01 08:51:55.863313271 +0000 UTC m=+0.079273035 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, url=https://www.redhat.com, tcib_managed=true, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, managed_by=tripleo_ansible, config_id=tripleo_step1, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510)
Feb 01 08:51:56 np0005604215.localdomain podman[101397]: 2026-02-01 08:51:56.078984219 +0000 UTC m=+0.294943983 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, container_name=metrics_qdr, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, managed_by=tripleo_ansible, io.openshift.expose-services=, config_id=tripleo_step1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, vendor=Red Hat, Inc., url=https://www.redhat.com)
Feb 01 08:51:56 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:52:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:52:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:52:04 np0005604215.localdomain systemd[1]: tmp-crun.mQxsQ8.mount: Deactivated successfully.
Feb 01 08:52:04 np0005604215.localdomain podman[101427]: 2026-02-01 08:52:04.855981545 +0000 UTC m=+0.070769408 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, architecture=x86_64, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, managed_by=tripleo_ansible, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, container_name=iscsid, com.redhat.component=openstack-iscsid-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:34:43Z)
Feb 01 08:52:04 np0005604215.localdomain podman[101427]: 2026-02-01 08:52:04.865413939 +0000 UTC m=+0.080201832 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, release=1766032510, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 iscsid, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, build-date=2026-01-12T22:34:43Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-iscsid-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid)
Feb 01 08:52:04 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:52:04 np0005604215.localdomain podman[101428]: 2026-02-01 08:52:04.934488375 +0000 UTC m=+0.145656186 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, io.buildah.version=1.41.5, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, distribution-scope=public, tcib_managed=true, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, architecture=x86_64, maintainer=OpenStack TripleO Team, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:52:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:52:04 np0005604215.localdomain podman[101428]: 2026-02-01 08:52:04.973715279 +0000 UTC m=+0.184883130 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, container_name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, release=1766032510, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 01 08:52:04 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:52:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:52:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:52:05 np0005604215.localdomain podman[101466]: 2026-02-01 08:52:05.035389853 +0000 UTC m=+0.081731611 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.buildah.version=1.41.5, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_id=tripleo_step4, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, 
version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container)
Feb 01 08:52:05 np0005604215.localdomain podman[101482]: 2026-02-01 08:52:05.106946425 +0000 UTC m=+0.074401332 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., tcib_managed=true, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, version=17.1.13, container_name=nova_compute, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:52:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:52:05 np0005604215.localdomain podman[101485]: 2026-02-01 08:52:05.156182971 +0000 UTC m=+0.119102187 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, build-date=2026-01-12T23:07:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, url=https://www.redhat.com)
Feb 01 08:52:05 np0005604215.localdomain podman[101482]: 2026-02-01 08:52:05.163765448 +0000 UTC m=+0.131220335 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13)
Feb 01 08:52:05 np0005604215.localdomain podman[101466]: 2026-02-01 08:52:05.179406945 +0000 UTC m=+0.225748773 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, tcib_managed=true, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, architecture=x86_64, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, io.openshift.expose-services=, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team)
Feb 01 08:52:05 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:52:05 np0005604215.localdomain podman[101485]: 2026-02-01 08:52:05.189870492 +0000 UTC m=+0.152789698 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, config_id=tripleo_step4, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-type=git, distribution-scope=public, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 01 08:52:05 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:52:05 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:52:05 np0005604215.localdomain podman[101526]: 2026-02-01 08:52:05.250794973 +0000 UTC m=+0.092896229 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, tcib_managed=true, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ceilometer_agent_ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, release=1766032510, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:30Z, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64)
Feb 01 08:52:05 np0005604215.localdomain podman[101526]: 2026-02-01 08:52:05.28277781 +0000 UTC m=+0.124879116 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, io.buildah.version=1.41.5, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, tcib_managed=true)
Feb 01 08:52:05 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:52:07 np0005604215.localdomain sudo[101568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:52:07 np0005604215.localdomain sudo[101568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:52:07 np0005604215.localdomain sudo[101568]: pam_unix(sudo:session): session closed for user root
Feb 01 08:52:07 np0005604215.localdomain sudo[101583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:52:07 np0005604215.localdomain sudo[101583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:52:08 np0005604215.localdomain sudo[101583]: pam_unix(sudo:session): session closed for user root
Feb 01 08:52:10 np0005604215.localdomain sudo[101630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:52:10 np0005604215.localdomain sudo[101630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:52:10 np0005604215.localdomain sudo[101630]: pam_unix(sudo:session): session closed for user root
Feb 01 08:52:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:52:11 np0005604215.localdomain podman[101645]: 2026-02-01 08:52:11.862247339 +0000 UTC m=+0.079016326 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, version=17.1.13, maintainer=OpenStack TripleO Team, container_name=nova_migration_target, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, vendor=Red Hat, Inc., batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.openshift.expose-services=)
Feb 01 08:52:12 np0005604215.localdomain podman[101645]: 2026-02-01 08:52:12.231082966 +0000 UTC m=+0.447851993 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-type=git, distribution-scope=public, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, release=1766032510, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-compute, container_name=nova_migration_target, maintainer=OpenStack TripleO Team, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:52:12 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:52:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:52:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:52:13 np0005604215.localdomain podman[101669]: 2026-02-01 08:52:13.87231138 +0000 UTC m=+0.089398880 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:52:13 np0005604215.localdomain podman[101669]: 2026-02-01 08:52:13.915754996 +0000 UTC m=+0.132842426 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, build-date=2026-01-12T22:36:40Z, distribution-scope=public, 
org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container)
Feb 01 08:52:13 np0005604215.localdomain podman[101669]: unhealthy
Feb 01 08:52:13 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:52:13 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:52:13 np0005604215.localdomain podman[101668]: 2026-02-01 08:52:13.938182126 +0000 UTC m=+0.157043861 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, architecture=x86_64)
Feb 01 08:52:13 np0005604215.localdomain podman[101668]: 2026-02-01 08:52:13.947528017 +0000 UTC m=+0.166389802 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.buildah.version=1.41.5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, distribution-scope=public, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true)
Feb 01 08:52:13 np0005604215.localdomain podman[101668]: unhealthy
Feb 01 08:52:13 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:52:13 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:52:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:52:26 np0005604215.localdomain podman[101707]: 2026-02-01 08:52:26.86974012 +0000 UTC m=+0.084998593 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, vcs-type=git, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, build-date=2026-01-12T22:10:14Z, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, managed_by=tripleo_ansible, config_id=tripleo_step1, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13)
Feb 01 08:52:27 np0005604215.localdomain podman[101707]: 2026-02-01 08:52:27.062928167 +0000 UTC m=+0.278186630 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, config_id=tripleo_step1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, vcs-type=git, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, 
version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:52:27 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:52:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:52:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:52:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:52:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:52:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:52:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:52:35 np0005604215.localdomain podman[101736]: 2026-02-01 08:52:35.886214019 +0000 UTC m=+0.100180626 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 cron, 
distribution-scope=public, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, release=1766032510, name=rhosp-rhel9/openstack-cron, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team)
Feb 01 08:52:35 np0005604215.localdomain systemd[1]: tmp-crun.5skrqR.mount: Deactivated successfully.
Feb 01 08:52:35 np0005604215.localdomain podman[101736]: 2026-02-01 08:52:35.973599765 +0000 UTC m=+0.187566372 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, config_id=tripleo_step4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, build-date=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, architecture=x86_64, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:52:35 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:52:35 np0005604215.localdomain podman[101739]: 2026-02-01 08:52:35.996138219 +0000 UTC m=+0.202648634 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, distribution-scope=public, io.openshift.expose-services=, vcs-type=git, container_name=ceilometer_agent_compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, build-date=2026-01-12T23:07:47Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, com.redhat.component=openstack-ceilometer-compute-container)
Feb 01 08:52:36 np0005604215.localdomain podman[101738]: 2026-02-01 08:52:36.041187004 +0000 UTC m=+0.250558429 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_id=tripleo_step3, container_name=iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, vcs-type=git, distribution-scope=public, release=1766032510, url=https://www.redhat.com, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 01 08:52:36 np0005604215.localdomain podman[101739]: 2026-02-01 08:52:36.053951522 +0000 UTC m=+0.260461977 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-compute-container)
Feb 01 08:52:36 np0005604215.localdomain podman[101750]: 2026-02-01 08:52:35.959383772 +0000 UTC m=+0.158953450 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, release=1766032510, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, vendor=Red Hat, Inc., container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64)
Feb 01 08:52:36 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:52:36 np0005604215.localdomain podman[101738]: 2026-02-01 08:52:36.07503806 +0000 UTC m=+0.284409535 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, architecture=x86_64, vcs-ref=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, container_name=iscsid, distribution-scope=public, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_id=tripleo_step3, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, version=17.1.13)
Feb 01 08:52:36 np0005604215.localdomain podman[101750]: 2026-02-01 08:52:36.089242983 +0000 UTC m=+0.288812621 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3)
Feb 01 08:52:36 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:52:36 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:52:36 np0005604215.localdomain podman[101737]: 2026-02-01 08:52:35.958576236 +0000 UTC m=+0.171931155 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, container_name=nova_compute, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, architecture=x86_64, distribution-scope=public, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, config_id=tripleo_step5, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:52:36 np0005604215.localdomain podman[101737]: 2026-02-01 08:52:36.145782627 +0000 UTC m=+0.359137506 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, architecture=x86_64, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, config_id=tripleo_step5, version=17.1.13, container_name=nova_compute, batch=17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Feb 01 08:52:36 np0005604215.localdomain podman[101745]: 2026-02-01 08:52:36.153624752 +0000 UTC m=+0.353136069 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-type=git, container_name=ceilometer_agent_ipmi, url=https://www.redhat.com, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z)
Feb 01 08:52:36 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:52:36 np0005604215.localdomain podman[101745]: 2026-02-01 08:52:36.187610152 +0000 UTC m=+0.387121509 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, batch=17.1_20260112.1, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:07:30Z, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., release=1766032510, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack 
TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, url=https://www.redhat.com, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi)
Feb 01 08:52:36 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:52:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:52:42 np0005604215.localdomain systemd[1]: tmp-crun.dLjBUN.mount: Deactivated successfully.
Feb 01 08:52:42 np0005604215.localdomain podman[101874]: 2026-02-01 08:52:42.864501681 +0000 UTC m=+0.083181786 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step4, container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:52:43 np0005604215.localdomain podman[101874]: 2026-02-01 08:52:43.231717428 +0000 UTC m=+0.450397513 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-nova-compute, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com)
Feb 01 08:52:43 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:52:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:52:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:52:44 np0005604215.localdomain podman[101899]: 2026-02-01 08:52:44.85750274 +0000 UTC m=+0.070636845 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, tcib_managed=true, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, version=17.1.13, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn)
Feb 01 08:52:44 np0005604215.localdomain podman[101899]: 2026-02-01 08:52:44.874756208 +0000 UTC m=+0.087890323 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, url=https://www.redhat.com, container_name=ovn_metadata_agent, config_id=tripleo_step4, release=1766032510, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc.)
Feb 01 08:52:44 np0005604215.localdomain podman[101899]: unhealthy
Feb 01 08:52:44 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:52:44 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:52:44 np0005604215.localdomain podman[101900]: 2026-02-01 08:52:44.927522464 +0000 UTC m=+0.134639651 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, distribution-scope=public, release=1766032510, vcs-type=git, url=https://www.redhat.com, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team)
Feb 01 08:52:44 np0005604215.localdomain podman[101900]: 2026-02-01 08:52:44.970830815 +0000 UTC m=+0.177948052 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, 
architecture=x86_64, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:52:44 np0005604215.localdomain podman[101900]: unhealthy
Feb 01 08:52:44 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:52:44 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:52:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:52:57 np0005604215.localdomain systemd[1]: tmp-crun.BfpgYH.mount: Deactivated successfully.
Feb 01 08:52:57 np0005604215.localdomain podman[101938]: 2026-02-01 08:52:57.880834135 +0000 UTC m=+0.090100952 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, architecture=x86_64, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.buildah.version=1.41.5, vcs-type=git, 
distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, version=17.1.13, name=rhosp-rhel9/openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:52:58 np0005604215.localdomain podman[101938]: 2026-02-01 08:52:58.081597229 +0000 UTC m=+0.290863986 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, build-date=2026-01-12T22:10:14Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.expose-services=, 
com.redhat.component=openstack-qdrouterd-container, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, vcs-type=git, batch=17.1_20260112.1, distribution-scope=public)
Feb 01 08:52:58 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:53:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:53:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:53:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:53:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:53:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:53:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:53:06 np0005604215.localdomain systemd[1]: tmp-crun.ZymYXa.mount: Deactivated successfully.
Feb 01 08:53:06 np0005604215.localdomain podman[101968]: 2026-02-01 08:53:06.883984399 +0000 UTC m=+0.098153733 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., batch=17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, release=1766032510, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:53:06 np0005604215.localdomain systemd[1]: tmp-crun.P4A0YL.mount: Deactivated successfully.
Feb 01 08:53:06 np0005604215.localdomain podman[101968]: 2026-02-01 08:53:06.899940366 +0000 UTC m=+0.114109800 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, tcib_managed=true, container_name=nova_compute, distribution-scope=public, architecture=x86_64, config_id=tripleo_step5, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': 
['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Feb 01 08:53:06 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:53:06 np0005604215.localdomain podman[101982]: 2026-02-01 08:53:06.932673448 +0000 UTC m=+0.128764429 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, vendor=Red Hat, Inc., io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:53:06 np0005604215.localdomain podman[101988]: 2026-02-01 08:53:06.970752806 +0000 UTC m=+0.166918429 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, container_name=collectd, name=rhosp-rhel9/openstack-collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:15Z, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git)
Feb 01 08:53:06 np0005604215.localdomain podman[101981]: 2026-02-01 08:53:06.97632226 +0000 UTC m=+0.176263890 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vendor=Red Hat, Inc., version=17.1.13, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
org.opencontainers.image.created=2026-01-12T23:07:47Z, vcs-type=git, build-date=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 01 08:53:06 np0005604215.localdomain podman[101988]: 2026-02-01 08:53:06.978470336 +0000 UTC m=+0.174635959 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, com.redhat.component=openstack-collectd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, distribution-scope=public, io.openshift.expose-services=, name=rhosp-rhel9/openstack-collectd, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:53:06 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:53:07 np0005604215.localdomain podman[101982]: 2026-02-01 08:53:07.003254639 +0000 UTC m=+0.199345610 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, url=https://www.redhat.com, vcs-type=git, distribution-scope=public, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, 
managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:53:07 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:53:07 np0005604215.localdomain podman[101967]: 2026-02-01 08:53:07.04171646 +0000 UTC m=+0.255331207 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com)
Feb 01 08:53:07 np0005604215.localdomain podman[101969]: 2026-02-01 08:53:06.901697781 +0000 UTC m=+0.106574646 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, distribution-scope=public, config_id=tripleo_step3, batch=17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container)
Feb 01 08:53:07 np0005604215.localdomain podman[101981]: 2026-02-01 08:53:07.071082145 +0000 UTC m=+0.271023765 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true, batch=17.1_20260112.1, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, distribution-scope=public, com.redhat.component=openstack-ceilometer-compute-container, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:47Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute)
Feb 01 08:53:07 np0005604215.localdomain podman[101967]: 2026-02-01 08:53:07.078957531 +0000 UTC m=+0.292572328 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, com.redhat.component=openstack-cron-container, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, distribution-scope=public, tcib_managed=true, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, batch=17.1_20260112.1, version=17.1.13, managed_by=tripleo_ansible, vendor=Red Hat, Inc.)
Feb 01 08:53:07 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:53:07 np0005604215.localdomain podman[101969]: 2026-02-01 08:53:07.086548488 +0000 UTC m=+0.291425323 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:53:07 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:53:07 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:53:10 np0005604215.localdomain sudo[102106]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:53:10 np0005604215.localdomain sudo[102106]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:53:10 np0005604215.localdomain sudo[102106]: pam_unix(sudo:session): session closed for user root
Feb 01 08:53:10 np0005604215.localdomain sudo[102121]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 08:53:10 np0005604215.localdomain sudo[102121]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:53:11 np0005604215.localdomain sudo[102121]: pam_unix(sudo:session): session closed for user root
Feb 01 08:53:11 np0005604215.localdomain sudo[102156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:53:11 np0005604215.localdomain sudo[102156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:53:11 np0005604215.localdomain sudo[102156]: pam_unix(sudo:session): session closed for user root
Feb 01 08:53:11 np0005604215.localdomain sudo[102171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:53:11 np0005604215.localdomain sudo[102171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:53:12 np0005604215.localdomain sudo[102171]: pam_unix(sudo:session): session closed for user root
Feb 01 08:53:12 np0005604215.localdomain sudo[102219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:53:12 np0005604215.localdomain sudo[102219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:53:12 np0005604215.localdomain sudo[102219]: pam_unix(sudo:session): session closed for user root
Feb 01 08:53:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:53:13 np0005604215.localdomain podman[102234]: 2026-02-01 08:53:13.874029337 +0000 UTC m=+0.088796811 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, vendor=Red Hat, Inc., release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.buildah.version=1.41.5, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:53:14 np0005604215.localdomain podman[102234]: 2026-02-01 08:53:14.248730968 +0000 UTC m=+0.463498422 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, maintainer=OpenStack TripleO Team, architecture=x86_64, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 
17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, vcs-type=git)
Feb 01 08:53:14 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:53:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:53:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:53:15 np0005604215.localdomain podman[102257]: 2026-02-01 08:53:15.87099711 +0000 UTC m=+0.086660075 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, version=17.1.13, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 01 08:53:15 np0005604215.localdomain podman[102258]: 2026-02-01 08:53:15.914443615 +0000 UTC m=+0.102716815 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vendor=Red Hat, Inc., config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']})
Feb 01 08:53:15 np0005604215.localdomain podman[102257]: 2026-02-01 08:53:15.955858987 +0000 UTC m=+0.171522022 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:53:15 np0005604215.localdomain podman[102257]: unhealthy
Feb 01 08:53:15 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:53:15 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:53:16 np0005604215.localdomain podman[102258]: 2026-02-01 08:53:16.009130669 +0000 UTC m=+0.197403829 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=17.1.13, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, release=1766032510, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:53:16 np0005604215.localdomain podman[102258]: unhealthy
Feb 01 08:53:16 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:53:16 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:53:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:53:28 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:53:28 np0005604215.localdomain recover_tripleo_nova_virtqemud[102303]: 62016
Feb 01 08:53:28 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:53:28 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:53:28 np0005604215.localdomain systemd[1]: tmp-crun.EaOd1K.mount: Deactivated successfully.
Feb 01 08:53:28 np0005604215.localdomain podman[102296]: 2026-02-01 08:53:28.875049467 +0000 UTC m=+0.089416271 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, io.buildah.version=1.41.5, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, version=17.1.13, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git)
Feb 01 08:53:29 np0005604215.localdomain podman[102296]: 2026-02-01 08:53:29.068713129 +0000 UTC m=+0.283079853 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, tcib_managed=true, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:10:14Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=17.1.13, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 
qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, container_name=metrics_qdr, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team)
Feb 01 08:53:29 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:53:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:53:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:53:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:53:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:53:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:53:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:53:37 np0005604215.localdomain podman[102331]: 2026-02-01 08:53:37.884715954 +0000 UTC m=+0.084904150 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, build-date=2026-01-12T23:07:47Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, config_id=tripleo_step4, tcib_managed=true)
Feb 01 08:53:37 np0005604215.localdomain systemd[1]: tmp-crun.7UMhZT.mount: Deactivated successfully.
Feb 01 08:53:37 np0005604215.localdomain podman[102329]: 2026-02-01 08:53:37.939238504 +0000 UTC m=+0.143091215 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step5, batch=17.1_20260112.1, version=17.1.13, build-date=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git)
Feb 01 08:53:37 np0005604215.localdomain podman[102328]: 2026-02-01 08:53:37.988522942 +0000 UTC m=+0.192965971 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, 
name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, container_name=logrotate_crond, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, tcib_managed=true, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container)
Feb 01 08:53:37 np0005604215.localdomain podman[102329]: 2026-02-01 08:53:37.993713444 +0000 UTC m=+0.197566165 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.buildah.version=1.41.5, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13)
Feb 01 08:53:38 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:53:38 np0005604215.localdomain podman[102328]: 2026-02-01 08:53:38.021631116 +0000 UTC m=+0.226074125 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, io.buildah.version=1.41.5, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, container_name=logrotate_crond, description=Red 
Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, architecture=x86_64, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:53:38 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:53:38 np0005604215.localdomain podman[102333]: 2026-02-01 08:53:38.038841902 +0000 UTC m=+0.233499976 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, version=17.1.13, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, batch=17.1_20260112.1, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, config_id=tripleo_step4, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi)
Feb 01 08:53:38 np0005604215.localdomain podman[102331]: 2026-02-01 08:53:38.041858946 +0000 UTC m=+0.242047142 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, tcib_managed=true, com.redhat.component=openstack-ceilometer-compute-container, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:47Z, build-date=2026-01-12T23:07:47Z, version=17.1.13, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:53:38 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:53:38 np0005604215.localdomain podman[102333]: 2026-02-01 08:53:38.091721952 +0000 UTC m=+0.286380076 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, managed_by=tripleo_ansible, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, container_name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ceilometer-ipmi, tcib_managed=true, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:53:38 np0005604215.localdomain podman[102343]: 2026-02-01 08:53:38.101543988 +0000 UTC m=+0.291855146 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, url=https://www.redhat.com, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.component=openstack-collectd-container, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, config_id=tripleo_step3, version=17.1.13, distribution-scope=public, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:53:38 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:53:38 np0005604215.localdomain podman[102343]: 2026-02-01 08:53:38.116519886 +0000 UTC m=+0.306831014 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-type=git, io.buildah.version=1.41.5, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-collectd-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com)
Feb 01 08:53:38 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:53:38 np0005604215.localdomain podman[102330]: 2026-02-01 08:53:38.201367113 +0000 UTC m=+0.399216117 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, 
name=rhosp-rhel9/openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-type=git, config_id=tripleo_step3)
Feb 01 08:53:38 np0005604215.localdomain podman[102330]: 2026-02-01 08:53:38.213636505 +0000 UTC m=+0.411485509 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, config_id=tripleo_step3, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.openshift.expose-services=, distribution-scope=public, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, release=1766032510, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, name=rhosp-rhel9/openstack-iscsid, io.buildah.version=1.41.5, vcs-ref=705339545363fec600102567c4e923938e0f43b3, architecture=x86_64, version=17.1.13, container_name=iscsid, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 iscsid, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 01 08:53:38 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:53:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:53:44 np0005604215.localdomain podman[102461]: 2026-02-01 08:53:44.843546168 +0000 UTC m=+0.058903439 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, container_name=nova_migration_target, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64, distribution-scope=public, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, version=17.1.13)
Feb 01 08:53:45 np0005604215.localdomain podman[102461]: 2026-02-01 08:53:45.223691918 +0000 UTC m=+0.439049139 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, 
url=https://www.redhat.com, version=17.1.13, vendor=Red Hat, Inc., container_name=nova_migration_target, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, managed_by=tripleo_ansible)
Feb 01 08:53:45 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:53:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:53:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:53:46 np0005604215.localdomain systemd[1]: tmp-crun.DosN8y.mount: Deactivated successfully.
Feb 01 08:53:46 np0005604215.localdomain podman[102486]: 2026-02-01 08:53:46.867811572 +0000 UTC m=+0.083583308 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, architecture=x86_64, build-date=2026-01-12T22:36:40Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, container_name=ovn_controller, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, com.redhat.component=openstack-ovn-controller-container, 
distribution-scope=public, version=17.1.13, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:53:46 np0005604215.localdomain systemd[1]: tmp-crun.ffdjGP.mount: Deactivated successfully.
Feb 01 08:53:46 np0005604215.localdomain podman[102485]: 2026-02-01 08:53:46.922831938 +0000 UTC m=+0.138233713 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, distribution-scope=public, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:56:19Z, architecture=x86_64, config_id=tripleo_step4)
Feb 01 08:53:46 np0005604215.localdomain podman[102486]: 2026-02-01 08:53:46.936843336 +0000 UTC m=+0.152615042 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-type=git, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, container_name=ovn_controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:53:46 np0005604215.localdomain podman[102486]: unhealthy
Feb 01 08:53:46 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:53:46 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:53:46 np0005604215.localdomain podman[102485]: 2026-02-01 08:53:46.967683598 +0000 UTC m=+0.183085403 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, tcib_managed=true, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, distribution-scope=public, io.buildah.version=1.41.5)
Feb 01 08:53:46 np0005604215.localdomain podman[102485]: unhealthy
Feb 01 08:53:46 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:53:46 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:53:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:53:59 np0005604215.localdomain podman[102527]: 2026-02-01 08:53:59.856847381 +0000 UTC m=+0.075992502 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, build-date=2026-01-12T22:10:14Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_id=tripleo_step1, managed_by=tripleo_ansible, container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.expose-services=, com.redhat.component=openstack-qdrouterd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, architecture=x86_64, vendor=Red Hat, Inc., tcib_managed=true, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:54:00 np0005604215.localdomain podman[102527]: 2026-02-01 08:54:00.019068932 +0000 UTC m=+0.238214093 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, url=https://www.redhat.com, architecture=x86_64, tcib_managed=true, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, batch=17.1_20260112.1, com.redhat.component=openstack-qdrouterd-container, io.buildah.version=1.41.5, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', 
'/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:54:00 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:54:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:54:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:54:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:54:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:54:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:54:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:54:08 np0005604215.localdomain systemd[1]: tmp-crun.527m4V.mount: Deactivated successfully.
Feb 01 08:54:08 np0005604215.localdomain podman[102563]: 2026-02-01 08:54:08.886100257 +0000 UTC m=+0.090457353 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, version=17.1.13, distribution-scope=public, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 01 08:54:08 np0005604215.localdomain systemd[1]: tmp-crun.WTTePs.mount: Deactivated successfully.
Feb 01 08:54:08 np0005604215.localdomain podman[102557]: 2026-02-01 08:54:08.941362282 +0000 UTC m=+0.153493320 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, tcib_managed=true, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', 
'/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:54:08 np0005604215.localdomain podman[102558]: 2026-02-01 08:54:08.957063182 +0000 UTC m=+0.164894716 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.buildah.version=1.41.5, tcib_managed=true, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, 
distribution-scope=public, version=17.1.13, io.openshift.expose-services=, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, maintainer=OpenStack TripleO Team, vcs-type=git)
Feb 01 08:54:08 np0005604215.localdomain podman[102557]: 2026-02-01 08:54:08.973522864 +0000 UTC m=+0.185653902 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, config_id=tripleo_step5, container_name=nova_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, tcib_managed=true, release=1766032510)
Feb 01 08:54:08 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:54:08 np0005604215.localdomain podman[102563]: 2026-02-01 08:54:08.987358646 +0000 UTC m=+0.191715742 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T23:07:47Z, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', 
'/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:54:08 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:54:09 np0005604215.localdomain podman[102558]: 2026-02-01 08:54:09.040506904 +0000 UTC m=+0.248338488 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, container_name=iscsid, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-iscsid, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3)
Feb 01 08:54:09 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:54:09 np0005604215.localdomain podman[102576]: 2026-02-01 08:54:09.041722123 +0000 UTC m=+0.236756028 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.5, architecture=x86_64, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-type=git, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3)
Feb 01 08:54:09 np0005604215.localdomain podman[102556]: 2026-02-01 08:54:09.143343813 +0000 UTC m=+0.357186384 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, org.opencontainers.image.created=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-cron, summary=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc.)
Feb 01 08:54:09 np0005604215.localdomain podman[102556]: 2026-02-01 08:54:09.154498981 +0000 UTC m=+0.368341572 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, tcib_managed=true, container_name=logrotate_crond, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, config_id=tripleo_step4, name=rhosp-rhel9/openstack-cron, build-date=2026-01-12T22:10:15Z)
Feb 01 08:54:09 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:54:09 np0005604215.localdomain podman[102576]: 2026-02-01 08:54:09.175510197 +0000 UTC m=+0.370544172 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, distribution-scope=public, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-collectd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, container_name=collectd, managed_by=tripleo_ansible, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3, release=1766032510, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:54:09 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:54:09 np0005604215.localdomain podman[102570]: 2026-02-01 08:54:09.095048447 +0000 UTC m=+0.295047137 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, io.k8s.description=Red Hat 
OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, build-date=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi)
Feb 01 08:54:09 np0005604215.localdomain podman[102570]: 2026-02-01 08:54:09.227814348 +0000 UTC m=+0.427812998 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, container_name=ceilometer_agent_ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vendor=Red Hat, Inc., config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:30Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, com.redhat.component=openstack-ceilometer-ipmi-container)
Feb 01 08:54:09 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:54:09 np0005604215.localdomain systemd[1]: tmp-crun.MXBKg7.mount: Deactivated successfully.
Feb 01 08:54:12 np0005604215.localdomain sudo[102695]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:54:12 np0005604215.localdomain sudo[102695]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:54:12 np0005604215.localdomain sudo[102695]: pam_unix(sudo:session): session closed for user root
Feb 01 08:54:12 np0005604215.localdomain sudo[102710]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:54:12 np0005604215.localdomain sudo[102710]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:54:13 np0005604215.localdomain sudo[102710]: pam_unix(sudo:session): session closed for user root
Feb 01 08:54:14 np0005604215.localdomain sudo[102756]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:54:14 np0005604215.localdomain sudo[102756]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:54:14 np0005604215.localdomain sudo[102756]: pam_unix(sudo:session): session closed for user root
Feb 01 08:54:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:54:15 np0005604215.localdomain podman[102771]: 2026-02-01 08:54:15.834161775 +0000 UTC m=+0.051847218 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, vcs-type=git, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, 
architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, config_id=tripleo_step4)
Feb 01 08:54:16 np0005604215.localdomain podman[102771]: 2026-02-01 08:54:16.23691027 +0000 UTC m=+0.454595723 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, version=17.1.13, io.openshift.expose-services=, managed_by=tripleo_ansible, io.buildah.version=1.41.5, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., 
build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, tcib_managed=true, release=1766032510)
Feb 01 08:54:16 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:54:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:54:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:54:17 np0005604215.localdomain podman[102795]: 2026-02-01 08:54:17.879964882 +0000 UTC m=+0.083883318 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, architecture=x86_64, container_name=ovn_metadata_agent, config_id=tripleo_step4, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, tcib_managed=true, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:54:17 np0005604215.localdomain podman[102796]: 2026-02-01 08:54:17.932009475 +0000 UTC m=+0.132360940 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, config_id=tripleo_step4, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 
17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:54:17 np0005604215.localdomain podman[102795]: 2026-02-01 08:54:17.951208734 +0000 UTC m=+0.155127180 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, batch=17.1_20260112.1, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', 
'/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:54:17 np0005604215.localdomain podman[102796]: 2026-02-01 08:54:17.951768621 +0000 UTC m=+0.152120056 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, 
com.redhat.component=openstack-ovn-controller-container, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., distribution-scope=public, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, build-date=2026-01-12T22:36:40Z)
Feb 01 08:54:17 np0005604215.localdomain podman[102795]: unhealthy
Feb 01 08:54:17 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:54:17 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:54:18 np0005604215.localdomain podman[102796]: unhealthy
Feb 01 08:54:18 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:54:18 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:54:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:54:30 np0005604215.localdomain systemd[1]: tmp-crun.78x7K7.mount: Deactivated successfully.
Feb 01 08:54:30 np0005604215.localdomain podman[102834]: 2026-02-01 08:54:30.872743116 +0000 UTC m=+0.083714053 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, distribution-scope=public, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:54:31 np0005604215.localdomain podman[102834]: 2026-02-01 08:54:31.069738671 +0000 UTC m=+0.280709608 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, release=1766032510, container_name=metrics_qdr, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:14Z)
Feb 01 08:54:31 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:54:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:54:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:54:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:54:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:54:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:54:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:54:39 np0005604215.localdomain systemd[1]: tmp-crun.I7yv3U.mount: Deactivated successfully.
Feb 01 08:54:39 np0005604215.localdomain podman[102864]: 2026-02-01 08:54:39.877727804 +0000 UTC m=+0.095645304 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, release=1766032510, vendor=Red Hat, Inc., architecture=x86_64, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.component=openstack-cron-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, summary=Red 
Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, tcib_managed=true, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1)
Feb 01 08:54:39 np0005604215.localdomain podman[102866]: 2026-02-01 08:54:39.938772669 +0000 UTC m=+0.146572093 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, name=rhosp-rhel9/openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, container_name=iscsid, maintainer=OpenStack TripleO Team, version=17.1.13, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, org.opencontainers.image.created=2026-01-12T22:34:43Z, vcs-ref=705339545363fec600102567c4e923938e0f43b3, release=1766032510, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:54:39 np0005604215.localdomain podman[102866]: 2026-02-01 08:54:39.95068363 +0000 UTC m=+0.158483064 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, architecture=x86_64, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', 
'/var/lib/iscsi:/var/lib/iscsi:z']}, build-date=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, version=17.1.13, config_id=tripleo_step3, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid)
Feb 01 08:54:39 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:54:39 np0005604215.localdomain podman[102864]: 2026-02-01 08:54:39.966070111 +0000 UTC m=+0.183987611 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, config_id=tripleo_step4, container_name=logrotate_crond, io.openshift.expose-services=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, release=1766032510, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-cron-container, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, 
cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, batch=17.1_20260112.1)
Feb 01 08:54:39 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:54:40 np0005604215.localdomain podman[102878]: 2026-02-01 08:54:40.038249602 +0000 UTC m=+0.241543776 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, tcib_managed=true, release=1766032510, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=ceilometer_agent_ipmi, config_id=tripleo_step4, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13)
Feb 01 08:54:40 np0005604215.localdomain podman[102878]: 2026-02-01 08:54:40.089501592 +0000 UTC m=+0.292795756 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, tcib_managed=true, vendor=Red Hat, Inc., config_id=tripleo_step4, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, release=1766032510, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:54:40 np0005604215.localdomain podman[102865]: 2026-02-01 08:54:40.089471091 +0000 UTC m=+0.302139698 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, tcib_managed=true, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, url=https://www.redhat.com, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 
'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, io.openshift.expose-services=)
Feb 01 08:54:40 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Deactivated successfully.
Feb 01 08:54:40 np0005604215.localdomain podman[102872]: 2026-02-01 08:54:40.143682282 +0000 UTC m=+0.349339750 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, config_id=tripleo_step4, managed_by=tripleo_ansible, container_name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, architecture=x86_64, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, release=1766032510, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vendor=Red Hat, Inc., io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, distribution-scope=public, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13)
Feb 01 08:54:40 np0005604215.localdomain podman[102872]: 2026-02-01 08:54:40.169764555 +0000 UTC m=+0.375422043 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, url=https://www.redhat.com, distribution-scope=public, vendor=Red Hat, Inc., container_name=ceilometer_agent_compute, io.buildah.version=1.41.5, architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:47Z, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.openshift.expose-services=, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, version=17.1.13, build-date=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:54:40 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:54:40 np0005604215.localdomain podman[102865]: 2026-02-01 08:54:40.218460895 +0000 UTC m=+0.431129992 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, io.openshift.expose-services=, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, vendor=Red Hat, Inc., version=17.1.13, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, build-date=2026-01-12T23:32:04Z, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible)
Feb 01 08:54:40 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Deactivated successfully.
Feb 01 08:54:40 np0005604215.localdomain podman[102879]: 2026-02-01 08:54:40.293432444 +0000 UTC m=+0.491322360 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.buildah.version=1.41.5, vcs-type=git, version=17.1.13, container_name=collectd, build-date=2026-01-12T22:10:15Z, batch=17.1_20260112.1, com.redhat.component=openstack-collectd-container, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:54:40 np0005604215.localdomain podman[102879]: 2026-02-01 08:54:40.329637873 +0000 UTC m=+0.527527829 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, container_name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, tcib_managed=true, url=https://www.redhat.com, architecture=x86_64, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., config_id=tripleo_step3, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, vcs-type=git, distribution-scope=public, managed_by=tripleo_ansible)
Feb 01 08:54:40 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:54:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:54:46 np0005604215.localdomain podman[102998]: 2026-02-01 08:54:46.866303837 +0000 UTC m=+0.080374919 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=nova_migration_target, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public)
Feb 01 08:54:47 np0005604215.localdomain podman[102998]: 2026-02-01 08:54:47.189702326 +0000 UTC m=+0.403773418 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, tcib_managed=true, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, release=1766032510, summary=Red Hat 
OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:54:47 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:54:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:54:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:54:48 np0005604215.localdomain podman[103021]: 2026-02-01 08:54:48.869598426 +0000 UTC m=+0.084794356 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., url=https://www.redhat.com, container_name=ovn_metadata_agent, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, architecture=x86_64, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, release=1766032510)
Feb 01 08:54:48 np0005604215.localdomain podman[103021]: 2026-02-01 08:54:48.885655707 +0000 UTC m=+0.100851617 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, tcib_managed=true, release=1766032510, vcs-type=git, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent)
Feb 01 08:54:48 np0005604215.localdomain podman[103021]: unhealthy
Feb 01 08:54:48 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:54:48 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:54:48 np0005604215.localdomain podman[103022]: 2026-02-01 08:54:48.972200338 +0000 UTC m=+0.183831997 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, release=1766032510, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 
openstack-ovn-controller, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:54:48 np0005604215.localdomain podman[103022]: 2026-02-01 08:54:48.989036113 +0000 UTC m=+0.200667772 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, vcs-type=git, io.buildah.version=1.41.5, com.redhat.component=openstack-ovn-controller-container, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:54:48 np0005604215.localdomain podman[103022]: unhealthy
Feb 01 08:54:49 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:54:49 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:55:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:55:01 np0005604215.localdomain podman[103061]: 2026-02-01 08:55:01.842201872 +0000 UTC m=+0.065147795 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, build-date=2026-01-12T22:10:14Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, vcs-type=git, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, release=1766032510, architecture=x86_64, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:55:02 np0005604215.localdomain podman[103061]: 2026-02-01 08:55:02.045943277 +0000 UTC m=+0.268889150 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, architecture=x86_64, io.buildah.version=1.41.5, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-qdrouterd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, container_name=metrics_qdr, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, url=https://www.redhat.com, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, org.opencontainers.image.created=2026-01-12T22:10:14Z, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', 
'/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:14Z)
Feb 01 08:55:02 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:55:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:55:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:55:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:55:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:55:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:55:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:55:10 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:55:10 np0005604215.localdomain recover_tripleo_nova_virtqemud[103124]: 62016
Feb 01 08:55:10 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:55:10 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:55:10 np0005604215.localdomain podman[103089]: 2026-02-01 08:55:10.874524673 +0000 UTC m=+0.083945810 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, architecture=x86_64, batch=17.1_20260112.1, io.buildah.version=1.41.5, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:55:10 np0005604215.localdomain podman[103106]: 2026-02-01 08:55:10.888485339 +0000 UTC m=+0.078263523 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, version=17.1.13, architecture=x86_64, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, 
org.opencontainers.image.created=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, release=1766032510, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, distribution-scope=public, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:55:10 np0005604215.localdomain podman[103088]: 2026-02-01 08:55:10.922018165 +0000 UTC m=+0.133494226 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:10:15Z, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, name=rhosp-rhel9/openstack-cron, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, distribution-scope=public)
Feb 01 08:55:10 np0005604215.localdomain podman[103091]: 2026-02-01 08:55:10.944969851 +0000 UTC m=+0.145812960 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, name=rhosp-rhel9/openstack-ceilometer-compute, build-date=2026-01-12T23:07:47Z, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 
'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']})
Feb 01 08:55:10 np0005604215.localdomain podman[103091]: 2026-02-01 08:55:10.971626803 +0000 UTC m=+0.172469842 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-compute, vcs-type=git, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:07:47Z, com.redhat.component=openstack-ceilometer-compute-container, io.openshift.expose-services=, tcib_managed=true, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5)
Feb 01 08:55:10 np0005604215.localdomain podman[103110]: 2026-02-01 08:55:10.984403121 +0000 UTC m=+0.175002110 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, 
version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-collectd-container, tcib_managed=true, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, vcs-type=git, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z)
Feb 01 08:55:11 np0005604215.localdomain podman[103088]: 2026-02-01 08:55:11.004695274 +0000 UTC m=+0.216171385 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vendor=Red Hat, 
Inc., maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, io.openshift.expose-services=, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=logrotate_crond, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 cron)
Feb 01 08:55:11 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:55:11 np0005604215.localdomain podman[103110]: 2026-02-01 08:55:11.017778763 +0000 UTC m=+0.208377782 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=collectd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, managed_by=tripleo_ansible, vcs-type=git, url=https://www.redhat.com, tcib_managed=true, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:55:11 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:55:11 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:55:11 np0005604215.localdomain podman[103106]: 2026-02-01 08:55:11.060476074 +0000 UTC m=+0.250254278 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, version=17.1.13, 
architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510)
Feb 01 08:55:11 np0005604215.localdomain podman[103106]: unhealthy
Feb 01 08:55:11 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:55:11 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Failed with result 'exit-code'.
Feb 01 08:55:11 np0005604215.localdomain podman[103090]: 2026-02-01 08:55:11.144510586 +0000 UTC m=+0.355043227 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, name=rhosp-rhel9/openstack-iscsid, batch=17.1_20260112.1, io.openshift.expose-services=, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, config_id=tripleo_step3, tcib_managed=true, version=17.1.13, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, vcs-type=git, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, container_name=iscsid)
Feb 01 08:55:11 np0005604215.localdomain podman[103089]: 2026-02-01 08:55:11.169932269 +0000 UTC m=+0.379353456 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., io.openshift.expose-services=, config_id=tripleo_step5, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, container_name=nova_compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:55:11 np0005604215.localdomain podman[103089]: unhealthy
Feb 01 08:55:11 np0005604215.localdomain podman[103090]: 2026-02-01 08:55:11.17954681 +0000 UTC m=+0.390079491 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, build-date=2026-01-12T22:34:43Z, vcs-type=git, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.openshift.expose-services=, version=17.1.13, distribution-scope=public, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, container_name=iscsid, config_id=tripleo_step3, tcib_managed=true, com.redhat.component=openstack-iscsid-container)
Feb 01 08:55:11 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:55:11 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:55:11 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:55:14 np0005604215.localdomain sudo[103226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:55:14 np0005604215.localdomain sudo[103226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:55:14 np0005604215.localdomain sudo[103226]: pam_unix(sudo:session): session closed for user root
Feb 01 08:55:14 np0005604215.localdomain sudo[103241]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 08:55:14 np0005604215.localdomain sudo[103241]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:55:15 np0005604215.localdomain podman[103327]: 2026-02-01 08:55:15.330881023 +0000 UTC m=+0.092754254 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, build-date=2025-12-08T17:28:53Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, ceph=True, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc.)
Feb 01 08:55:15 np0005604215.localdomain podman[103327]: 2026-02-01 08:55:15.436072925 +0000 UTC m=+0.197946186 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, release=1764794109, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, version=7, ceph=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4)
Feb 01 08:55:15 np0005604215.localdomain sudo[103241]: pam_unix(sudo:session): session closed for user root
Feb 01 08:55:15 np0005604215.localdomain sudo[103395]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:55:15 np0005604215.localdomain sudo[103395]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:55:15 np0005604215.localdomain sudo[103395]: pam_unix(sudo:session): session closed for user root
Feb 01 08:55:15 np0005604215.localdomain sudo[103410]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:55:15 np0005604215.localdomain sudo[103410]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:55:16 np0005604215.localdomain sudo[103410]: pam_unix(sudo:session): session closed for user root
Feb 01 08:55:17 np0005604215.localdomain sudo[103456]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:55:17 np0005604215.localdomain sudo[103456]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:55:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:55:17 np0005604215.localdomain sudo[103456]: pam_unix(sudo:session): session closed for user root
Feb 01 08:55:17 np0005604215.localdomain systemd[1]: tmp-crun.qz6gQd.mount: Deactivated successfully.
Feb 01 08:55:17 np0005604215.localdomain podman[103471]: 2026-02-01 08:55:17.343769982 +0000 UTC m=+0.093145027 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_migration_target, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, vcs-type=git, release=1766032510, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vendor=Red Hat, Inc., tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:55:17 np0005604215.localdomain podman[103471]: 2026-02-01 08:55:17.723888811 +0000 UTC m=+0.473263876 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, maintainer=OpenStack TripleO Team, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_migration_target, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, url=https://www.redhat.com, vcs-type=git, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true)
Feb 01 08:55:17 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:55:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:55:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:55:19 np0005604215.localdomain podman[103493]: 2026-02-01 08:55:19.861047237 +0000 UTC m=+0.078143199 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, distribution-scope=public, vendor=Red Hat, Inc., config_id=tripleo_step4, version=17.1.13, build-date=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, release=1766032510, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 01 08:55:19 np0005604215.localdomain podman[103493]: 2026-02-01 08:55:19.878748759 +0000 UTC m=+0.095844711 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, build-date=2026-01-12T22:56:19Z, vcs-type=git, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']})
Feb 01 08:55:19 np0005604215.localdomain podman[103493]: unhealthy
Feb 01 08:55:19 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:55:19 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:55:19 np0005604215.localdomain podman[103494]: 2026-02-01 08:55:19.923737793 +0000 UTC m=+0.140290458 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, distribution-scope=public, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, version=17.1.13, managed_by=tripleo_ansible, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 01 08:55:19 np0005604215.localdomain podman[103494]: 2026-02-01 08:55:19.967708855 +0000 UTC m=+0.184261540 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, tcib_managed=true, io.buildah.version=1.41.5, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, config_id=tripleo_step4, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:55:19 np0005604215.localdomain podman[103494]: unhealthy
Feb 01 08:55:19 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:55:19 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:55:23 np0005604215.localdomain sshd[103534]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 08:55:23 np0005604215.localdomain sshd[103534]: Invalid user reza from 85.206.171.113 port 57560
Feb 01 08:55:23 np0005604215.localdomain sshd[103534]: Received disconnect from 85.206.171.113 port 57560:11: Bye Bye [preauth]
Feb 01 08:55:23 np0005604215.localdomain sshd[103534]: Disconnected from invalid user reza 85.206.171.113 port 57560 [preauth]
Feb 01 08:55:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:55:32 np0005604215.localdomain systemd[1]: tmp-crun.b2wV68.mount: Deactivated successfully.
Feb 01 08:55:32 np0005604215.localdomain podman[103536]: 2026-02-01 08:55:32.879899603 +0000 UTC m=+0.093532759 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, container_name=metrics_qdr, architecture=x86_64, batch=17.1_20260112.1, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-type=git, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:55:33 np0005604215.localdomain podman[103536]: 2026-02-01 08:55:33.09809309 +0000 UTC m=+0.311726256 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, tcib_managed=true, version=17.1.13, vendor=Red Hat, Inc., managed_by=tripleo_ansible, vcs-type=git, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, config_id=tripleo_step1, com.redhat.component=openstack-qdrouterd-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, container_name=metrics_qdr, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:55:33 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:55:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:55:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:55:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:55:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:55:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:55:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:55:41 np0005604215.localdomain systemd[1]: tmp-crun.dPT9Ea.mount: Deactivated successfully.
Feb 01 08:55:41 np0005604215.localdomain podman[103566]: 2026-02-01 08:55:41.940705425 +0000 UTC m=+0.142853688 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:32:04Z, distribution-scope=public, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, url=https://www.redhat.com, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, vendor=Red Hat, Inc.)
Feb 01 08:55:41 np0005604215.localdomain podman[103566]: 2026-02-01 08:55:41.958540642 +0000 UTC m=+0.160688915 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, release=1766032510, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, architecture=x86_64)
Feb 01 08:55:41 np0005604215.localdomain podman[103566]: unhealthy
Feb 01 08:55:41 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:55:41 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:55:42 np0005604215.localdomain podman[103567]: 2026-02-01 08:55:42.036208255 +0000 UTC m=+0.236496610 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, build-date=2026-01-12T22:34:43Z, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, tcib_managed=true, 
cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-iscsid-container, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-type=git, distribution-scope=public, io.buildah.version=1.41.5, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3)
Feb 01 08:55:42 np0005604215.localdomain podman[103567]: 2026-02-01 08:55:42.067443429 +0000 UTC m=+0.267731794 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., config_id=tripleo_step3, tcib_managed=true, build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, description=Red Hat OpenStack Platform 17.1 iscsid, container_name=iscsid, com.redhat.component=openstack-iscsid-container, vcs-ref=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, name=rhosp-rhel9/openstack-iscsid, version=17.1.13, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 01 08:55:42 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:55:42 np0005604215.localdomain podman[103574]: 2026-02-01 08:55:42.149269702 +0000 UTC m=+0.341504056 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, architecture=x86_64, com.redhat.component=openstack-ceilometer-ipmi-container, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, distribution-scope=public, release=1766032510, container_name=ceilometer_agent_ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.created=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.expose-services=, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc.)
Feb 01 08:55:42 np0005604215.localdomain podman[103585]: 2026-02-01 08:55:41.916362666 +0000 UTC m=+0.109218499 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=collectd, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, release=1766032510, url=https://www.redhat.com, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, vcs-type=git, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:55:42 np0005604215.localdomain podman[103565]: 2026-02-01 08:55:42.195810894 +0000 UTC m=+0.400295799 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, config_id=tripleo_step4, release=1766032510, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, batch=17.1_20260112.1, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=logrotate_crond, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp 
openstack osp-17.1 openstack-cron, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:15Z)
Feb 01 08:55:42 np0005604215.localdomain podman[103565]: 2026-02-01 08:55:42.209569843 +0000 UTC m=+0.414054728 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, description=Red Hat OpenStack Platform 17.1 cron, 
build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, com.redhat.component=openstack-cron-container, tcib_managed=true, distribution-scope=public, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, url=https://www.redhat.com)
Feb 01 08:55:42 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:55:42 np0005604215.localdomain podman[103574]: 2026-02-01 08:55:42.232357124 +0000 UTC m=+0.424591518 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, release=1766032510, version=17.1.13, url=https://www.redhat.com, build-date=2026-01-12T23:07:30Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, vcs-type=git, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, name=rhosp-rhel9/openstack-ceilometer-ipmi)
Feb 01 08:55:42 np0005604215.localdomain podman[103574]: unhealthy
Feb 01 08:55:42 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:55:42 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Failed with result 'exit-code'.
Feb 01 08:55:42 np0005604215.localdomain podman[103585]: 2026-02-01 08:55:42.256853318 +0000 UTC m=+0.449709181 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, com.redhat.component=openstack-collectd-container, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., managed_by=tripleo_ansible)
Feb 01 08:55:42 np0005604215.localdomain podman[103568]: 2026-02-01 08:55:42.213916619 +0000 UTC m=+0.408069623 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.buildah.version=1.41.5, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, container_name=ceilometer_agent_compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.expose-services=, tcib_managed=true, url=https://www.redhat.com, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ceilometer-compute, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:55:42 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:55:42 np0005604215.localdomain podman[103568]: 2026-02-01 08:55:42.296699802 +0000 UTC m=+0.490852816 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., tcib_managed=true, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, 
config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, container_name=ceilometer_agent_compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, org.opencontainers.image.created=2026-01-12T23:07:47Z, name=rhosp-rhel9/openstack-ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64)
Feb 01 08:55:42 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:55:42 np0005604215.localdomain systemd[1]: tmp-crun.zvWTnb.mount: Deactivated successfully.
Feb 01 08:55:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:55:47 np0005604215.localdomain podman[103696]: 2026-02-01 08:55:47.861032589 +0000 UTC m=+0.076587061 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, maintainer=OpenStack TripleO Team, vcs-type=git, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, architecture=x86_64, tcib_managed=true, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Feb 01 08:55:48 np0005604215.localdomain podman[103696]: 2026-02-01 08:55:48.268179811 +0000 UTC m=+0.483734263 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, release=1766032510, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, distribution-scope=public, version=17.1.13, com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, tcib_managed=true, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, architecture=x86_64)
Feb 01 08:55:48 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:55:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:55:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:55:50 np0005604215.localdomain podman[103719]: 2026-02-01 08:55:50.872038097 +0000 UTC m=+0.086325484 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:55:50 np0005604215.localdomain podman[103719]: 2026-02-01 08:55:50.887819409 +0000 UTC m=+0.102106836 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, io.buildah.version=1.41.5, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., com.redhat.component=openstack-neutron-metadata-agent-ovn-container, container_name=ovn_metadata_agent, url=https://www.redhat.com, architecture=x86_64, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:56:19Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, org.opencontainers.image.created=2026-01-12T22:56:19Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 08:55:50 np0005604215.localdomain podman[103719]: unhealthy
Feb 01 08:55:50 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:55:50 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:55:50 np0005604215.localdomain systemd[1]: tmp-crun.1rvglo.mount: Deactivated successfully.
Feb 01 08:55:50 np0005604215.localdomain podman[103720]: 2026-02-01 08:55:50.98176195 +0000 UTC m=+0.191586899 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, name=rhosp-rhel9/openstack-ovn-controller, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, vcs-type=git, build-date=2026-01-12T22:36:40Z, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, distribution-scope=public, container_name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, com.redhat.component=openstack-ovn-controller-container, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team)
Feb 01 08:55:51 np0005604215.localdomain podman[103720]: 2026-02-01 08:55:51.024816363 +0000 UTC m=+0.234641262 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, architecture=x86_64, io.buildah.version=1.41.5, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:55:51 np0005604215.localdomain podman[103720]: unhealthy
Feb 01 08:55:51 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:55:51 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:55:57 np0005604215.localdomain sshd[103759]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 08:55:57 np0005604215.localdomain sshd[103759]: Accepted publickey for zuul from 192.168.122.31 port 58848 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 08:55:57 np0005604215.localdomain systemd-logind[761]: New session 35 of user zuul.
Feb 01 08:55:57 np0005604215.localdomain systemd[1]: Started Session 35 of User zuul.
Feb 01 08:55:57 np0005604215.localdomain sshd[103759]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 08:55:58 np0005604215.localdomain sudo[103852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sduhilphmwifnxgqfokecirtvjeuciqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936157.6728942-23-110014756293067/AnsiballZ_stat.py
Feb 01 08:55:58 np0005604215.localdomain sudo[103852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:55:58 np0005604215.localdomain python3.9[103854]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 08:55:58 np0005604215.localdomain sudo[103852]: pam_unix(sudo:session): session closed for user root
Feb 01 08:55:59 np0005604215.localdomain sudo[103946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-emwbkqsohyroszuqcqydxtfypcvaacmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936158.727445-60-88201628735241/AnsiballZ_command.py
Feb 01 08:55:59 np0005604215.localdomain sudo[103946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:55:59 np0005604215.localdomain python3.9[103948]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/nova_libvirt/etc/nova/nova.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 08:55:59 np0005604215.localdomain sudo[103946]: pam_unix(sudo:session): session closed for user root
Feb 01 08:55:59 np0005604215.localdomain sudo[104039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbdxhfdwbzosxsglwkbczekzbwwlkclv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936159.6260295-83-246789849260842/AnsiballZ_stat.py
Feb 01 08:55:59 np0005604215.localdomain sudo[104039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:56:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63626 DF PROTO=TCP SPT=47630 DPT=9882 SEQ=2619242476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643C86C0000000001030307) 
Feb 01 08:56:00 np0005604215.localdomain python3.9[104041]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 08:56:00 np0005604215.localdomain sudo[104039]: pam_unix(sudo:session): session closed for user root
Feb 01 08:56:00 np0005604215.localdomain sudo[104133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsrilxxpwavyymtnrvnojvycmnsjqigu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936160.3224905-107-26997282642728/AnsiballZ_command.py
Feb 01 08:56:00 np0005604215.localdomain sudo[104133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:56:00 np0005604215.localdomain python3.9[104135]: ansible-ansible.legacy.command Invoked with cmd=python3 -c "import configparser as c; p = c.ConfigParser(strict=False); p.read('/var/lib/config-data/puppet-generated/neutron/etc/neutron/neutron.conf'); print(p['DEFAULT']['host'])"
                                                             _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 08:56:00 np0005604215.localdomain sudo[104133]: pam_unix(sudo:session): session closed for user root
Feb 01 08:56:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63627 DF PROTO=TCP SPT=47630 DPT=9882 SEQ=2619242476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643CC8E0000000001030307) 
Feb 01 08:56:01 np0005604215.localdomain sudo[104226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suwvdubipdzzxegvytzugsufwtnuvlab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936161.105795-135-211992406976216/AnsiballZ_command.py
Feb 01 08:56:01 np0005604215.localdomain sudo[104226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:56:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7398 DF PROTO=TCP SPT=59308 DPT=9105 SEQ=3528091407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643CE750000000001030307) 
Feb 01 08:56:01 np0005604215.localdomain python3.9[104228]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 08:56:01 np0005604215.localdomain sudo[104226]: pam_unix(sudo:session): session closed for user root
Feb 01 08:56:02 np0005604215.localdomain python3.9[104319]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 01 08:56:02 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7399 DF PROTO=TCP SPT=59308 DPT=9105 SEQ=3528091407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643D28D0000000001030307) 
Feb 01 08:56:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63628 DF PROTO=TCP SPT=47630 DPT=9882 SEQ=2619242476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643D48D0000000001030307) 
Feb 01 08:56:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48511 DF PROTO=TCP SPT=47574 DPT=9102 SEQ=1568214925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643D6000000000001030307) 
Feb 01 08:56:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:56:03 np0005604215.localdomain podman[104379]: 2026-02-01 08:56:03.873495939 +0000 UTC m=+0.080678028 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, managed_by=tripleo_ansible, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.expose-services=, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, build-date=2026-01-12T22:10:14Z, 
com.redhat.component=openstack-qdrouterd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, config_id=tripleo_step1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, container_name=metrics_qdr, io.buildah.version=1.41.5)
Feb 01 08:56:04 np0005604215.localdomain python3.9[104421]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 08:56:04 np0005604215.localdomain podman[104379]: 2026-02-01 08:56:04.095721142 +0000 UTC m=+0.302903251 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, url=https://www.redhat.com, name=rhosp-rhel9/openstack-qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.openshift.expose-services=, version=17.1.13, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, architecture=x86_64, container_name=metrics_qdr, com.redhat.component=openstack-qdrouterd-container, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, org.opencontainers.image.created=2026-01-12T22:10:14Z, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, release=1766032510)
Feb 01 08:56:04 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:56:04 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48512 DF PROTO=TCP SPT=47574 DPT=9102 SEQ=1568214925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643DA0E0000000001030307) 
Feb 01 08:56:04 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7400 DF PROTO=TCP SPT=59308 DPT=9105 SEQ=3528091407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643DA8D0000000001030307) 
Feb 01 08:56:04 np0005604215.localdomain python3.9[104532]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 01 08:56:05 np0005604215.localdomain python3.9[104622]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 08:56:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48513 DF PROTO=TCP SPT=47574 DPT=9102 SEQ=1568214925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643E20D0000000001030307) 
Feb 01 08:56:06 np0005604215.localdomain python3.9[104670]: ansible-ansible.legacy.dnf Invoked with name=['systemd-container'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 08:56:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31412 DF PROTO=TCP SPT=34918 DPT=9100 SEQ=1191802207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643E27B0000000001030307) 
Feb 01 08:56:07 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63629 DF PROTO=TCP SPT=47630 DPT=9882 SEQ=2619242476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643E44D0000000001030307) 
Feb 01 08:56:07 np0005604215.localdomain sshd[103759]: pam_unix(sshd:session): session closed for user zuul
Feb 01 08:56:07 np0005604215.localdomain systemd-logind[761]: Session 35 logged out. Waiting for processes to exit.
Feb 01 08:56:07 np0005604215.localdomain systemd[1]: session-35.scope: Deactivated successfully.
Feb 01 08:56:07 np0005604215.localdomain systemd[1]: session-35.scope: Consumed 4.728s CPU time.
Feb 01 08:56:07 np0005604215.localdomain systemd-logind[761]: Removed session 35.
Feb 01 08:56:07 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31413 DF PROTO=TCP SPT=34918 DPT=9100 SEQ=1191802207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643E68D0000000001030307) 
Feb 01 08:56:08 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7401 DF PROTO=TCP SPT=59308 DPT=9105 SEQ=3528091407 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643EA4E0000000001030307) 
Feb 01 08:56:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31414 DF PROTO=TCP SPT=34918 DPT=9100 SEQ=1191802207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643EE8D0000000001030307) 
Feb 01 08:56:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54381 DF PROTO=TCP SPT=50200 DPT=9101 SEQ=1738375931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643EF9F0000000001030307) 
Feb 01 08:56:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48514 DF PROTO=TCP SPT=47574 DPT=9102 SEQ=1568214925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643F1CD0000000001030307) 
Feb 01 08:56:11 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54382 DF PROTO=TCP SPT=50200 DPT=9101 SEQ=1738375931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643F38D0000000001030307) 
Feb 01 08:56:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:56:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:56:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:56:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:56:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:56:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:56:12 np0005604215.localdomain systemd[1]: tmp-crun.9S1I6G.mount: Deactivated successfully.
Feb 01 08:56:12 np0005604215.localdomain podman[104686]: 2026-02-01 08:56:12.884742105 +0000 UTC m=+0.097746651 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, batch=17.1_20260112.1, url=https://www.redhat.com, io.buildah.version=1.41.5, managed_by=tripleo_ansible, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
com.redhat.component=openstack-cron-container, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, config_id=tripleo_step4, vendor=Red Hat, Inc., distribution-scope=public, build-date=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 cron, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, vcs-type=git, name=rhosp-rhel9/openstack-cron)
Feb 01 08:56:12 np0005604215.localdomain podman[104687]: 2026-02-01 08:56:12.900038732 +0000 UTC m=+0.106131202 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=healthy, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, build-date=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, vcs-type=git, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1766032510, managed_by=tripleo_ansible, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.buildah.version=1.41.5, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, distribution-scope=public)
Feb 01 08:56:12 np0005604215.localdomain podman[104687]: 2026-02-01 08:56:12.916704492 +0000 UTC m=+0.122796922 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:32:04Z, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, architecture=x86_64, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, distribution-scope=public, vcs-type=git)
Feb 01 08:56:12 np0005604215.localdomain podman[104687]: unhealthy
Feb 01 08:56:12 np0005604215.localdomain podman[104686]: 2026-02-01 08:56:12.922661678 +0000 UTC m=+0.135666254 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-cron-container, version=17.1.13, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, container_name=logrotate_crond, release=1766032510, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.buildah.version=1.41.5)
Feb 01 08:56:12 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:56:12 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:56:12 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:56:12 np0005604215.localdomain podman[104695]: 2026-02-01 08:56:12.996959656 +0000 UTC m=+0.191075902 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=healthy, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, release=1766032510, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ceilometer_agent_ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, distribution-scope=public, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, config_id=tripleo_step4, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, batch=17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T23:07:30Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06)
Feb 01 08:56:13 np0005604215.localdomain podman[104694]: 2026-02-01 08:56:13.041620739 +0000 UTC m=+0.243262680 container health_status 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:07:47Z, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, container_name=ceilometer_agent_compute, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, url=https://www.redhat.com, distribution-scope=public, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, release=1766032510, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z)
Feb 01 08:56:13 np0005604215.localdomain podman[104695]: 2026-02-01 08:56:13.0438801 +0000 UTC m=+0.237996366 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, vcs-type=git, name=rhosp-rhel9/openstack-ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, container_name=ceilometer_agent_ipmi, version=17.1.13, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, release=1766032510, distribution-scope=public)
Feb 01 08:56:13 np0005604215.localdomain podman[104705]: 2026-02-01 08:56:13.052472878 +0000 UTC m=+0.244644183 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=collectd, com.redhat.component=openstack-collectd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step3, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, batch=17.1_20260112.1, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:56:13 np0005604215.localdomain podman[104705]: 2026-02-01 08:56:13.06312992 +0000 UTC m=+0.255301255 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, io.buildah.version=1.41.5, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', 
'/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, summary=Red Hat OpenStack Platform 17.1 collectd, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, release=1766032510, container_name=collectd, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_id=tripleo_step3)
Feb 01 08:56:13 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:56:13 np0005604215.localdomain podman[104695]: unhealthy
Feb 01 08:56:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54383 DF PROTO=TCP SPT=50200 DPT=9101 SEQ=1738375931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643FB8D0000000001030307) 
Feb 01 08:56:13 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:56:13 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Failed with result 'exit-code'.
Feb 01 08:56:13 np0005604215.localdomain podman[104694]: 2026-02-01 08:56:13.121584374 +0000 UTC m=+0.323226335 container exec_died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, architecture=x86_64, distribution-scope=public, batch=17.1_20260112.1, config_id=tripleo_step4, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, container_name=ceilometer_agent_compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, release=1766032510, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, 
vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, io.buildah.version=1.41.5, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-ceilometer-compute, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z)
Feb 01 08:56:13 np0005604215.localdomain podman[104688]: 2026-02-01 08:56:13.151025852 +0000 UTC m=+0.355206122 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-iscsid, io.openshift.expose-services=, vcs-ref=705339545363fec600102567c4e923938e0f43b3, summary=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, 
url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.buildah.version=1.41.5, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=iscsid)
Feb 01 08:56:13 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Deactivated successfully.
Feb 01 08:56:13 np0005604215.localdomain podman[104688]: 2026-02-01 08:56:13.190554826 +0000 UTC m=+0.394735106 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:34:43Z, container_name=iscsid, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.component=openstack-iscsid-container, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, 
vcs-ref=705339545363fec600102567c4e923938e0f43b3, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, config_id=tripleo_step3, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, release=1766032510, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:56:13 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:56:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31415 DF PROTO=TCP SPT=34918 DPT=9100 SEQ=1191802207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA643FE4D0000000001030307) 
Feb 01 08:56:13 np0005604215.localdomain systemd[1]: tmp-crun.xSNd8v.mount: Deactivated successfully.
Feb 01 08:56:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63630 DF PROTO=TCP SPT=47630 DPT=9882 SEQ=2619242476 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644050E0000000001030307) 
Feb 01 08:56:17 np0005604215.localdomain sudo[104813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:56:17 np0005604215.localdomain sudo[104813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:56:17 np0005604215.localdomain sudo[104813]: pam_unix(sudo:session): session closed for user root
Feb 01 08:56:17 np0005604215.localdomain sudo[104828]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:56:17 np0005604215.localdomain sudo[104828]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:56:18 np0005604215.localdomain sudo[104828]: pam_unix(sudo:session): session closed for user root
Feb 01 08:56:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:56:18 np0005604215.localdomain sudo[104876]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:56:18 np0005604215.localdomain sudo[104876]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:56:18 np0005604215.localdomain sudo[104876]: pam_unix(sudo:session): session closed for user root
Feb 01 08:56:18 np0005604215.localdomain podman[104875]: 2026-02-01 08:56:18.861000853 +0000 UTC m=+0.074806245 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, tcib_managed=true, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc., managed_by=tripleo_ansible, version=17.1.13, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, config_id=tripleo_step4, name=rhosp-rhel9/openstack-nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:56:19 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48515 DF PROTO=TCP SPT=47574 DPT=9102 SEQ=1568214925 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644130D0000000001030307) 
Feb 01 08:56:19 np0005604215.localdomain podman[104875]: 2026-02-01 08:56:19.220110697 +0000 UTC m=+0.433916119 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, tcib_managed=true, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, batch=17.1_20260112.1, container_name=nova_migration_target, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:56:19 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:56:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:56:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:56:21 np0005604215.localdomain systemd[1]: tmp-crun.4Fi5UU.mount: Deactivated successfully.
Feb 01 08:56:21 np0005604215.localdomain podman[104915]: 2026-02-01 08:56:21.862060631 +0000 UTC m=+0.077683745 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, batch=17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ovn-controller-container, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', 
'/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, build-date=2026-01-12T22:36:40Z)
Feb 01 08:56:21 np0005604215.localdomain podman[104915]: 2026-02-01 08:56:21.880784535 +0000 UTC m=+0.096407679 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, vendor=Red Hat, Inc., com.redhat.component=openstack-ovn-controller-container, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.13, distribution-scope=public, vcs-type=git, config_id=tripleo_step4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.expose-services=, managed_by=tripleo_ansible, 
konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, tcib_managed=true)
Feb 01 08:56:21 np0005604215.localdomain podman[104915]: unhealthy
Feb 01 08:56:21 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:56:21 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:56:21 np0005604215.localdomain systemd[1]: tmp-crun.ejFTCy.mount: Deactivated successfully.
Feb 01 08:56:21 np0005604215.localdomain podman[104914]: 2026-02-01 08:56:21.971278928 +0000 UTC m=+0.185306632 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, container_name=ovn_metadata_agent, url=https://www.redhat.com, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', 
'/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, version=17.1.13, release=1766032510, architecture=x86_64, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:56:21 np0005604215.localdomain podman[104914]: 2026-02-01 08:56:21.991616852 +0000 UTC m=+0.205644536 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, maintainer=OpenStack TripleO Team, vcs-type=git, config_id=tripleo_step4, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., io.openshift.expose-services=, managed_by=tripleo_ansible, version=17.1.13)
Feb 01 08:56:21 np0005604215.localdomain podman[104914]: unhealthy
Feb 01 08:56:22 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:56:22 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:56:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31416 DF PROTO=TCP SPT=34918 DPT=9100 SEQ=1191802207 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6441F0D0000000001030307) 
Feb 01 08:56:22 np0005604215.localdomain sshd[104956]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 08:56:22 np0005604215.localdomain sshd[104956]: Accepted publickey for zuul from 192.168.122.31 port 47500 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 08:56:22 np0005604215.localdomain systemd-logind[761]: New session 36 of user zuul.
Feb 01 08:56:22 np0005604215.localdomain systemd[1]: Started Session 36 of User zuul.
Feb 01 08:56:22 np0005604215.localdomain sshd[104956]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 08:56:23 np0005604215.localdomain sudo[105049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbwcbihfpltlrljvftsjosznuvhrqdag ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936182.5431092-20-7444861644719/AnsiballZ_systemd_service.py
Feb 01 08:56:23 np0005604215.localdomain sudo[105049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:56:23 np0005604215.localdomain python3.9[105051]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 08:56:23 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:56:23 np0005604215.localdomain systemd-sysv-generator[105078]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:56:23 np0005604215.localdomain systemd-rc-local-generator[105072]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:56:23 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:56:23 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:56:23 np0005604215.localdomain sudo[105049]: pam_unix(sudo:session): session closed for user root
Feb 01 08:56:23 np0005604215.localdomain recover_tripleo_nova_virtqemud[105089]: 62016
Feb 01 08:56:23 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:56:23 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:56:24 np0005604215.localdomain python3.9[105179]: ansible-ansible.builtin.service_facts Invoked
Feb 01 08:56:24 np0005604215.localdomain network[105196]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 08:56:24 np0005604215.localdomain network[105197]: 'network-scripts' will be removed from distribution in near future.
Feb 01 08:56:24 np0005604215.localdomain network[105198]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 08:56:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54385 DF PROTO=TCP SPT=50200 DPT=9101 SEQ=1738375931 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6442B0D0000000001030307) 
Feb 01 08:56:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:56:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20731 DF PROTO=TCP SPT=45992 DPT=9882 SEQ=1838796460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6443D9D0000000001030307) 
Feb 01 08:56:30 np0005604215.localdomain python3.9[105397]: ansible-ansible.builtin.service_facts Invoked
Feb 01 08:56:30 np0005604215.localdomain network[105414]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 08:56:30 np0005604215.localdomain network[105415]: 'network-scripts' will be removed from distribution in near future.
Feb 01 08:56:30 np0005604215.localdomain network[105416]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 08:56:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20732 DF PROTO=TCP SPT=45992 DPT=9882 SEQ=1838796460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644418E0000000001030307) 
Feb 01 08:56:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:56:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20733 DF PROTO=TCP SPT=45992 DPT=9882 SEQ=1838796460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644498D0000000001030307) 
Feb 01 08:56:34 np0005604215.localdomain sudo[105614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fumtutmzszmcmagkcjvvairtsruhstns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936193.7533696-111-137723213423900/AnsiballZ_systemd_service.py
Feb 01 08:56:34 np0005604215.localdomain sudo[105614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:56:34 np0005604215.localdomain python3.9[105616]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:56:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:56:34 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:56:34 np0005604215.localdomain podman[105618]: 2026-02-01 08:56:34.456346261 +0000 UTC m=+0.114473202 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, io.buildah.version=1.41.5, distribution-scope=public, container_name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, tcib_managed=true, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, build-date=2026-01-12T22:10:14Z, summary=Red Hat OpenStack Platform 17.1 qdrouterd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, com.redhat.component=openstack-qdrouterd-container, vcs-type=git, vendor=Red Hat, Inc., config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, release=1766032510, config_id=tripleo_step1, batch=17.1_20260112.1, managed_by=tripleo_ansible)
Feb 01 08:56:34 np0005604215.localdomain systemd-sysv-generator[105667]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:56:34 np0005604215.localdomain systemd-rc-local-generator[105663]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:56:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:56:34 np0005604215.localdomain podman[105618]: 2026-02-01 08:56:34.62941627 +0000 UTC m=+0.287543141 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, maintainer=OpenStack TripleO Team, architecture=x86_64, vendor=Red Hat, Inc., version=17.1.13, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, distribution-scope=public, config_id=tripleo_step1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-qdrouterd-container, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:56:34 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:56:34 np0005604215.localdomain systemd[1]: Stopping ceilometer_agent_compute container...
Feb 01 08:56:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17861 DF PROTO=TCP SPT=45846 DPT=9102 SEQ=885822400 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644574D0000000001030307) 
Feb 01 08:56:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48224 DF PROTO=TCP SPT=59532 DPT=9100 SEQ=2392031059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64463CD0000000001030307) 
Feb 01 08:56:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40750 DF PROTO=TCP SPT=49748 DPT=9101 SEQ=534350193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64470CD0000000001030307) 
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:56:43 np0005604215.localdomain podman[105699]: 2026-02-01 08:56:43.391013847 +0000 UTC m=+0.100013321 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-cron, config_id=tripleo_step4, version=17.1.13, com.redhat.component=openstack-cron-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:15Z, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-type=git, container_name=logrotate_crond, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, batch=17.1_20260112.1, url=https://www.redhat.com, release=1766032510)
Feb 01 08:56:43 np0005604215.localdomain podman[105699]: 2026-02-01 08:56:43.397394826 +0000 UTC m=+0.106394230 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=logrotate_crond, release=1766032510, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, 
org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-cron, com.redhat.component=openstack-cron-container, description=Red Hat OpenStack Platform 17.1 cron, batch=17.1_20260112.1, version=17.1.13, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, architecture=x86_64)
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:56:43 np0005604215.localdomain podman[105701]: 2026-02-01 08:56:43.437525729 +0000 UTC m=+0.141624180 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, release=1766032510, config_id=tripleo_step3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, 
konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, version=17.1.13, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., container_name=iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com)
Feb 01 08:56:43 np0005604215.localdomain podman[105701]: 2026-02-01 08:56:43.44301411 +0000 UTC m=+0.147112531 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.openshift.expose-services=, release=1766032510, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=iscsid, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., build-date=2026-01-12T22:34:43Z, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid)
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:56:43 np0005604215.localdomain podman[105721]: 2026-02-01 08:56:43.493196276 +0000 UTC m=+0.187520132 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, io.openshift.expose-services=, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, config_id=tripleo_step3, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 collectd, summary=Red Hat OpenStack Platform 17.1 collectd, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, container_name=collectd, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, name=rhosp-rhel9/openstack-collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, distribution-scope=public, release=1766032510, com.redhat.component=openstack-collectd-container, vcs-type=git, version=17.1.13, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc.)
Feb 01 08:56:43 np0005604215.localdomain podman[105721]: 2026-02-01 08:56:43.501430312 +0000 UTC m=+0.195754218 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, build-date=2026-01-12T22:10:15Z, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, description=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', 
'/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, batch=17.1_20260112.1, config_id=tripleo_step3, tcib_managed=true, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, container_name=collectd, managed_by=tripleo_ansible, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd)
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:56:43 np0005604215.localdomain podman[105702]: Error: container 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 is not running
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Main process exited, code=exited, status=125/n/a
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Failed with result 'exit-code'.
Feb 01 08:56:43 np0005604215.localdomain podman[105700]: 2026-02-01 08:56:43.539069056 +0000 UTC m=+0.245423688 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step5, managed_by=tripleo_ansible, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', 
'/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, vcs-type=git, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.buildah.version=1.41.5, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510)
Feb 01 08:56:43 np0005604215.localdomain podman[105700]: 2026-02-01 08:56:43.558319687 +0000 UTC m=+0.264674329 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', 
'/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, container_name=nova_compute, io.buildah.version=1.41.5, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team)
Feb 01 08:56:43 np0005604215.localdomain podman[105700]: unhealthy
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:56:43 np0005604215.localdomain podman[105708]: 2026-02-01 08:56:43.602070861 +0000 UTC m=+0.299056400 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vcs-type=git, container_name=ceilometer_agent_ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, vendor=Red Hat, Inc., tcib_managed=true, build-date=2026-01-12T23:07:30Z, version=17.1.13, config_id=tripleo_step4, io.buildah.version=1.41.5, url=https://www.redhat.com, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:56:43 np0005604215.localdomain podman[105708]: 2026-02-01 08:56:43.628018032 +0000 UTC m=+0.325003571 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, distribution-scope=public, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, config_id=tripleo_step4, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.openshift.tags=rhosp 
osp openstack osp-17.1 openstack-ceilometer-ipmi, version=17.1.13, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:07:30Z, com.redhat.component=openstack-ceilometer-ipmi-container, io.openshift.expose-services=, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ceilometer_agent_ipmi, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:56:43 np0005604215.localdomain podman[105708]: unhealthy
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:56:43 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Failed with result 'exit-code'.
Feb 01 08:56:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20735 DF PROTO=TCP SPT=45992 DPT=9882 SEQ=1838796460 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644790E0000000001030307) 
Feb 01 08:56:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17863 DF PROTO=TCP SPT=45846 DPT=9102 SEQ=885822400 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644870D0000000001030307) 
Feb 01 08:56:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:56:49 np0005604215.localdomain systemd[1]: tmp-crun.8PCscJ.mount: Deactivated successfully.
Feb 01 08:56:49 np0005604215.localdomain podman[105822]: 2026-02-01 08:56:49.625517344 +0000 UTC m=+0.094442778 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, release=1766032510, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, 
io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, io.openshift.expose-services=, architecture=x86_64, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, build-date=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, config_id=tripleo_step4)
Feb 01 08:56:49 np0005604215.localdomain podman[105822]: 2026-02-01 08:56:49.989024514 +0000 UTC m=+0.457949958 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, org.opencontainers.image.created=2026-01-12T23:32:04Z, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, description=Red Hat 
OpenStack Platform 17.1 nova-compute, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, url=https://www.redhat.com, com.redhat.component=openstack-nova-compute-container, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.buildah.version=1.41.5, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., container_name=nova_migration_target, tcib_managed=true)
Feb 01 08:56:50 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:56:51 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48226 DF PROTO=TCP SPT=59532 DPT=9100 SEQ=2392031059 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644930E0000000001030307) 
Feb 01 08:56:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:56:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:56:52 np0005604215.localdomain podman[105845]: 2026-02-01 08:56:52.372115353 +0000 UTC m=+0.087079368 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, build-date=2026-01-12T22:56:19Z, version=17.1.13, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, batch=17.1_20260112.1, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 01 08:56:52 np0005604215.localdomain podman[105846]: 2026-02-01 08:56:52.427655866 +0000 UTC m=+0.138861234 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, tcib_managed=true, com.redhat.component=openstack-ovn-controller-container, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, maintainer=OpenStack TripleO Team, version=17.1.13, vendor=Red Hat, Inc., release=1766032510, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
config_id=tripleo_step4, distribution-scope=public, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z)
Feb 01 08:56:52 np0005604215.localdomain podman[105845]: 2026-02-01 08:56:52.441475957 +0000 UTC m=+0.156440032 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-type=git, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, release=1766032510, version=17.1.13, architecture=x86_64, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', 
'/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true)
Feb 01 08:56:52 np0005604215.localdomain podman[105845]: unhealthy
Feb 01 08:56:52 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:56:52 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:56:52 np0005604215.localdomain podman[105846]: 2026-02-01 08:56:52.466930791 +0000 UTC m=+0.178136119 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, batch=17.1_20260112.1, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, url=https://www.redhat.com, tcib_managed=true, vcs-type=git)
Feb 01 08:56:52 np0005604215.localdomain podman[105846]: unhealthy
Feb 01 08:56:52 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:56:52 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:56:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40752 DF PROTO=TCP SPT=49748 DPT=9101 SEQ=534350193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644A10E0000000001030307) 
Feb 01 08:57:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40398 DF PROTO=TCP SPT=47730 DPT=9882 SEQ=2753011288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644B2CC0000000001030307) 
Feb 01 08:57:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40399 DF PROTO=TCP SPT=47730 DPT=9882 SEQ=2753011288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644B6CD0000000001030307) 
Feb 01 08:57:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40400 DF PROTO=TCP SPT=47730 DPT=9882 SEQ=2753011288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644BECD0000000001030307) 
Feb 01 08:57:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:57:04 np0005604215.localdomain podman[105882]: 2026-02-01 08:57:04.854436783 +0000 UTC m=+0.075296881 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, container_name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, 
vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, maintainer=OpenStack TripleO Team, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:57:05 np0005604215.localdomain podman[105882]: 2026-02-01 08:57:05.070565605 +0000 UTC m=+0.291425703 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, name=rhosp-rhel9/openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, distribution-scope=public, url=https://www.redhat.com, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, tcib_managed=true, io.openshift.expose-services=, build-date=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step1, io.buildah.version=1.41.5, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, batch=17.1_20260112.1)
Feb 01 08:57:05 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:57:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44258 DF PROTO=TCP SPT=32870 DPT=9102 SEQ=3247779277 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644CC8D0000000001030307) 
Feb 01 08:57:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22093 DF PROTO=TCP SPT=39610 DPT=9100 SEQ=778640740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644D8CD0000000001030307) 
Feb 01 08:57:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17493 DF PROTO=TCP SPT=59604 DPT=9101 SEQ=2326194972 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644E60E0000000001030307) 
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:57:13 np0005604215.localdomain podman[105911]: 2026-02-01 08:57:13.621006426 +0000 UTC m=+0.082650449 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, container_name=logrotate_crond, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, vcs-type=git, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 
cron, managed_by=tripleo_ansible, release=1766032510, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.openshift.expose-services=)
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:57:13 np0005604215.localdomain podman[105911]: 2026-02-01 08:57:13.639482573 +0000 UTC m=+0.101126546 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, managed_by=tripleo_ansible, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, name=rhosp-rhel9/openstack-cron, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, tcib_managed=true, io.openshift.expose-services=, vcs-type=git, architecture=x86_64)
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: tmp-crun.nQEEt4.mount: Deactivated successfully.
Feb 01 08:57:13 np0005604215.localdomain podman[105913]: 2026-02-01 08:57:13.701954502 +0000 UTC m=+0.154937815 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, version=17.1.13, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, build-date=2026-01-12T22:10:15Z, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, release=1766032510, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=collectd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, config_id=tripleo_step3, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:57:13 np0005604215.localdomain podman[105954]: Error: container 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 is not running
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Main process exited, code=exited, status=125/n/a
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Failed with result 'exit-code'.
Feb 01 08:57:13 np0005604215.localdomain podman[105913]: 2026-02-01 08:57:13.785743346 +0000 UTC m=+0.238726689 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 collectd, org.opencontainers.image.created=2026-01-12T22:10:15Z, url=https://www.redhat.com, config_id=tripleo_step3, container_name=collectd, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, vendor=Red Hat, Inc., distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, batch=17.1_20260112.1, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, managed_by=tripleo_ansible, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-collectd)
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:57:13 np0005604215.localdomain podman[105953]: 2026-02-01 08:57:13.838279145 +0000 UTC m=+0.193659832 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, maintainer=OpenStack TripleO Team, build-date=2026-01-12T23:32:04Z, description=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, container_name=nova_compute, config_id=tripleo_step5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:32:04Z, managed_by=tripleo_ansible, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public)
Feb 01 08:57:13 np0005604215.localdomain podman[105965]: 2026-02-01 08:57:13.767128245 +0000 UTC m=+0.089091401 container health_status 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, health_status=unhealthy, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, vcs-type=git, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, release=1766032510, url=https://www.redhat.com, container_name=ceilometer_agent_ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, managed_by=tripleo_ansible, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:07:30Z)
Feb 01 08:57:13 np0005604215.localdomain podman[105912]: 2026-02-01 08:57:13.739553004 +0000 UTC m=+0.196709757 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T22:34:43Z, container_name=iscsid, version=17.1.13, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, release=1766032510, maintainer=OpenStack TripleO Team, vcs-type=git, vcs-ref=705339545363fec600102567c4e923938e0f43b3, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-iscsid-container, summary=Red Hat OpenStack Platform 17.1 iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, io.buildah.version=1.41.5)
Feb 01 08:57:13 np0005604215.localdomain podman[105953]: 2026-02-01 08:57:13.897033348 +0000 UTC m=+0.252414005 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, name=rhosp-rhel9/openstack-nova-compute, com.redhat.component=openstack-nova-compute-container, description=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step5, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, container_name=nova_compute, managed_by=tripleo_ansible, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:57:13 np0005604215.localdomain podman[105953]: unhealthy
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:57:13 np0005604215.localdomain podman[105912]: 2026-02-01 08:57:13.923761362 +0000 UTC m=+0.380918085 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, vendor=Red Hat, Inc., container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, com.redhat.component=openstack-iscsid-container, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, managed_by=tripleo_ansible, vcs-ref=705339545363fec600102567c4e923938e0f43b3, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, release=1766032510, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, architecture=x86_64, version=17.1.13, build-date=2026-01-12T22:34:43Z, batch=17.1_20260112.1, tcib_managed=true, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:34:43Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, distribution-scope=public)
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:57:13 np0005604215.localdomain podman[105965]: 2026-02-01 08:57:13.948002508 +0000 UTC m=+0.269965704 container exec_died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, release=1766032510, managed_by=tripleo_ansible, vcs-type=git, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, maintainer=OpenStack TripleO Team, config_id=tripleo_step4, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:07:30Z, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi)
Feb 01 08:57:13 np0005604215.localdomain podman[105965]: unhealthy
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:57:13 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Failed with result 'exit-code'.
Feb 01 08:57:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40402 DF PROTO=TCP SPT=47730 DPT=9882 SEQ=2753011288 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA644EF0D0000000001030307) 
Feb 01 08:57:16 np0005604215.localdomain podman[105685]: time="2026-02-01T08:57:16Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_compute in 42 seconds, resorting to SIGKILL"
Feb 01 08:57:16 np0005604215.localdomain systemd[1]: libpod-35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.scope: Deactivated successfully.
Feb 01 08:57:16 np0005604215.localdomain systemd[1]: libpod-35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.scope: Consumed 5.611s CPU time.
Feb 01 08:57:16 np0005604215.localdomain podman[105685]: 2026-02-01 08:57:16.815042265 +0000 UTC m=+42.092407497 container stop 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, release=1766032510, container_name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, 
com.redhat.component=openstack-ceilometer-compute-container, managed_by=tripleo_ansible, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:47Z, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:57:16 np0005604215.localdomain podman[105685]: 2026-02-01 08:57:16.843921566 +0000 UTC m=+42.121286868 container died 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ceilometer-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.expose-services=, version=17.1.13, vendor=Red Hat, Inc., url=https://www.redhat.com, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, container_name=ceilometer_agent_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-type=git, release=1766032510, managed_by=tripleo_ansible, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute)
Feb 01 08:57:16 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.timer: Deactivated successfully.
Feb 01 08:57:16 np0005604215.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.
Feb 01 08:57:16 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Failed to open /run/systemd/transient/35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: No such file or directory
Feb 01 08:57:16 np0005604215.localdomain systemd[1]: tmp-crun.Sws7Xe.mount: Deactivated successfully.
Feb 01 08:57:16 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9-userdata-shm.mount: Deactivated successfully.
Feb 01 08:57:16 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e8a1138cbb1c83236f4de65652beadb5bc0b1f3b8c525083bd1db3fda89ebbe0-merged.mount: Deactivated successfully.
Feb 01 08:57:16 np0005604215.localdomain podman[105685]: 2026-02-01 08:57:16.955821227 +0000 UTC m=+42.233186409 container cleanup 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, version=17.1.13, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, url=https://www.redhat.com, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.expose-services=, batch=17.1_20260112.1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ceilometer_agent_compute, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, org.opencontainers.image.created=2026-01-12T23:07:47Z, managed_by=tripleo_ansible, vendor=Red Hat, Inc., com.redhat.component=openstack-ceilometer-compute-container, tcib_managed=true)
Feb 01 08:57:16 np0005604215.localdomain podman[105685]: ceilometer_agent_compute
Feb 01 08:57:16 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.timer: Failed to open /run/systemd/transient/35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.timer: No such file or directory
Feb 01 08:57:16 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Failed to open /run/systemd/transient/35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: No such file or directory
Feb 01 08:57:16 np0005604215.localdomain podman[106030]: 2026-02-01 08:57:16.973625732 +0000 UTC m=+0.140945588 container cleanup 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, build-date=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, io.buildah.version=1.41.5, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, config_id=tripleo_step4, managed_by=tripleo_ansible, distribution-scope=public, container_name=ceilometer_agent_compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, release=1766032510, name=rhosp-rhel9/openstack-ceilometer-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, version=17.1.13, com.redhat.component=openstack-ceilometer-compute-container, vendor=Red Hat, Inc.)
Feb 01 08:57:16 np0005604215.localdomain systemd[1]: libpod-conmon-35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.scope: Deactivated successfully.
Feb 01 08:57:17 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.timer: Failed to open /run/systemd/transient/35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.timer: No such file or directory
Feb 01 08:57:17 np0005604215.localdomain systemd[1]: 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: Failed to open /run/systemd/transient/35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9.service: No such file or directory
Feb 01 08:57:17 np0005604215.localdomain podman[106044]: 2026-02-01 08:57:17.072244449 +0000 UTC m=+0.066043812 container cleanup 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, batch=17.1_20260112.1, distribution-scope=public, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, version=17.1.13, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-compute-container, config_id=tripleo_step4, managed_by=tripleo_ansible, vcs-type=git, release=1766032510, url=https://www.redhat.com, container_name=ceilometer_agent_compute, name=rhosp-rhel9/openstack-ceilometer-compute, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:07:47Z, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, architecture=x86_64, build-date=2026-01-12T23:07:47Z, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, io.openshift.expose-services=)
Feb 01 08:57:17 np0005604215.localdomain podman[106044]: ceilometer_agent_compute
Feb 01 08:57:17 np0005604215.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Deactivated successfully.
Feb 01 08:57:17 np0005604215.localdomain systemd[1]: Stopped ceilometer_agent_compute container.
Feb 01 08:57:17 np0005604215.localdomain systemd[1]: tripleo_ceilometer_agent_compute.service: Consumed 1.068s CPU time, no IO.
Feb 01 08:57:17 np0005604215.localdomain sudo[105614]: pam_unix(sudo:session): session closed for user root
Feb 01 08:57:17 np0005604215.localdomain sudo[106146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuoictxofauokstbihrwpflurplzomch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936237.281391-111-220842164539713/AnsiballZ_systemd_service.py
Feb 01 08:57:17 np0005604215.localdomain sudo[106146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:57:17 np0005604215.localdomain python3.9[106148]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_ipmi.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:57:17 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:57:17 np0005604215.localdomain systemd-rc-local-generator[106172]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:57:17 np0005604215.localdomain systemd-sysv-generator[106176]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:57:18 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:57:18 np0005604215.localdomain systemd[1]: Stopping ceilometer_agent_ipmi container...
Feb 01 08:57:18 np0005604215.localdomain systemd[1]: tmp-crun.XEAlK8.mount: Deactivated successfully.
Feb 01 08:57:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:a1:06:ee MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=46032 SEQ=464998252 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Feb 01 08:57:18 np0005604215.localdomain sudo[106202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:57:18 np0005604215.localdomain sudo[106202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:57:18 np0005604215.localdomain sudo[106202]: pam_unix(sudo:session): session closed for user root
Feb 01 08:57:19 np0005604215.localdomain sudo[106217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:57:19 np0005604215.localdomain sudo[106217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:57:19 np0005604215.localdomain sudo[106217]: pam_unix(sudo:session): session closed for user root
Feb 01 08:57:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:57:20 np0005604215.localdomain systemd[1]: tmp-crun.Kkj144.mount: Deactivated successfully.
Feb 01 08:57:20 np0005604215.localdomain podman[106263]: 2026-02-01 08:57:20.37371711 +0000 UTC m=+0.087368688 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, distribution-scope=public, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, architecture=x86_64, url=https://www.redhat.com, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, io.openshift.expose-services=, build-date=2026-01-12T23:32:04Z, name=rhosp-rhel9/openstack-nova-compute, version=17.1.13, config_id=tripleo_step4, container_name=nova_migration_target, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 08:57:20 np0005604215.localdomain sudo[106286]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:57:20 np0005604215.localdomain sudo[106286]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:57:20 np0005604215.localdomain sudo[106286]: pam_unix(sudo:session): session closed for user root
Feb 01 08:57:20 np0005604215.localdomain podman[106263]: 2026-02-01 08:57:20.742906347 +0000 UTC m=+0.456557935 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, url=https://www.redhat.com, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-compute, 
com.redhat.component=openstack-nova-compute-container, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, vcs-type=git, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, vendor=Red Hat, Inc.)
Feb 01 08:57:20 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:57:21 np0005604215.localdomain sshd[106302]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 08:57:21 np0005604215.localdomain sshd[106302]: Invalid user frappeuser from 85.206.171.113 port 49370
Feb 01 08:57:22 np0005604215.localdomain sshd[106302]: Received disconnect from 85.206.171.113 port 49370:11: Bye Bye [preauth]
Feb 01 08:57:22 np0005604215.localdomain sshd[106302]: Disconnected from invalid user frappeuser 85.206.171.113 port 49370 [preauth]
Feb 01 08:57:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22095 DF PROTO=TCP SPT=39610 DPT=9100 SEQ=778640740 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645090E0000000001030307) 
Feb 01 08:57:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:57:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:57:22 np0005604215.localdomain systemd[1]: tmp-crun.64nsJU.mount: Deactivated successfully.
Feb 01 08:57:22 np0005604215.localdomain podman[106304]: 2026-02-01 08:57:22.617434429 +0000 UTC m=+0.078560052 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, build-date=2026-01-12T22:56:19Z, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, managed_by=tripleo_ansible, distribution-scope=public, io.openshift.expose-services=, config_id=tripleo_step4, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com)
Feb 01 08:57:22 np0005604215.localdomain podman[106304]: 2026-02-01 08:57:22.632649745 +0000 UTC m=+0.093775368 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, 
org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, version=17.1.13, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, distribution-scope=public, vcs-type=git, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, url=https://www.redhat.com, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, container_name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.created=2026-01-12T22:56:19Z, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 01 08:57:22 np0005604215.localdomain podman[106304]: unhealthy
Feb 01 08:57:22 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:57:22 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:57:22 np0005604215.localdomain podman[106305]: 2026-02-01 08:57:22.718810233 +0000 UTC m=+0.174808956 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_id=tripleo_step4, release=1766032510, build-date=2026-01-12T22:36:40Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, tcib_managed=true, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, url=https://www.redhat.com, com.redhat.component=openstack-ovn-controller-container, name=rhosp-rhel9/openstack-ovn-controller, vendor=Red Hat, Inc.)
Feb 01 08:57:22 np0005604215.localdomain podman[106305]: 2026-02-01 08:57:22.759085449 +0000 UTC m=+0.215084162 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, com.redhat.component=openstack-ovn-controller-container, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vcs-type=git, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, version=17.1.13, architecture=x86_64, io.openshift.expose-services=, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:57:22 np0005604215.localdomain podman[106305]: unhealthy
Feb 01 08:57:22 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:57:22 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:57:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17495 DF PROTO=TCP SPT=59604 DPT=9101 SEQ=2326194972 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645170D0000000001030307) 
Feb 01 08:57:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22113 DF PROTO=TCP SPT=51962 DPT=9882 SEQ=2863784275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64527FC0000000001030307) 
Feb 01 08:57:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:a1:06:ee MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.110 DST=192.168.122.108 LEN=40 TOS=0x00 PREC=0x00 TTL=64 ID=0 DF PROTO=TCP SPT=6379 DPT=46032 SEQ=464998252 ACK=0 WINDOW=0 RES=0x00 RST URGP=0 
Feb 01 08:57:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22115 DF PROTO=TCP SPT=51962 DPT=9882 SEQ=2863784275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645340D0000000001030307) 
Feb 01 08:57:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:57:35 np0005604215.localdomain podman[106344]: 2026-02-01 08:57:35.363779333 +0000 UTC m=+0.080249954 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, version=17.1.13, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, url=https://www.redhat.com, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:14Z, container_name=metrics_qdr, io.openshift.expose-services=, managed_by=tripleo_ansible, vcs-type=git, com.redhat.component=openstack-qdrouterd-container, config_id=tripleo_step1, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd)
Feb 01 08:57:35 np0005604215.localdomain podman[106344]: 2026-02-01 08:57:35.586781451 +0000 UTC m=+0.303252052 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, vcs-type=git, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, release=1766032510, distribution-scope=public, tcib_managed=true, version=17.1.13, io.buildah.version=1.41.5, build-date=2026-01-12T22:10:14Z, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, name=rhosp-rhel9/openstack-qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, architecture=x86_64, io.openshift.expose-services=, managed_by=tripleo_ansible, com.redhat.component=openstack-qdrouterd-container, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr)
Feb 01 08:57:35 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:57:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21569 DF PROTO=TCP SPT=44318 DPT=9102 SEQ=930796333 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645418D0000000001030307) 
Feb 01 08:57:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27303 DF PROTO=TCP SPT=43764 DPT=9100 SEQ=1052198409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6454E0E0000000001030307) 
Feb 01 08:57:42 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17496 DF PROTO=TCP SPT=59604 DPT=9101 SEQ=2326194972 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645570D0000000001030307) 
Feb 01 08:57:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:57:43 np0005604215.localdomain systemd[1]: tmp-crun.c0sbFG.mount: Deactivated successfully.
Feb 01 08:57:43 np0005604215.localdomain podman[106373]: 2026-02-01 08:57:43.872975986 +0000 UTC m=+0.086801379 container health_status 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, health_status=healthy, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 cron, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-cron-container, distribution-scope=public, release=1766032510, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vcs-type=git, version=17.1.13, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, tcib_managed=true, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, container_name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee)
Feb 01 08:57:43 np0005604215.localdomain podman[106373]: 2026-02-01 08:57:43.884896918 +0000 UTC m=+0.098722301 container exec_died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, container_name=logrotate_crond, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:10:15Z, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, description=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, tcib_managed=true, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-cron, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, architecture=x86_64, batch=17.1_20260112.1, io.openshift.expose-services=)
Feb 01 08:57:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:57:43 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Deactivated successfully.
Feb 01 08:57:43 np0005604215.localdomain podman[106392]: 2026-02-01 08:57:43.962030475 +0000 UTC m=+0.069607163 container health_status e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, health_status=healthy, url=https://www.redhat.com, tcib_managed=true, config_id=tripleo_step3, managed_by=tripleo_ansible, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, build-date=2026-01-12T22:10:15Z, com.redhat.component=openstack-collectd-container, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, distribution-scope=public, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:57:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:57:43 np0005604215.localdomain podman[106392]: 2026-02-01 08:57:43.976698342 +0000 UTC m=+0.084275010 container exec_died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-collectd-container, name=rhosp-rhel9/openstack-collectd, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:15Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, build-date=2026-01-12T22:10:15Z, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 collectd, container_name=collectd, version=17.1.13, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']})
Feb 01 08:57:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:57:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:57:43 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Deactivated successfully.
Feb 01 08:57:44 np0005604215.localdomain podman[106412]: 2026-02-01 08:57:44.061948151 +0000 UTC m=+0.072789702 container health_status 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, health_status=healthy, com.redhat.component=openstack-iscsid-container, managed_by=tripleo_ansible, version=17.1.13, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, io.buildah.version=1.41.5, io.openshift.expose-services=, config_id=tripleo_step3, url=https://www.redhat.com, name=rhosp-rhel9/openstack-iscsid, release=1766032510, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., container_name=iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 iscsid, vcs-type=git, distribution-scope=public, architecture=x86_64, batch=17.1_20260112.1, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:34:43Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:57:44 np0005604215.localdomain podman[106412]: 2026-02-01 08:57:44.069886689 +0000 UTC m=+0.080728220 container exec_died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-iscsid-container, config_id=tripleo_step3, vendor=Red Hat, Inc., org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, vcs-ref=705339545363fec600102567c4e923938e0f43b3, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, description=Red Hat OpenStack Platform 17.1 iscsid, batch=17.1_20260112.1, version=17.1.13, io.buildah.version=1.41.5, 
build-date=2026-01-12T22:34:43Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, vcs-type=git, architecture=x86_64, name=rhosp-rhel9/openstack-iscsid, tcib_managed=true, distribution-scope=public, container_name=iscsid, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:57:44 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Deactivated successfully.
Feb 01 08:57:44 np0005604215.localdomain podman[106411]: 2026-02-01 08:57:44.120907591 +0000 UTC m=+0.133575008 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, tcib_managed=true, managed_by=tripleo_ansible, vendor=Red Hat, Inc., url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, distribution-scope=public, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step5, container_name=nova_compute, architecture=x86_64, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 
3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:57:44 np0005604215.localdomain podman[106411]: 2026-02-01 08:57:44.141724401 +0000 UTC m=+0.154391818 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, architecture=x86_64, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, release=1766032510, com.redhat.component=openstack-nova-compute-container, tcib_managed=true, maintainer=OpenStack TripleO Team, distribution-scope=public, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step5, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute)
Feb 01 08:57:44 np0005604215.localdomain podman[106411]: unhealthy
Feb 01 08:57:44 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:57:44 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:57:44 np0005604215.localdomain podman[106413]: Error: container 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c is not running
Feb 01 08:57:44 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Main process exited, code=exited, status=125/n/a
Feb 01 08:57:44 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Failed with result 'exit-code'.
Feb 01 08:57:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22117 DF PROTO=TCP SPT=51962 DPT=9882 SEQ=2863784275 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645650D0000000001030307) 
Feb 01 08:57:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21571 DF PROTO=TCP SPT=44318 DPT=9102 SEQ=930796333 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645710D0000000001030307) 
Feb 01 08:57:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:57:50 np0005604215.localdomain podman[106463]: 2026-02-01 08:57:50.848570824 +0000 UTC m=+0.070078458 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, name=rhosp-rhel9/openstack-nova-compute, vcs-type=git, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, 
org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=nova_migration_target, description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, release=1766032510, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:57:51 np0005604215.localdomain podman[106463]: 2026-02-01 08:57:51.207042417 +0000 UTC m=+0.428550041 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vendor=Red Hat, Inc., build-date=2026-01-12T23:32:04Z, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:32:04Z, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, com.redhat.component=openstack-nova-compute-container, 
cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, container_name=nova_migration_target, managed_by=tripleo_ansible, config_id=tripleo_step4, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, architecture=x86_64)
Feb 01 08:57:51 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:57:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27305 DF PROTO=TCP SPT=43764 DPT=9100 SEQ=1052198409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6457F0D0000000001030307) 
Feb 01 08:57:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:57:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:57:52 np0005604215.localdomain systemd[1]: tmp-crun.vBIcv5.mount: Deactivated successfully.
Feb 01 08:57:52 np0005604215.localdomain podman[106485]: 2026-02-01 08:57:52.87135221 +0000 UTC m=+0.084581339 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, org.opencontainers.image.created=2026-01-12T22:56:19Z, build-date=2026-01-12T22:56:19Z, managed_by=tripleo_ansible, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, tcib_managed=true, container_name=ovn_metadata_agent, release=1766032510, distribution-scope=public, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.buildah.version=1.41.5)
Feb 01 08:57:52 np0005604215.localdomain podman[106486]: 2026-02-01 08:57:52.910550953 +0000 UTC m=+0.123850214 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, io.openshift.expose-services=, architecture=x86_64, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:36:40Z, version=17.1.13, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step4, 
org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, managed_by=tripleo_ansible, container_name=ovn_controller, distribution-scope=public, vendor=Red Hat, Inc., release=1766032510)
Feb 01 08:57:52 np0005604215.localdomain podman[106486]: 2026-02-01 08:57:52.922328361 +0000 UTC m=+0.135627622 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, tcib_managed=true, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, vendor=Red Hat, Inc., version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, 
io.buildah.version=1.41.5, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, config_id=tripleo_step4, release=1766032510, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller)
Feb 01 08:57:52 np0005604215.localdomain podman[106486]: unhealthy
Feb 01 08:57:52 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:57:52 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:57:52 np0005604215.localdomain podman[106485]: 2026-02-01 08:57:52.962914957 +0000 UTC m=+0.176144046 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, io.openshift.expose-services=, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, tcib_managed=true, build-date=2026-01-12T22:56:19Z, release=1766032510, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, maintainer=OpenStack TripleO Team, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, container_name=ovn_metadata_agent, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, url=https://www.redhat.com, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, config_id=tripleo_step4, io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn)
Feb 01 08:57:52 np0005604215.localdomain podman[106485]: unhealthy
Feb 01 08:57:52 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:57:52 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:57:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33389 DF PROTO=TCP SPT=37260 DPT=9101 SEQ=3828165523 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6458B0D0000000001030307) 
Feb 01 08:57:56 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:57:56 np0005604215.localdomain recover_tripleo_nova_virtqemud[106527]: 62016
Feb 01 08:57:56 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:57:56 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:58:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36115 DF PROTO=TCP SPT=46678 DPT=9882 SEQ=1152359193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6459D2C0000000001030307) 
Feb 01 08:58:00 np0005604215.localdomain podman[106189]: time="2026-02-01T08:58:00Z" level=warning msg="StopSignal SIGTERM failed to stop container ceilometer_agent_ipmi in 42 seconds, resorting to SIGKILL"
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: libpod-79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.scope: Deactivated successfully.
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: libpod-79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.scope: Consumed 6.232s CPU time.
Feb 01 08:58:00 np0005604215.localdomain podman[106189]: 2026-02-01 08:58:00.306146043 +0000 UTC m=+42.082398570 container stop 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-ceilometer-ipmi-container, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, vendor=Red Hat, Inc., konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, tcib_managed=true, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, managed_by=tripleo_ansible, url=https://www.redhat.com, name=rhosp-rhel9/openstack-ceilometer-ipmi, build-date=2026-01-12T23:07:30Z, io.openshift.expose-services=, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-type=git, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, distribution-scope=public, container_name=ceilometer_agent_ipmi, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:58:00 np0005604215.localdomain podman[106189]: 2026-02-01 08:58:00.339769842 +0000 UTC m=+42.116022389 container died 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-ceilometer-ipmi-container, konflux.additional-tags=17.1.13 17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.expose-services=, batch=17.1_20260112.1, url=https://www.redhat.com, architecture=x86_64, vcs-type=git, distribution-scope=public, release=1766032510, build-date=2026-01-12T23:07:30Z, maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, tcib_managed=true, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, container_name=ceilometer_agent_ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible)
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.timer: Deactivated successfully.
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Failed to open /run/systemd/transient/79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: No such file or directory
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c-userdata-shm.mount: Deactivated successfully.
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-19867aa9ce07feb42ab4d071eed0ec581b8be5de4a737b08d8913c4970e7b3a5-merged.mount: Deactivated successfully.
Feb 01 08:58:00 np0005604215.localdomain podman[106189]: 2026-02-01 08:58:00.388774111 +0000 UTC m=+42.165026598 container cleanup 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, release=1766032510, tcib_managed=true, name=rhosp-rhel9/openstack-ceilometer-ipmi, com.redhat.component=openstack-ceilometer-ipmi-container, cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_id=tripleo_step4, url=https://www.redhat.com, architecture=x86_64, managed_by=tripleo_ansible, vcs-type=git, container_name=ceilometer_agent_ipmi, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, distribution-scope=public, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., version=17.1.13, maintainer=OpenStack TripleO Team)
Feb 01 08:58:00 np0005604215.localdomain podman[106189]: ceilometer_agent_ipmi
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.timer: Failed to open /run/systemd/transient/79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.timer: No such file or directory
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Failed to open /run/systemd/transient/79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: No such file or directory
Feb 01 08:58:00 np0005604215.localdomain podman[106529]: 2026-02-01 08:58:00.436438538 +0000 UTC m=+0.117770855 container cleanup 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, batch=17.1_20260112.1, com.redhat.component=openstack-ceilometer-ipmi-container, build-date=2026-01-12T23:07:30Z, url=https://www.redhat.com, distribution-scope=public, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, name=rhosp-rhel9/openstack-ceilometer-ipmi, version=17.1.13, container_name=ceilometer_agent_ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, release=1766032510, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T23:07:30Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, tcib_managed=true)
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: libpod-conmon-79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.scope: Deactivated successfully.
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.timer: Failed to open /run/systemd/transient/79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.timer: No such file or directory
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: Failed to open /run/systemd/transient/79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c.service: No such file or directory
Feb 01 08:58:00 np0005604215.localdomain podman[106546]: 2026-02-01 08:58:00.536746867 +0000 UTC m=+0.065154833 container cleanup 79d19d207a1ed8a10afe7744d76c358403339bb60509d58b24293c2dabffe10c (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1, name=ceilometer_agent_ipmi, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-ipmi:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer-agent-ipmi.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, version=17.1.13, io.buildah.version=1.41.5, release=1766032510, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:07:30Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-ipmi, konflux.additional-tags=17.1.13 17.1_20260112.1, 
container_name=ceilometer_agent_ipmi, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:07:30Z, url=https://www.redhat.com, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-ceilometer-ipmi, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, com.redhat.component=openstack-ceilometer-ipmi-container, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-ipmi, tcib_managed=true)
Feb 01 08:58:00 np0005604215.localdomain podman[106546]: ceilometer_agent_ipmi
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: tripleo_ceilometer_agent_ipmi.service: Deactivated successfully.
Feb 01 08:58:00 np0005604215.localdomain systemd[1]: Stopped ceilometer_agent_ipmi container.
Feb 01 08:58:00 np0005604215.localdomain sudo[106146]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36116 DF PROTO=TCP SPT=46678 DPT=9882 SEQ=1152359193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645A14D0000000001030307) 
Feb 01 08:58:01 np0005604215.localdomain sudo[106646]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmxhlyibqvjmfxrpkkixhlolreuzggii ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936280.7047925-111-11361196753580/AnsiballZ_systemd_service.py
Feb 01 08:58:01 np0005604215.localdomain sudo[106646]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:58:01 np0005604215.localdomain python3.9[106648]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_collectd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:58:02 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:58:02 np0005604215.localdomain systemd-sysv-generator[106676]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:58:02 np0005604215.localdomain systemd-rc-local-generator[106672]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:58:02 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:58:02 np0005604215.localdomain systemd[1]: Stopping collectd container...
Feb 01 08:58:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36117 DF PROTO=TCP SPT=46678 DPT=9882 SEQ=1152359193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645A94D0000000001030307) 
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: libpod-e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.scope: Deactivated successfully.
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: libpod-e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.scope: Consumed 2.107s CPU time.
Feb 01 08:58:05 np0005604215.localdomain podman[106689]: 2026-02-01 08:58:05.331147475 +0000 UTC m=+2.416318807 container stop e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vendor=Red Hat, Inc., batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, com.redhat.component=openstack-collectd-container, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 collectd, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-type=git, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 collectd, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, release=1766032510, version=17.1.13, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, io.openshift.expose-services=, distribution-scope=public)
Feb 01 08:58:05 np0005604215.localdomain podman[106689]: 2026-02-01 08:58:05.360031686 +0000 UTC m=+2.445202978 container died e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, com.redhat.component=openstack-collectd-container, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, vcs-type=git, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, summary=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, batch=17.1_20260112.1, build-date=2026-01-12T22:10:15Z, url=https://www.redhat.com, config_id=tripleo_step3, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, tcib_managed=true)
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: tmp-crun.BGh5Ss.mount: Deactivated successfully.
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.timer: Deactivated successfully.
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Failed to open /run/systemd/transient/e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: No such file or directory
Feb 01 08:58:05 np0005604215.localdomain podman[106689]: 2026-02-01 08:58:05.428944076 +0000 UTC m=+2.514115358 container cleanup e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step3, managed_by=tripleo_ansible, build-date=2026-01-12T22:10:15Z, org.opencontainers.image.created=2026-01-12T22:10:15Z, description=Red Hat OpenStack Platform 17.1 collectd, version=17.1.13, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-collectd-container, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, name=rhosp-rhel9/openstack-collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, architecture=x86_64, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, distribution-scope=public, container_name=collectd, summary=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:58:05 np0005604215.localdomain podman[106689]: collectd
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.timer: Failed to open /run/systemd/transient/e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.timer: No such file or directory
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Failed to open /run/systemd/transient/e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: No such file or directory
Feb 01 08:58:05 np0005604215.localdomain podman[106701]: 2026-02-01 08:58:05.453425359 +0000 UTC m=+0.110164747 container cleanup e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, release=1766032510, org.opencontainers.image.created=2026-01-12T22:10:15Z, architecture=x86_64, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, maintainer=OpenStack TripleO Team, vcs-type=git, tcib_managed=true, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, description=Red Hat OpenStack Platform 17.1 collectd, name=rhosp-rhel9/openstack-collectd, vendor=Red Hat, Inc., url=https://www.redhat.com, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.component=openstack-collectd-container, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, distribution-scope=public, build-date=2026-01-12T22:10:15Z, io.openshift.expose-services=, config_id=tripleo_step3, container_name=collectd, version=17.1.13, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 collectd, io.k8s.description=Red Hat OpenStack Platform 17.1 collectd)
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: tripleo_collectd.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: libpod-conmon-e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.scope: Deactivated successfully.
Feb 01 08:58:05 np0005604215.localdomain podman[106731]: error opening file `/run/crun/e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2/status`: No such file or directory
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.timer: Failed to open /run/systemd/transient/e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.timer: No such file or directory
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: Failed to open /run/systemd/transient/e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2.service: No such file or directory
Feb 01 08:58:05 np0005604215.localdomain podman[106719]: 2026-02-01 08:58:05.561524592 +0000 UTC m=+0.077101186 container cleanup e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2 (image=registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1, name=collectd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.buildah.version=1.41.5, config_id=tripleo_step3, name=rhosp-rhel9/openstack-collectd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 collectd, com.redhat.component=openstack-collectd-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 collectd, batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-collectd, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 collectd, vcs-type=git, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, org.opencontainers.image.created=2026-01-12T22:10:15Z, maintainer=OpenStack TripleO Team, config_data={'cap_add': ['IPC_LOCK'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'd31718fcd17fdeee6489534105191c7a'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-collectd:17.1', 'memory': '512m', 'net': 'host', 'pid': 'host', 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/collectd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/containers/storage/overlay-containers:/var/lib/containers/storage/overlay-containers:ro', '/var/lib/config-data/puppet-generated/collectd:/var/lib/kolla/config_files/src:ro', '/var/log/containers/collectd:/var/log/collectd:rw,z', '/var/lib/container-config-scripts:/config-scripts:ro', '/var/lib/container-user-scripts:/scripts:z', '/run:/run:rw', '/sys/fs/cgroup:/sys/fs/cgroup:ro']}, architecture=x86_64, build-date=2026-01-12T22:10:15Z, container_name=collectd, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 collectd, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.expose-services=)
Feb 01 08:58:05 np0005604215.localdomain podman[106719]: collectd
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: tripleo_collectd.service: Failed with result 'exit-code'.
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: Stopped collectd container.
Feb 01 08:58:05 np0005604215.localdomain sudo[106646]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:58:05 np0005604215.localdomain podman[106780]: 2026-02-01 08:58:05.876862751 +0000 UTC m=+0.085017234 container health_status 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, health_status=healthy, description=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, release=1766032510, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, container_name=metrics_qdr, summary=Red Hat OpenStack Platform 17.1 qdrouterd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.buildah.version=1.41.5, version=17.1.13, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 01 08:58:06 np0005604215.localdomain sudo[106850]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpqlpooyztanovwrhqklgjhqjypcesbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936285.7289643-111-263080495466590/AnsiballZ_systemd_service.py
Feb 01 08:58:06 np0005604215.localdomain sudo[106850]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:58:06 np0005604215.localdomain podman[106780]: 2026-02-01 08:58:06.10317305 +0000 UTC m=+0.311327533 container exec_died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=metrics_qdr, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, url=https://www.redhat.com, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-qdrouterd-container, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, config_id=tripleo_step1, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, version=17.1.13, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, architecture=x86_64, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-type=git, name=rhosp-rhel9/openstack-qdrouterd)
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Deactivated successfully.
Feb 01 08:58:06 np0005604215.localdomain python3.9[106852]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_iscsid.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:58:06 np0005604215.localdomain systemd-sysv-generator[106883]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:58:06 np0005604215.localdomain systemd-rc-local-generator[106877]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:58:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42357 DF PROTO=TCP SPT=39894 DPT=9102 SEQ=1683018467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645B6CD0000000001030307) 
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e861c060c0ea21c0188d51464d865661f600bc180bfdb1205536c0574a9a82a2-userdata-shm.mount: Deactivated successfully.
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-8f493ed320f2136eba98c6f6d73d7580e3273443b9599c34d1438e87453daf45-merged.mount: Deactivated successfully.
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: Stopping iscsid container...
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: libpod-28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.scope: Deactivated successfully.
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: libpod-28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.scope: Consumed 1.005s CPU time.
Feb 01 08:58:06 np0005604215.localdomain podman[106893]: 2026-02-01 08:58:06.731806783 +0000 UTC m=+0.079139399 container died 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, com.redhat.component=openstack-iscsid-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', 
'/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, vcs-type=git, description=Red Hat OpenStack Platform 17.1 iscsid, url=https://www.redhat.com, build-date=2026-01-12T22:34:43Z, summary=Red Hat OpenStack Platform 17.1 iscsid, tcib_managed=true, release=1766032510, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, io.buildah.version=1.41.5, io.openshift.expose-services=, version=17.1.13)
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.timer: Deactivated successfully.
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Failed to open /run/systemd/transient/28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: No such file or directory
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: tmp-crun.XN5HDE.mount: Deactivated successfully.
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504-userdata-shm.mount: Deactivated successfully.
Feb 01 08:58:06 np0005604215.localdomain podman[106893]: 2026-02-01 08:58:06.773093532 +0000 UTC m=+0.120426158 container cleanup 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, description=Red Hat OpenStack Platform 17.1 iscsid, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:34:43Z, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, vcs-type=git, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step3, tcib_managed=true, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, name=rhosp-rhel9/openstack-iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, managed_by=tripleo_ansible, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, container_name=iscsid)
Feb 01 08:58:06 np0005604215.localdomain podman[106893]: iscsid
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.timer: Failed to open /run/systemd/transient/28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.timer: No such file or directory
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Failed to open /run/systemd/transient/28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: No such file or directory
Feb 01 08:58:06 np0005604215.localdomain podman[106905]: 2026-02-01 08:58:06.810349184 +0000 UTC m=+0.073169124 container cleanup 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, container_name=iscsid, io.buildah.version=1.41.5, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, org.opencontainers.image.created=2026-01-12T22:34:43Z, managed_by=tripleo_ansible, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-type=git, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.openshift.expose-services=, com.redhat.component=openstack-iscsid-container, name=rhosp-rhel9/openstack-iscsid, config_id=tripleo_step3, url=https://www.redhat.com, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, distribution-scope=public, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T22:34:43Z, vendor=Red Hat, Inc., vcs-ref=705339545363fec600102567c4e923938e0f43b3, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, summary=Red Hat OpenStack Platform 17.1 iscsid, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']})
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: libpod-conmon-28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.scope: Deactivated successfully.
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.timer: Failed to open /run/systemd/transient/28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.timer: No such file or directory
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: Failed to open /run/systemd/transient/28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504.service: No such file or directory
Feb 01 08:58:06 np0005604215.localdomain podman[106923]: 2026-02-01 08:58:06.912909393 +0000 UTC m=+0.068996093 container cleanup 28eba75e28267ca71a7c3445962ab0b9db8f4cbb8ecd47fe22abfa4336b1c504 (image=registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1, name=iscsid, release=1766032510, org.opencontainers.image.created=2026-01-12T22:34:43Z, config_id=tripleo_step3, io.buildah.version=1.41.5, vendor=Red Hat, Inc., batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 iscsid, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 iscsid, vcs-ref=705339545363fec600102567c4e923938e0f43b3, container_name=iscsid, managed_by=tripleo_ansible, org.opencontainers.image.revision=705339545363fec600102567c4e923938e0f43b3, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-iscsid, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-iscsid, description=Red Hat OpenStack Platform 17.1 iscsid, io.k8s.description=Red Hat OpenStack Platform 17.1 iscsid, architecture=x86_64, url=https://www.redhat.com, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-iscsid:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 2, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z']}, distribution-scope=public, build-date=2026-01-12T22:34:43Z, com.redhat.component=openstack-iscsid-container, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 08:58:06 np0005604215.localdomain podman[106923]: iscsid
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: tripleo_iscsid.service: Deactivated successfully.
Feb 01 08:58:06 np0005604215.localdomain systemd[1]: Stopped iscsid container.
Feb 01 08:58:06 np0005604215.localdomain sudo[106850]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:07 np0005604215.localdomain sudo[107024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjwmzozpblnpfqhgodaebbifjhdjjseq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936287.0779793-111-100517100005503/AnsiballZ_systemd_service.py
Feb 01 08:58:07 np0005604215.localdomain sudo[107024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:58:07 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-179e7ed4ab403439e752a2c426c6db4ca9807018662c061e320fe01562a6e116-merged.mount: Deactivated successfully.
Feb 01 08:58:07 np0005604215.localdomain python3.9[107026]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_logrotate_crond.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:58:07 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:58:07 np0005604215.localdomain systemd-sysv-generator[107049]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:58:07 np0005604215.localdomain systemd-rc-local-generator[107046]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:58:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: Stopping logrotate_crond container...
Feb 01 08:58:08 np0005604215.localdomain crond[69125]: (CRON) INFO (Shutting down)
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: libpod-07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.scope: Deactivated successfully.
Feb 01 08:58:08 np0005604215.localdomain podman[107066]: 2026-02-01 08:58:08.11094169 +0000 UTC m=+0.067334932 container died 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, description=Red Hat OpenStack Platform 17.1 cron, org.opencontainers.image.created=2026-01-12T22:10:15Z, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, com.redhat.component=openstack-cron-container, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, build-date=2026-01-12T22:10:15Z, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, version=17.1.13, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., architecture=x86_64, vcs-type=git, io.buildah.version=1.41.5, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, tcib_managed=true, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_id=tripleo_step4)
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.timer: Deactivated successfully.
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Failed to open /run/systemd/transient/07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: No such file or directory
Feb 01 08:58:08 np0005604215.localdomain podman[107066]: 2026-02-01 08:58:08.16416426 +0000 UTC m=+0.120557512 container cleanup 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, build-date=2026-01-12T22:10:15Z, tcib_managed=true, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, version=17.1.13, description=Red Hat OpenStack Platform 17.1 cron, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://www.redhat.com, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', 
'/var/log/containers:/var/log/containers:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, io.buildah.version=1.41.5, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, container_name=logrotate_crond, distribution-scope=public, name=rhosp-rhel9/openstack-cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, batch=17.1_20260112.1, com.redhat.component=openstack-cron-container, org.opencontainers.image.created=2026-01-12T22:10:15Z)
Feb 01 08:58:08 np0005604215.localdomain podman[107066]: logrotate_crond
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.timer: Failed to open /run/systemd/transient/07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.timer: No such file or directory
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Failed to open /run/systemd/transient/07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: No such file or directory
Feb 01 08:58:08 np0005604215.localdomain podman[107080]: 2026-02-01 08:58:08.200758542 +0000 UTC m=+0.078043216 container cleanup 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=logrotate_crond, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 cron, com.redhat.component=openstack-cron-container, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, org.opencontainers.image.created=2026-01-12T22:10:15Z, version=17.1.13, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, config_id=tripleo_step4, 
url=https://www.redhat.com, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 cron, build-date=2026-01-12T22:10:15Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-cron, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=tripleo_ansible)
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: libpod-conmon-07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.scope: Deactivated successfully.
Feb 01 08:58:08 np0005604215.localdomain podman[107106]: error opening file `/run/crun/07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7/status`: No such file or directory
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.timer: Failed to open /run/systemd/transient/07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.timer: No such file or directory
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: Failed to open /run/systemd/transient/07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7.service: No such file or directory
Feb 01 08:58:08 np0005604215.localdomain podman[107095]: 2026-02-01 08:58:08.310747533 +0000 UTC m=+0.078111197 container cleanup 07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7 (image=registry.redhat.io/rhosp-rhel9/openstack-cron:17.1, name=logrotate_crond, version=17.1.13, distribution-scope=public, org.opencontainers.image.created=2026-01-12T22:10:15Z, com.redhat.component=openstack-cron-container, build-date=2026-01-12T22:10:15Z, io.k8s.description=Red Hat OpenStack Platform 17.1 cron, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 cron, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, description=Red Hat OpenStack Platform 17.1 cron, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-cron, container_name=logrotate_crond, vcs-type=git, architecture=x86_64, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, io.buildah.version=1.41.5, release=1766032510, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 cron, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '53ed83bb0cae779ff95edb2002262c6f'}, 'healthcheck': {'test': '/usr/share/openstack-tripleo-common/healthcheck/cron'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-cron:17.1', 'net': 'none', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/logrotate-crond.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/crond:/var/lib/kolla/config_files/src:ro', '/var/log/containers:/var/log/containers:z']}, config_id=tripleo_step4, vendor=Red Hat, Inc., url=https://www.redhat.com, name=rhosp-rhel9/openstack-cron, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:58:08 np0005604215.localdomain podman[107095]: logrotate_crond
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: tripleo_logrotate_crond.service: Deactivated successfully.
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: Stopped logrotate_crond container.
Feb 01 08:58:08 np0005604215.localdomain sudo[107024]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-d2793eb0d727691e97e5e2f52ec5e9822efebe0b6bf32e0fb26a5897fd53d53c-merged.mount: Deactivated successfully.
Feb 01 08:58:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-07fa29680495e8092117ae3481a9591dd99a632431ea586d039da8eb996b3aa7-userdata-shm.mount: Deactivated successfully.
Feb 01 08:58:08 np0005604215.localdomain sudo[107197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imxfestzwumkphrgzcbmupvniagvywfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936288.5234056-111-8108090199770/AnsiballZ_systemd_service.py
Feb 01 08:58:08 np0005604215.localdomain sudo[107197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:58:09 np0005604215.localdomain python3.9[107199]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_metrics_qdr.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:58:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2294 DF PROTO=TCP SPT=47384 DPT=9100 SEQ=302641157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645C34D0000000001030307) 
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:58:10 np0005604215.localdomain systemd-sysv-generator[107228]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:58:10 np0005604215.localdomain systemd-rc-local-generator[107224]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: Stopping metrics_qdr container...
Feb 01 08:58:10 np0005604215.localdomain kernel: qdrouterd[54747]: segfault at 0 ip 00007fc4eedee7cb sp 00007ffc5069d3e0 error 4 in libc.so.6[7fc4eed8b000+175000]
Feb 01 08:58:10 np0005604215.localdomain kernel: Code: 0b 00 64 44 89 23 85 c0 75 d4 e9 2b ff ff ff e8 db a5 00 00 e9 fd fe ff ff e8 41 1d 0d 00 90 f3 0f 1e fa 41 54 55 48 89 fd 53 <8b> 07 f6 c4 20 0f 85 aa 00 00 00 89 c2 81 e2 00 80 00 00 0f 84 a9
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: Created slice Slice /system/systemd-coredump.
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: Started Process Core Dump (PID 107255/UID 0).
Feb 01 08:58:10 np0005604215.localdomain systemd-coredump[107256]: Resource limits disable core dumping for process 54747 (qdrouterd).
Feb 01 08:58:10 np0005604215.localdomain systemd-coredump[107256]: Process 54747 (qdrouterd) of user 42465 dumped core.
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: systemd-coredump@0-107255-0.service: Deactivated successfully.
Feb 01 08:58:10 np0005604215.localdomain podman[107240]: 2026-02-01 08:58:10.735717138 +0000 UTC m=+0.234089355 container died 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, batch=17.1_20260112.1, build-date=2026-01-12T22:10:14Z, com.redhat.component=openstack-qdrouterd-container, distribution-scope=public, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, io.openshift.expose-services=, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, summary=Red Hat OpenStack Platform 17.1 qdrouterd, maintainer=OpenStack TripleO Team, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, release=1766032510, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=metrics_qdr, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', 
'/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.created=2026-01-12T22:10:14Z, config_id=tripleo_step1, description=Red Hat OpenStack Platform 17.1 qdrouterd, tcib_managed=true, url=https://www.redhat.com, io.buildah.version=1.41.5, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13)
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: libpod-75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.scope: Deactivated successfully.
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: libpod-75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.scope: Consumed 27.773s CPU time.
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.timer: Deactivated successfully.
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Failed to open /run/systemd/transient/75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: No such file or directory
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7-userdata-shm.mount: Deactivated successfully.
Feb 01 08:58:10 np0005604215.localdomain podman[107240]: 2026-02-01 08:58:10.779927817 +0000 UTC m=+0.278300074 container cleanup 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, name=rhosp-rhel9/openstack-qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, container_name=metrics_qdr, managed_by=tripleo_ansible, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, release=1766032510, io.buildah.version=1.41.5, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, com.redhat.component=openstack-qdrouterd-container, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', 
'/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T22:10:14Z, tcib_managed=true, vcs-type=git, org.opencontainers.image.created=2026-01-12T22:10:14Z, description=Red Hat OpenStack Platform 17.1 qdrouterd, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=tripleo_step1)
Feb 01 08:58:10 np0005604215.localdomain podman[107240]: metrics_qdr
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.timer: Failed to open /run/systemd/transient/75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.timer: No such file or directory
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Failed to open /run/systemd/transient/75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: No such file or directory
Feb 01 08:58:10 np0005604215.localdomain podman[107260]: 2026-02-01 08:58:10.81333914 +0000 UTC m=+0.067799256 container cleanup 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:10:14Z, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:10:14Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, container_name=metrics_qdr, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, summary=Red Hat OpenStack Platform 17.1 qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, managed_by=tripleo_ansible, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-qdrouterd-container, release=1766032510, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, name=rhosp-rhel9/openstack-qdrouterd, config_id=tripleo_step1)
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: tripleo_metrics_qdr.service: Main process exited, code=exited, status=139/n/a
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: libpod-conmon-75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.scope: Deactivated successfully.
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.timer: Failed to open /run/systemd/transient/75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.timer: No such file or directory
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: Failed to open /run/systemd/transient/75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7.service: No such file or directory
Feb 01 08:58:10 np0005604215.localdomain podman[107275]: 2026-02-01 08:58:10.904818323 +0000 UTC m=+0.065582837 container cleanup 75c8a36d6f9eaf1389dcbf73d6bf05a74137a01275fa44aa87fda8dd481c01e7 (image=registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1, name=metrics_qdr, org.opencontainers.image.revision=4ce33df42b67d01daba32be86cee6dc0ad95cdee, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., io.buildah.version=1.41.5, name=rhosp-rhel9/openstack-qdrouterd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-qdrouterd, vcs-ref=4ce33df42b67d01daba32be86cee6dc0ad95cdee, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': 'b8acc88e7150a91ea5eddde509e925f2'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-qdrouterd:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'start_order': 1, 'user': 'qdrouterd', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/metrics_qdr.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/metrics_qdr:/var/lib/kolla/config_files/src:ro', '/var/lib/metrics_qdr:/var/lib/qdrouterd:z', '/var/log/containers/metrics_qdr:/var/log/qdrouterd:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 qdrouterd, architecture=x86_64, org.opencontainers.image.created=2026-01-12T22:10:14Z, vcs-type=git, 
container_name=metrics_qdr, description=Red Hat OpenStack Platform 17.1 qdrouterd, io.k8s.display-name=Red Hat OpenStack Platform 17.1 qdrouterd, version=17.1.13, tcib_managed=true, com.redhat.component=openstack-qdrouterd-container, release=1766032510, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 qdrouterd, config_id=tripleo_step1, batch=17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T22:10:14Z, managed_by=tripleo_ansible)
Feb 01 08:58:10 np0005604215.localdomain podman[107275]: metrics_qdr
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: tripleo_metrics_qdr.service: Failed with result 'exit-code'.
Feb 01 08:58:10 np0005604215.localdomain systemd[1]: Stopped metrics_qdr container.
Feb 01 08:58:10 np0005604215.localdomain sudo[107197]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:11 np0005604215.localdomain sudo[107377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mplvviknqufjefbzlprsuwuxfboyfius ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936291.0841465-111-75232204194996/AnsiballZ_systemd_service.py
Feb 01 08:58:11 np0005604215.localdomain sudo[107377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:58:11 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f747231ffc56e15c128dac75ec633f161eee676530b28d17cb7b8d0be7728054-merged.mount: Deactivated successfully.
Feb 01 08:58:11 np0005604215.localdomain python3.9[107379]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_dhcp.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:58:11 np0005604215.localdomain sudo[107377]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:12 np0005604215.localdomain sudo[107470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xmqdghjwcfokdbgzdqiciheqwheismyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936291.7794454-111-97995123659421/AnsiballZ_systemd_service.py
Feb 01 08:58:12 np0005604215.localdomain sudo[107470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:58:12 np0005604215.localdomain python3.9[107472]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_l3_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:58:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25358 DF PROTO=TCP SPT=60424 DPT=9101 SEQ=733340608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645D04E0000000001030307) 
Feb 01 08:58:13 np0005604215.localdomain sudo[107470]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:13 np0005604215.localdomain sudo[107563]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gvvernnlztnpuiclxhylibdruhczdbrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936293.4842854-111-141780024652083/AnsiballZ_systemd_service.py
Feb 01 08:58:13 np0005604215.localdomain sudo[107563]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:58:14 np0005604215.localdomain python3.9[107565]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_neutron_ovs_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:58:14 np0005604215.localdomain sudo[107563]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:14 np0005604215.localdomain sudo[107656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sgxhkjpwljpdtrgptdnggksjyhkhufyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936294.2161462-111-156736247684205/AnsiballZ_systemd_service.py
Feb 01 08:58:14 np0005604215.localdomain sudo[107656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:58:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:58:14 np0005604215.localdomain podman[107659]: 2026-02-01 08:58:14.618124873 +0000 UTC m=+0.089371759 container health_status 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, health_status=unhealthy, tcib_managed=true, container_name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, io.openshift.expose-services=, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team)
Feb 01 08:58:14 np0005604215.localdomain podman[107659]: 2026-02-01 08:58:14.662619921 +0000 UTC m=+0.133866757 container exec_died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step5, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, io.openshift.expose-services=, com.redhat.component=openstack-nova-compute-container, url=https://www.redhat.com, version=17.1.13, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, vcs-type=git, build-date=2026-01-12T23:32:04Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z)
Feb 01 08:58:14 np0005604215.localdomain podman[107659]: unhealthy
Feb 01 08:58:14 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:58:14 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:58:14 np0005604215.localdomain python3.9[107658]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:58:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36119 DF PROTO=TCP SPT=46678 DPT=9882 SEQ=1152359193 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645D90D0000000001030307) 
Feb 01 08:58:15 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:58:15 np0005604215.localdomain systemd-rc-local-generator[107708]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:58:15 np0005604215.localdomain systemd-sysv-generator[107712]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:58:16 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:58:16 np0005604215.localdomain systemd[1]: Stopping nova_compute container...
Feb 01 08:58:16 np0005604215.localdomain systemd[1]: tmp-crun.blxDUP.mount: Deactivated successfully.
Feb 01 08:58:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42359 DF PROTO=TCP SPT=39894 DPT=9102 SEQ=1683018467 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645E70D0000000001030307) 
Feb 01 08:58:20 np0005604215.localdomain sudo[107733]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:58:20 np0005604215.localdomain sudo[107733]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:58:20 np0005604215.localdomain sudo[107733]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:20 np0005604215.localdomain sudo[107748]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:58:20 np0005604215.localdomain sudo[107748]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:58:21 np0005604215.localdomain sudo[107748]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:58:21 np0005604215.localdomain systemd[1]: tmp-crun.tKWm0K.mount: Deactivated successfully.
Feb 01 08:58:21 np0005604215.localdomain podman[107796]: 2026-02-01 08:58:21.638105233 +0000 UTC m=+0.105061229 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, com.redhat.component=openstack-nova-compute-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vcs-type=git, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, build-date=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, version=17.1.13, io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., config_id=tripleo_step4)
Feb 01 08:58:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2296 DF PROTO=TCP SPT=47384 DPT=9100 SEQ=302641157 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA645F30E0000000001030307) 
Feb 01 08:58:22 np0005604215.localdomain podman[107796]: 2026-02-01 08:58:22.038781264 +0000 UTC m=+0.505737230 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-12T23:32:04Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.expose-services=, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, name=rhosp-rhel9/openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, summary=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, url=https://www.redhat.com, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:32:04Z, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-compute-container, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, container_name=nova_migration_target, managed_by=tripleo_ansible, architecture=x86_64, batch=17.1_20260112.1)
Feb 01 08:58:22 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:58:22 np0005604215.localdomain sudo[107820]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:58:22 np0005604215.localdomain sudo[107820]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:58:22 np0005604215.localdomain sudo[107820]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:58:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:58:23 np0005604215.localdomain podman[107836]: 2026-02-01 08:58:23.867790116 +0000 UTC m=+0.082691661 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-ovn-controller, io.openshift.expose-services=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, architecture=x86_64, vcs-type=git, version=17.1.13, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_controller, build-date=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, 
io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:58:23 np0005604215.localdomain podman[107836]: 2026-02-01 08:58:23.88075665 +0000 UTC m=+0.095658225 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, batch=17.1_20260112.1, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, release=1766032510, container_name=ovn_controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, build-date=2026-01-12T22:36:40Z, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, 
description=Red Hat OpenStack Platform 17.1 ovn-controller, architecture=x86_64, vcs-type=git, version=17.1.13, tcib_managed=true, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, name=rhosp-rhel9/openstack-ovn-controller)
Feb 01 08:58:23 np0005604215.localdomain podman[107836]: unhealthy
Feb 01 08:58:23 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:58:23 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:58:23 np0005604215.localdomain systemd[1]: tmp-crun.6yy6fr.mount: Deactivated successfully.
Feb 01 08:58:23 np0005604215.localdomain podman[107835]: 2026-02-01 08:58:23.971381378 +0000 UTC m=+0.188188762 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 
neutron-metadata-agent-ovn, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T22:56:19Z, tcib_managed=true, io.buildah.version=1.41.5, build-date=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, konflux.additional-tags=17.1.13 17.1_20260112.1, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, release=1766032510, vendor=Red Hat, Inc., vcs-type=git, managed_by=tripleo_ansible, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn)
Feb 01 08:58:23 np0005604215.localdomain podman[107835]: 2026-02-01 08:58:23.986098587 +0000 UTC m=+0.202905801 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, architecture=x86_64, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, io.openshift.expose-services=, managed_by=tripleo_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', 
'/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, url=https://www.redhat.com, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1)
Feb 01 08:58:23 np0005604215.localdomain podman[107835]: unhealthy
Feb 01 08:58:23 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:58:23 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:58:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25360 DF PROTO=TCP SPT=60424 DPT=9101 SEQ=733340608 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646010D0000000001030307) 
Feb 01 08:58:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38260 DF PROTO=TCP SPT=49670 DPT=9882 SEQ=485417032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646129B0000000001030307) 
Feb 01 08:58:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38261 DF PROTO=TCP SPT=49670 DPT=9882 SEQ=485417032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646168E0000000001030307) 
Feb 01 08:58:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38262 DF PROTO=TCP SPT=49670 DPT=9882 SEQ=485417032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6461E8D0000000001030307) 
Feb 01 08:58:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42700 DF PROTO=TCP SPT=45450 DPT=9102 SEQ=4184278850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6462C0D0000000001030307) 
Feb 01 08:58:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10366 DF PROTO=TCP SPT=33804 DPT=9100 SEQ=252398568 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646388D0000000001030307) 
Feb 01 08:58:42 np0005604215.localdomain sshd[107876]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 08:58:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10581 DF PROTO=TCP SPT=45426 DPT=9101 SEQ=1263811756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646458D0000000001030307) 
Feb 01 08:58:43 np0005604215.localdomain sshd[107876]: Invalid user financeiro from 85.206.171.113 port 52022
Feb 01 08:58:43 np0005604215.localdomain sshd[107876]: Received disconnect from 85.206.171.113 port 52022:11: Bye Bye [preauth]
Feb 01 08:58:43 np0005604215.localdomain sshd[107876]: Disconnected from invalid user financeiro 85.206.171.113 port 52022 [preauth]
Feb 01 08:58:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:58:44 np0005604215.localdomain podman[107878]: Error: container 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e is not running
Feb 01 08:58:44 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Main process exited, code=exited, status=125/n/a
Feb 01 08:58:44 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed with result 'exit-code'.
Feb 01 08:58:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38264 DF PROTO=TCP SPT=49670 DPT=9882 SEQ=485417032 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6464F0D0000000001030307) 
Feb 01 08:58:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42702 DF PROTO=TCP SPT=45450 DPT=9102 SEQ=4184278850 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6465D0D0000000001030307) 
Feb 01 08:58:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10368 DF PROTO=TCP SPT=33804 DPT=9100 SEQ=252398568 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646690E0000000001030307) 
Feb 01 08:58:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:58:52 np0005604215.localdomain podman[107889]: 2026-02-01 08:58:52.617957868 +0000 UTC m=+0.083681842 container health_status 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, health_status=healthy, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, url=https://www.redhat.com, batch=17.1_20260112.1, architecture=x86_64, managed_by=tripleo_ansible, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, container_name=nova_migration_target, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, name=rhosp-rhel9/openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, summary=Red Hat OpenStack Platform 17.1 nova-compute, release=1766032510, io.buildah.version=1.41.5, distribution-scope=public, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:58:53 np0005604215.localdomain podman[107889]: 2026-02-01 08:58:53.009476032 +0000 UTC m=+0.475199956 container exec_died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.buildah.version=1.41.5, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:32:04Z, summary=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, architecture=x86_64, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=nova_migration_target, config_id=tripleo_step4, 
description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, vendor=Red Hat, Inc., batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, url=https://www.redhat.com, version=17.1.13)
Feb 01 08:58:53 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Deactivated successfully.
Feb 01 08:58:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:58:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:58:54 np0005604215.localdomain podman[107912]: 2026-02-01 08:58:54.871243217 +0000 UTC m=+0.086073576 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, vcs-type=git, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, build-date=2026-01-12T22:56:19Z, distribution-scope=public, io.openshift.expose-services=, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, container_name=ovn_metadata_agent, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, architecture=x86_64, version=17.1.13, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:58:54 np0005604215.localdomain podman[107912]: 2026-02-01 08:58:54.88672151 +0000 UTC m=+0.101551829 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, vcs-type=git, architecture=x86_64, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, url=https://www.redhat.com, 
vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, batch=17.1_20260112.1, io.openshift.expose-services=, tcib_managed=true, version=17.1.13, build-date=2026-01-12T22:56:19Z, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, release=1766032510)
Feb 01 08:58:54 np0005604215.localdomain podman[107912]: unhealthy
Feb 01 08:58:54 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:58:54 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:58:54 np0005604215.localdomain podman[107913]: 2026-02-01 08:58:54.981044673 +0000 UTC m=+0.192166817 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, tcib_managed=true, url=https://www.redhat.com, managed_by=tripleo_ansible, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step4, description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, vcs-type=git, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, architecture=x86_64, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, vendor=Red Hat, Inc., 
cpe=cpe:/a:redhat:openstack:17.1::el9, container_name=ovn_controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:58:55 np0005604215.localdomain podman[107913]: 2026-02-01 08:58:55.023786096 +0000 UTC m=+0.234908200 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, distribution-scope=public, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.buildah.version=1.41.5, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.13, io.openshift.expose-services=, release=1766032510, vcs-type=git, summary=Red Hat OpenStack Platform 17.1 ovn-controller, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-ovn-controller-container, 
config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., container_name=ovn_controller)
Feb 01 08:58:55 np0005604215.localdomain podman[107913]: unhealthy
Feb 01 08:58:55 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:58:55 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:58:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10583 DF PROTO=TCP SPT=45426 DPT=9101 SEQ=1263811756 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646750D0000000001030307) 
Feb 01 08:58:58 np0005604215.localdomain podman[107721]: time="2026-02-01T08:58:58Z" level=warning msg="StopSignal SIGTERM failed to stop container nova_compute in 42 seconds, resorting to SIGKILL"
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: libpod-1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.scope: Deactivated successfully.
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: libpod-1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.scope: Consumed 27.818s CPU time.
Feb 01 08:58:58 np0005604215.localdomain podman[107721]: 2026-02-01 08:58:58.302273098 +0000 UTC m=+42.088728432 container stop 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_id=tripleo_step5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, architecture=x86_64, release=1766032510, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, batch=17.1_20260112.1, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, container_name=nova_compute, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-compute-container, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:32:04Z, tcib_managed=true, vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 
'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, version=17.1.13, description=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:58:58 np0005604215.localdomain podman[107721]: 2026-02-01 08:58:58.336570788 +0000 UTC m=+42.123026102 container died 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, architecture=x86_64, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, config_id=tripleo_step5, name=rhosp-rhel9/openstack-nova-compute, batch=17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, container_name=nova_compute, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, io.buildah.version=1.41.5, version=17.1.13, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, distribution-scope=public)
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.timer: Deactivated successfully.
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed to open /run/systemd/transient/1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: No such file or directory
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: tmp-crun.rzzXw4.mount: Deactivated successfully.
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-66409f2cae0cc3fdf46266cf7a9b4ef7f2208d64cf24e912c16b5d672be00b92-merged.mount: Deactivated successfully.
Feb 01 08:58:58 np0005604215.localdomain podman[107721]: 2026-02-01 08:58:58.447790408 +0000 UTC m=+42.234245732 container cleanup 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, version=17.1.13, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, release=1766032510, vcs-type=git, architecture=x86_64, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_compute, managed_by=tripleo_ansible, io.openshift.expose-services=, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, url=https://www.redhat.com, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-nova-compute-container, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 08:58:58 np0005604215.localdomain podman[107721]: nova_compute
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.timer: Failed to open /run/systemd/transient/1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.timer: No such file or directory
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed to open /run/systemd/transient/1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: No such file or directory
Feb 01 08:58:58 np0005604215.localdomain podman[107953]: 2026-02-01 08:58:58.464540721 +0000 UTC m=+0.146200933 container cleanup 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, config_id=tripleo_step5, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, release=1766032510, version=17.1.13, url=https://www.redhat.com, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.component=openstack-nova-compute-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, tcib_managed=true, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 
'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']})
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: libpod-conmon-1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.scope: Deactivated successfully.
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.timer: Failed to open /run/systemd/transient/1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.timer: No such file or directory
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: Failed to open /run/systemd/transient/1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e.service: No such file or directory
Feb 01 08:58:58 np0005604215.localdomain podman[107967]: 2026-02-01 08:58:58.542519573 +0000 UTC m=+0.051849248 container cleanup 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, container_name=nova_compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, konflux.additional-tags=17.1.13 17.1_20260112.1, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, config_id=tripleo_step5, org.opencontainers.image.created=2026-01-12T23:32:04Z, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, architecture=x86_64, vcs-type=git, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-compute-container, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 
'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, io.openshift.expose-services=)
Feb 01 08:58:58 np0005604215.localdomain podman[107967]: nova_compute
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: tripleo_nova_compute.service: Deactivated successfully.
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: Stopped nova_compute container.
Feb 01 08:58:58 np0005604215.localdomain systemd[1]: tripleo_nova_compute.service: Consumed 1.101s CPU time, no IO.
Feb 01 08:58:58 np0005604215.localdomain sudo[107656]: pam_unix(sudo:session): session closed for user root
Feb 01 08:58:59 np0005604215.localdomain sudo[108069]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gumvmkrninnnaceqkhnlbnrqnrbmjnms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936338.709162-111-79104837533809/AnsiballZ_systemd_service.py
Feb 01 08:58:59 np0005604215.localdomain sudo[108069]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:58:59 np0005604215.localdomain python3.9[108071]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:58:59 np0005604215.localdomain systemd-rc-local-generator[108097]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:58:59 np0005604215.localdomain systemd-sysv-generator[108102]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: Stopping nova_migration_target container...
Feb 01 08:58:59 np0005604215.localdomain sshd[69422]: Received signal 15; terminating.
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: libpod-080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.scope: Deactivated successfully.
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: libpod-080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.scope: Consumed 33.437s CPU time.
Feb 01 08:58:59 np0005604215.localdomain podman[108112]: 2026-02-01 08:58:59.833040565 +0000 UTC m=+0.079241093 container died 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, summary=Red Hat OpenStack Platform 17.1 nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, tcib_managed=true, container_name=nova_migration_target, build-date=2026-01-12T23:32:04Z, distribution-scope=public, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, config_id=tripleo_step4, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, maintainer=OpenStack TripleO Team, architecture=x86_64, release=1766032510, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com)
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.timer: Deactivated successfully.
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Failed to open /run/systemd/transient/080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: No such file or directory
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: tmp-crun.OE4cvd.mount: Deactivated successfully.
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96-userdata-shm.mount: Deactivated successfully.
Feb 01 08:58:59 np0005604215.localdomain podman[108112]: 2026-02-01 08:58:59.882709785 +0000 UTC m=+0.128910243 container cleanup 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step4, vendor=Red Hat, Inc., baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.expose-services=, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, io.buildah.version=1.41.5, container_name=nova_migration_target, name=rhosp-rhel9/openstack-nova-compute, build-date=2026-01-12T23:32:04Z, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, vcs-type=git, description=Red Hat OpenStack Platform 17.1 nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, release=1766032510, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', 
'/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, tcib_managed=true, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=openstack-nova-compute-container, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:58:59 np0005604215.localdomain podman[108112]: nova_migration_target
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.timer: Failed to open /run/systemd/transient/080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.timer: No such file or directory
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Failed to open /run/systemd/transient/080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: No such file or directory
Feb 01 08:58:59 np0005604215.localdomain podman[108124]: 2026-02-01 08:58:59.921606909 +0000 UTC m=+0.079292476 container cleanup 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, version=17.1.13, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-compute, vcs-type=git, build-date=2026-01-12T23:32:04Z, container_name=nova_migration_target, com.redhat.component=openstack-nova-compute-container, name=rhosp-rhel9/openstack-nova-compute, description=Red Hat OpenStack Platform 17.1 nova-compute, managed_by=tripleo_ansible, io.buildah.version=1.41.5, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:32:04Z, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, vendor=Red Hat, Inc., architecture=x86_64, config_id=tripleo_step4, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, url=https://www.redhat.com, release=1766032510, tcib_managed=true)
Feb 01 08:58:59 np0005604215.localdomain systemd[1]: libpod-conmon-080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.scope: Deactivated successfully.
Feb 01 08:59:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=360 DF PROTO=TCP SPT=46140 DPT=9882 SEQ=139889354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646878C0000000001030307) 
Feb 01 08:59:00 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.timer: Failed to open /run/systemd/transient/080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.timer: No such file or directory
Feb 01 08:59:00 np0005604215.localdomain systemd[1]: 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: Failed to open /run/systemd/transient/080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96.service: No such file or directory
Feb 01 08:59:00 np0005604215.localdomain podman[108141]: 2026-02-01 08:59:00.025020335 +0000 UTC m=+0.072475512 container cleanup 080dbed59cc2bdf7ec953594b316706c298919e7e0007b17239bca57f337ed96 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_migration_target, distribution-scope=public, release=1766032510, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, org.opencontainers.image.created=2026-01-12T23:32:04Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-nova-compute, maintainer=OpenStack TripleO Team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/nova-migration-target.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/etc/ssh:/host-ssh:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_migration_target, version=17.1.13, managed_by=tripleo_ansible, 
url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, com.redhat.component=openstack-nova-compute-container, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, vcs-type=git, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-compute, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:32:04Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-compute)
Feb 01 08:59:00 np0005604215.localdomain podman[108141]: nova_migration_target
Feb 01 08:59:00 np0005604215.localdomain systemd[1]: tripleo_nova_migration_target.service: Deactivated successfully.
Feb 01 08:59:00 np0005604215.localdomain systemd[1]: Stopped nova_migration_target container.
Feb 01 08:59:00 np0005604215.localdomain sudo[108069]: pam_unix(sudo:session): session closed for user root
Feb 01 08:59:00 np0005604215.localdomain sudo[108243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nvrhaayibijpvuqvognmadqfchqwclas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936340.1910138-111-250982975810129/AnsiballZ_systemd_service.py
Feb 01 08:59:00 np0005604215.localdomain sudo[108243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 08:59:00 np0005604215.localdomain python3.9[108245]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 08:59:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-8fb1968646de61e5d6c5b7938dce54da276edc06f0bc75651b588722ba09cba1-merged.mount: Deactivated successfully.
Feb 01 08:59:00 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 08:59:00 np0005604215.localdomain systemd-sysv-generator[108275]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 08:59:00 np0005604215.localdomain systemd-rc-local-generator[108271]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 08:59:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=361 DF PROTO=TCP SPT=46140 DPT=9882 SEQ=139889354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6468B8D0000000001030307) 
Feb 01 08:59:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 08:59:01 np0005604215.localdomain systemd[1]: Stopping nova_virtlogd_wrapper container...
Feb 01 08:59:01 np0005604215.localdomain systemd[1]: libpod-4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa.scope: Deactivated successfully.
Feb 01 08:59:01 np0005604215.localdomain podman[108286]: 2026-02-01 08:59:01.324341661 +0000 UTC m=+0.072338587 container stop 4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, container_name=nova_virtlogd_wrapper, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, com.redhat.component=openstack-nova-libvirt-container, batch=17.1_20260112.1, distribution-scope=public, name=rhosp-rhel9/openstack-nova-libvirt, maintainer=OpenStack TripleO Team, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, tcib_managed=true, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, architecture=x86_64)
Feb 01 08:59:01 np0005604215.localdomain podman[108286]: 2026-02-01 08:59:01.360500269 +0000 UTC m=+0.108497155 container died 4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, url=https://www.redhat.com, vcs-type=git, io.buildah.version=1.41.5, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtlogd_wrapper, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, distribution-scope=public, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64)
Feb 01 08:59:01 np0005604215.localdomain systemd[1]: tmp-crun.5ffSDJ.mount: Deactivated successfully.
Feb 01 08:59:01 np0005604215.localdomain podman[108286]: 2026-02-01 08:59:01.399513406 +0000 UTC m=+0.147510272 container cleanup 4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.expose-services=, container_name=nova_virtlogd_wrapper, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, vendor=Red Hat, Inc., summary=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, version=17.1.13, config_id=tripleo_step3, distribution-scope=public, build-date=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 
'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']})
Feb 01 08:59:01 np0005604215.localdomain podman[108286]: nova_virtlogd_wrapper
Feb 01 08:59:01 np0005604215.localdomain podman[108298]: 2026-02-01 08:59:01.460022504 +0000 UTC m=+0.116793865 container cleanup 4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_virtlogd_wrapper, com.redhat.component=openstack-nova-libvirt-container, release=1766032510, managed_by=tripleo_ansible, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', 
'/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, maintainer=OpenStack TripleO Team, version=17.1.13, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, config_id=tripleo_step3)
Feb 01 08:59:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bd91fbbf62f7f0af7c33a117d6552a6678d20821e3759b8b2c7a56c46d8f5a7c-merged.mount: Deactivated successfully.
Feb 01 08:59:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa-userdata-shm.mount: Deactivated successfully.
Feb 01 08:59:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=362 DF PROTO=TCP SPT=46140 DPT=9882 SEQ=139889354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646938E0000000001030307) 
Feb 01 08:59:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4763 DF PROTO=TCP SPT=46602 DPT=9102 SEQ=3686674674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646A10D0000000001030307) 
Feb 01 08:59:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57105 DF PROTO=TCP SPT=50090 DPT=9100 SEQ=2105554257 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646AD8D0000000001030307) 
Feb 01 08:59:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52467 DF PROTO=TCP SPT=59864 DPT=9101 SEQ=3165624299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646BACD0000000001030307) 
Feb 01 08:59:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=364 DF PROTO=TCP SPT=46140 DPT=9882 SEQ=139889354 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646C30D0000000001030307) 
Feb 01 08:59:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4765 DF PROTO=TCP SPT=46602 DPT=9102 SEQ=3686674674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646D10D0000000001030307) 
Feb 01 08:59:21 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57107 DF PROTO=TCP SPT=50090 DPT=9100 SEQ=2105554257 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646DD0D0000000001030307) 
Feb 01 08:59:22 np0005604215.localdomain sudo[108315]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 08:59:22 np0005604215.localdomain sudo[108315]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:59:22 np0005604215.localdomain sudo[108315]: pam_unix(sudo:session): session closed for user root
Feb 01 08:59:22 np0005604215.localdomain sudo[108330]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 08:59:22 np0005604215.localdomain sudo[108330]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:59:23 np0005604215.localdomain sudo[108330]: pam_unix(sudo:session): session closed for user root
Feb 01 08:59:23 np0005604215.localdomain sudo[108376]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 08:59:23 np0005604215.localdomain sudo[108376]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 08:59:23 np0005604215.localdomain sudo[108376]: pam_unix(sudo:session): session closed for user root
Feb 01 08:59:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:59:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:59:25 np0005604215.localdomain podman[108391]: 2026-02-01 08:59:25.376942891 +0000 UTC m=+0.091322940 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., managed_by=tripleo_ansible, build-date=2026-01-12T22:56:19Z, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, architecture=x86_64, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 08:59:25 np0005604215.localdomain podman[108391]: 2026-02-01 08:59:25.399538206 +0000 UTC m=+0.113918255 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, vcs-type=git, build-date=2026-01-12T22:56:19Z, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, container_name=ovn_metadata_agent, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, config_id=tripleo_step4, io.openshift.expose-services=, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 08:59:25 np0005604215.localdomain podman[108392]: 2026-02-01 08:59:25.441767493 +0000 UTC m=+0.156194193 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, build-date=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, release=1766032510, io.buildah.version=1.41.5, io.openshift.expose-services=, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, batch=17.1_20260112.1, vcs-type=git, maintainer=OpenStack TripleO Team, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, konflux.additional-tags=17.1.13 17.1_20260112.1, name=rhosp-rhel9/openstack-ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, vendor=Red Hat, Inc., tcib_managed=true, container_name=ovn_controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13)
Feb 01 08:59:25 np0005604215.localdomain podman[108391]: unhealthy
Feb 01 08:59:25 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:59:25 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:59:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=52469 DF PROTO=TCP SPT=59864 DPT=9101 SEQ=3165624299 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646EB0D0000000001030307) 
Feb 01 08:59:25 np0005604215.localdomain podman[108392]: 2026-02-01 08:59:25.506800833 +0000 UTC m=+0.221227573 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, container_name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, tcib_managed=true, config_id=tripleo_step4, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, org.opencontainers.image.created=2026-01-12T22:36:40Z, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, batch=17.1_20260112.1, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, version=17.1.13, release=1766032510, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', 
'/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, name=rhosp-rhel9/openstack-ovn-controller)
Feb 01 08:59:25 np0005604215.localdomain podman[108392]: unhealthy
Feb 01 08:59:25 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:59:25 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 08:59:28 np0005604215.localdomain systemd[1]: Starting Check and recover tripleo_nova_virtqemud...
Feb 01 08:59:28 np0005604215.localdomain recover_tripleo_nova_virtqemud[108429]: 62016
Feb 01 08:59:28 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.service: Deactivated successfully.
Feb 01 08:59:28 np0005604215.localdomain systemd[1]: Finished Check and recover tripleo_nova_virtqemud.
Feb 01 08:59:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46460 DF PROTO=TCP SPT=45212 DPT=9882 SEQ=530326567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA646FCBC0000000001030307) 
Feb 01 08:59:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46461 DF PROTO=TCP SPT=45212 DPT=9882 SEQ=530326567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64700CD0000000001030307) 
Feb 01 08:59:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46462 DF PROTO=TCP SPT=45212 DPT=9882 SEQ=530326567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64708CE0000000001030307) 
Feb 01 08:59:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15831 DF PROTO=TCP SPT=38952 DPT=9102 SEQ=3518675689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647164E0000000001030307) 
Feb 01 08:59:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61418 DF PROTO=TCP SPT=48410 DPT=9100 SEQ=3396714805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64722CE0000000001030307) 
Feb 01 08:59:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41673 DF PROTO=TCP SPT=39862 DPT=9101 SEQ=1408653870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647300D0000000001030307) 
Feb 01 08:59:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46464 DF PROTO=TCP SPT=45212 DPT=9882 SEQ=530326567 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647390D0000000001030307) 
Feb 01 08:59:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15833 DF PROTO=TCP SPT=38952 DPT=9102 SEQ=3518675689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647470D0000000001030307) 
Feb 01 08:59:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61420 DF PROTO=TCP SPT=48410 DPT=9100 SEQ=3396714805 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647530D0000000001030307) 
Feb 01 08:59:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41675 DF PROTO=TCP SPT=39862 DPT=9101 SEQ=1408653870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647610D0000000001030307) 
Feb 01 08:59:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 08:59:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 08:59:55 np0005604215.localdomain podman[108430]: 2026-02-01 08:59:55.87595792 +0000 UTC m=+0.087111749 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, distribution-scope=public, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, version=17.1.13, config_id=tripleo_step4, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=ovn_metadata_agent, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, url=https://www.redhat.com, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, vcs-type=git)
Feb 01 08:59:55 np0005604215.localdomain podman[108430]: 2026-02-01 08:59:55.892925489 +0000 UTC m=+0.104079308 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, managed_by=tripleo_ansible, distribution-scope=public, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, batch=17.1_20260112.1, io.openshift.expose-services=, version=17.1.13, url=https://www.redhat.com, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, tcib_managed=true, architecture=x86_64, config_id=tripleo_step4, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-12T22:56:19Z, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc.)
Feb 01 08:59:55 np0005604215.localdomain systemd[1]: tmp-crun.oti1IQ.mount: Deactivated successfully.
Feb 01 08:59:55 np0005604215.localdomain podman[108431]: 2026-02-01 08:59:55.938232853 +0000 UTC m=+0.145715727 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, vendor=Red Hat, Inc., config_id=tripleo_step4, name=rhosp-rhel9/openstack-ovn-controller, vcs-type=git, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, tcib_managed=true, io.openshift.expose-services=, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', 
'/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, maintainer=OpenStack TripleO Team, distribution-scope=public, url=https://www.redhat.com, io.buildah.version=1.41.5, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 08:59:55 np0005604215.localdomain podman[108430]: unhealthy
Feb 01 08:59:55 np0005604215.localdomain podman[108431]: 2026-02-01 08:59:55.95674081 +0000 UTC m=+0.164223694 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, distribution-scope=public, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.openshift.expose-services=, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, managed_by=tripleo_ansible, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, vendor=Red Hat, Inc., container_name=ovn_controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.display-name=Red Hat OpenStack 
Platform 17.1 ovn-controller, build-date=2026-01-12T22:36:40Z, config_id=tripleo_step4, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller)
Feb 01 08:59:55 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:59:55 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 08:59:55 np0005604215.localdomain podman[108431]: unhealthy
Feb 01 08:59:55 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 08:59:55 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 09:00:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7057 DF PROTO=TCP SPT=41244 DPT=9882 SEQ=397158234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64771ED0000000001030307) 
Feb 01 09:00:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7058 DF PROTO=TCP SPT=41244 DPT=9882 SEQ=397158234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647760D0000000001030307) 
Feb 01 09:00:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7059 DF PROTO=TCP SPT=41244 DPT=9882 SEQ=397158234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6477E0E0000000001030307) 
Feb 01 09:00:05 np0005604215.localdomain sshd[108468]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:00:06 np0005604215.localdomain sshd[108468]: Invalid user scheduler from 85.206.171.113 port 35222
Feb 01 09:00:06 np0005604215.localdomain sshd[108468]: Received disconnect from 85.206.171.113 port 35222:11: Bye Bye [preauth]
Feb 01 09:00:06 np0005604215.localdomain sshd[108468]: Disconnected from invalid user scheduler 85.206.171.113 port 35222 [preauth]
Feb 01 09:00:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45584 DF PROTO=TCP SPT=53482 DPT=9102 SEQ=864165797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6478B8D0000000001030307) 
Feb 01 09:00:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50678 DF PROTO=TCP SPT=36468 DPT=9100 SEQ=3026304708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647980D0000000001030307) 
Feb 01 09:00:12 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41676 DF PROTO=TCP SPT=39862 DPT=9101 SEQ=1408653870 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647A10D0000000001030307) 
Feb 01 09:00:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7061 DF PROTO=TCP SPT=41244 DPT=9882 SEQ=397158234 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647AF0D0000000001030307) 
Feb 01 09:00:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45586 DF PROTO=TCP SPT=53482 DPT=9102 SEQ=864165797 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647BB0D0000000001030307) 
Feb 01 09:00:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50680 DF PROTO=TCP SPT=36468 DPT=9100 SEQ=3026304708 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647C90D0000000001030307) 
Feb 01 09:00:24 np0005604215.localdomain sudo[108470]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:00:24 np0005604215.localdomain sudo[108470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:00:24 np0005604215.localdomain sudo[108470]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:24 np0005604215.localdomain sudo[108485]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:00:24 np0005604215.localdomain sudo[108485]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:00:24 np0005604215.localdomain sudo[108485]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12287 DF PROTO=TCP SPT=53502 DPT=9101 SEQ=3656957715 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647D50E0000000001030307) 
Feb 01 09:00:25 np0005604215.localdomain sudo[108531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:00:25 np0005604215.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: State 'stop-sigterm' timed out. Killing.
Feb 01 09:00:25 np0005604215.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Killing process 61244 (conmon) with signal SIGKILL.
Feb 01 09:00:25 np0005604215.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Main process exited, code=killed, status=9/KILL
Feb 01 09:00:25 np0005604215.localdomain systemd[1]: libpod-conmon-4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa.scope: Deactivated successfully.
Feb 01 09:00:25 np0005604215.localdomain sudo[108531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:00:25 np0005604215.localdomain sudo[108531]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:25 np0005604215.localdomain systemd[1]: tmp-crun.qagyoO.mount: Deactivated successfully.
Feb 01 09:00:25 np0005604215.localdomain podman[108559]: error opening file `/run/crun/4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa/status`: No such file or directory
Feb 01 09:00:25 np0005604215.localdomain podman[108545]: 2026-02-01 09:00:25.5457218 +0000 UTC m=+0.079077877 container cleanup 4e95a1e950181e0667190c6bd97db8923001994e00adc494730e9ca958d24dfa (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd_wrapper, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 0, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtlogd.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/container-config-scripts/virtlogd_wrapper:/usr/local/bin/virtlogd_wrapper:ro']}, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, vendor=Red Hat, Inc., batch=17.1_20260112.1, vcs-type=git, distribution-scope=public, container_name=nova_virtlogd_wrapper, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, release=1766032510, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_id=tripleo_step3, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, version=17.1.13)
Feb 01 09:00:25 np0005604215.localdomain podman[108545]: nova_virtlogd_wrapper
Feb 01 09:00:25 np0005604215.localdomain systemd[1]: tripleo_nova_virtlogd_wrapper.service: Failed with result 'timeout'.
Feb 01 09:00:25 np0005604215.localdomain systemd[1]: Stopped nova_virtlogd_wrapper container.
Feb 01 09:00:25 np0005604215.localdomain sudo[108243]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:25 np0005604215.localdomain sudo[108650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-glqjpkmcdndvxsedfvuvyipeeojukcjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936425.7164803-111-94349639903326/AnsiballZ_systemd_service.py
Feb 01 09:00:25 np0005604215.localdomain sudo[108650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:00:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 09:00:26 np0005604215.localdomain podman[108653]: 2026-02-01 09:00:26.102180831 +0000 UTC m=+0.091058311 container health_status e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, health_status=unhealthy, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_id=tripleo_step4, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, cpe=cpe:/a:redhat:openstack:17.1::el9, maintainer=OpenStack TripleO Team, release=1766032510, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', 
'/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, io.openshift.expose-services=, url=https://www.redhat.com, vendor=Red Hat, Inc., architecture=x86_64, container_name=ovn_metadata_agent, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, build-date=2026-01-12T22:56:19Z, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, distribution-scope=public, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:00:26 np0005604215.localdomain podman[108654]: 2026-02-01 09:00:26.153394419 +0000 UTC m=+0.140577877 container health_status e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, health_status=unhealthy, com.redhat.component=openstack-ovn-controller-container, release=1766032510, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, tcib_managed=true, vcs-type=git, container_name=ovn_controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, distribution-scope=public, maintainer=OpenStack TripleO Team, managed_by=tripleo_ansible, vendor=Red Hat, Inc., org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhosp-rhel9/openstack-ovn-controller, url=https://www.redhat.com, io.openshift.expose-services=, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': 
['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c)
Feb 01 09:00:26 np0005604215.localdomain podman[108653]: 2026-02-01 09:00:26.169490771 +0000 UTC m=+0.158368211 container exec_died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, distribution-scope=public, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, 
baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, architecture=x86_64, maintainer=OpenStack TripleO Team, release=1766032510, url=https://www.redhat.com, io.openshift.expose-services=, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.buildah.version=1.41.5, managed_by=tripleo_ansible, container_name=ovn_metadata_agent, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, tcib_managed=true, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 09:00:26 np0005604215.localdomain podman[108653]: unhealthy
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed with result 'exit-code'.
Feb 01 09:00:26 np0005604215.localdomain podman[108654]: 2026-02-01 09:00:26.194675117 +0000 UTC m=+0.181858605 container exec_died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, description=Red Hat OpenStack Platform 17.1 ovn-controller, version=17.1.13, tcib_managed=true, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-type=git, org.opencontainers.image.created=2026-01-12T22:36:40Z, com.redhat.component=openstack-ovn-controller-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, url=https://www.redhat.com, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T22:36:40Z, cpe=cpe:/a:redhat:openstack:17.1::el9, 
io.openshift.expose-services=, name=rhosp-rhel9/openstack-ovn-controller, batch=17.1_20260112.1, container_name=ovn_controller, summary=Red Hat OpenStack Platform 17.1 ovn-controller, io.buildah.version=1.41.5, release=1766032510, architecture=x86_64)
Feb 01 09:00:26 np0005604215.localdomain podman[108654]: unhealthy
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed with result 'exit-code'.
Feb 01 09:00:26 np0005604215.localdomain python3.9[108652]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:00:26 np0005604215.localdomain systemd-rc-local-generator[108717]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:00:26 np0005604215.localdomain systemd-sysv-generator[108722]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: Stopping nova_virtnodedevd container...
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: libpod-883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff.scope: Deactivated successfully.
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: libpod-883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff.scope: Consumed 1.464s CPU time.
Feb 01 09:00:26 np0005604215.localdomain podman[108730]: 2026-02-01 09:00:26.764505925 +0000 UTC m=+0.082054411 container died 883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, release=1766032510, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=tripleo_step3, description=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, url=https://www.redhat.com, tcib_managed=true, maintainer=OpenStack TripleO Team, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, cpe=cpe:/a:redhat:openstack:17.1::el9, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=nova_virtnodedevd, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, vendor=Red Hat, Inc.)
Feb 01 09:00:26 np0005604215.localdomain podman[108730]: 2026-02-01 09:00:26.807001811 +0000 UTC m=+0.124550297 container cleanup 883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, vendor=Red Hat, Inc., description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.5, container_name=nova_virtnodedevd, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, com.redhat.component=openstack-nova-libvirt-container, config_id=tripleo_step3, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, managed_by=tripleo_ansible, version=17.1.13, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': 
['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 09:00:26 np0005604215.localdomain podman[108730]: nova_virtnodedevd
Feb 01 09:00:26 np0005604215.localdomain podman[108745]: 2026-02-01 09:00:26.859679284 +0000 UTC m=+0.074628609 container cleanup 883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, managed_by=tripleo_ansible, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, container_name=nova_virtnodedevd, config_id=tripleo_step3, version=17.1.13, release=1766032510, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.openshift.expose-services=, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 
'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container)
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: libpod-conmon-883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff.scope: Deactivated successfully.
Feb 01 09:00:26 np0005604215.localdomain podman[108773]: error opening file `/run/crun/883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff/status`: No such file or directory
Feb 01 09:00:26 np0005604215.localdomain podman[108761]: 2026-02-01 09:00:26.955125792 +0000 UTC m=+0.061769408 container cleanup 883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtnodedevd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 2, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', 
'/var/lib/kolla/config_files/nova_virtnodedevd.json:/var/lib/kolla/config_files/config.json:ro']}, architecture=x86_64, container_name=nova_virtnodedevd, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13, io.buildah.version=1.41.5, com.redhat.component=openstack-nova-libvirt-container, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, release=1766032510, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, vcs-type=git, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 09:00:26 np0005604215.localdomain podman[108761]: nova_virtnodedevd
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: tripleo_nova_virtnodedevd.service: Deactivated successfully.
Feb 01 09:00:26 np0005604215.localdomain systemd[1]: Stopped nova_virtnodedevd container.
Feb 01 09:00:27 np0005604215.localdomain sudo[108650]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:27 np0005604215.localdomain sudo[108864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihdhcgoyqjwgomijampohewnbhgfpurc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936427.1466603-111-101560685094483/AnsiballZ_systemd_service.py
Feb 01 09:00:27 np0005604215.localdomain sudo[108864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:00:27 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-7b9f50aed1094cdf3c8ae90862135d9821bbb7f673296f42b1c4d115dfdd346a-merged.mount: Deactivated successfully.
Feb 01 09:00:27 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-883f6f56ab9fa5aa479ce063c46f9ab4ceeecb724900013295141ba5cef97aff-userdata-shm.mount: Deactivated successfully.
Feb 01 09:00:27 np0005604215.localdomain python3.9[108866]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:00:27 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:00:27 np0005604215.localdomain systemd-rc-local-generator[108890]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:00:27 np0005604215.localdomain systemd-sysv-generator[108896]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:00:27 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:00:28 np0005604215.localdomain systemd[1]: Stopping nova_virtproxyd container...
Feb 01 09:00:28 np0005604215.localdomain systemd[1]: libpod-3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac.scope: Deactivated successfully.
Feb 01 09:00:28 np0005604215.localdomain podman[108906]: 2026-02-01 09:00:28.226071993 +0000 UTC m=+0.074251157 container died 3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, cpe=cpe:/a:redhat:openstack:17.1::el9, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtproxyd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', 
'/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., config_id=tripleo_step3, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, managed_by=tripleo_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, version=17.1.13, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1766032510)
Feb 01 09:00:28 np0005604215.localdomain podman[108906]: 2026-02-01 09:00:28.266863656 +0000 UTC m=+0.115042790 container cleanup 3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, name=rhosp-rhel9/openstack-nova-libvirt, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, org.opencontainers.image.created=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., version=17.1.13, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, container_name=nova_virtproxyd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, vcs-type=git, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, release=1766032510, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 
'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.5)
Feb 01 09:00:28 np0005604215.localdomain podman[108906]: nova_virtproxyd
Feb 01 09:00:28 np0005604215.localdomain podman[108920]: 2026-02-01 09:00:28.307206794 +0000 UTC m=+0.072513443 container cleanup 3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, url=https://www.redhat.com, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, com.redhat.component=openstack-nova-libvirt-container, container_name=nova_virtproxyd, distribution-scope=public, io.buildah.version=1.41.5, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, tcib_managed=true, name=rhosp-rhel9/openstack-nova-libvirt, managed_by=tripleo_ansible, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, build-date=2026-01-12T23:31:49Z, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, config_id=tripleo_step3, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, maintainer=OpenStack TripleO Team, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 09:00:28 np0005604215.localdomain systemd[1]: libpod-conmon-3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac.scope: Deactivated successfully.
Feb 01 09:00:28 np0005604215.localdomain podman[108946]: error opening file `/run/crun/3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac/status`: No such file or directory
Feb 01 09:00:28 np0005604215.localdomain podman[108935]: 2026-02-01 09:00:28.416712181 +0000 UTC m=+0.072023898 container cleanup 3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtproxyd, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 5, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtproxyd.json:/var/lib/kolla/config_files/config.json:ro']}, distribution-scope=public, org.opencontainers.image.created=2026-01-12T23:31:49Z, version=17.1.13, vcs-type=git, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, build-date=2026-01-12T23:31:49Z, tcib_managed=true, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, container_name=nova_virtproxyd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt)
Feb 01 09:00:28 np0005604215.localdomain podman[108935]: nova_virtproxyd
Feb 01 09:00:28 np0005604215.localdomain systemd[1]: tripleo_nova_virtproxyd.service: Deactivated successfully.
Feb 01 09:00:28 np0005604215.localdomain systemd[1]: Stopped nova_virtproxyd container.
Feb 01 09:00:28 np0005604215.localdomain sudo[108864]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-671a12c1b149c45f560a497746a5c06b1baf4bea205bfa54dc10c3d286f5bbce-merged.mount: Deactivated successfully.
Feb 01 09:00:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3165c6655234691f8c773165b4ab8c73e334d265dc9bbf78941b544f8a9449ac-userdata-shm.mount: Deactivated successfully.
Feb 01 09:00:28 np0005604215.localdomain sudo[109039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kceyarauktvyucmfrrxlpwociyvtsxls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936428.6003563-111-252271812362265/AnsiballZ_systemd_service.py
Feb 01 09:00:28 np0005604215.localdomain sudo[109039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:00:29 np0005604215.localdomain python3.9[109041]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:00:29 np0005604215.localdomain systemd-rc-local-generator[109066]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:00:29 np0005604215.localdomain systemd-sysv-generator[109073]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud_recover.timer: Deactivated successfully.
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: Stopped Check and recover tripleo_nova_virtqemud every 10m.
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: Stopping nova_virtqemud container...
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: libpod-526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70.scope: Deactivated successfully.
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: libpod-526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70.scope: Consumed 2.095s CPU time.
Feb 01 09:00:29 np0005604215.localdomain podman[109081]: 2026-02-01 09:00:29.56353611 +0000 UTC m=+0.076320673 container died 526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', 
'/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, name=rhosp-rhel9/openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-nova-libvirt-container, release=1766032510, version=17.1.13, url=https://www.redhat.com, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, managed_by=tripleo_ansible, vendor=Red Hat, Inc., build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, batch=17.1_20260112.1, io.buildah.version=1.41.5, container_name=nova_virtqemud, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3)
Feb 01 09:00:29 np0005604215.localdomain podman[109081]: 2026-02-01 09:00:29.598441309 +0000 UTC m=+0.111225842 container cleanup 526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, tcib_managed=true, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, build-date=2026-01-12T23:31:49Z, vendor=Red Hat, Inc., name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, managed_by=tripleo_ansible, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git, maintainer=OpenStack TripleO Team, org.opencontainers.image.created=2026-01-12T23:31:49Z, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.component=openstack-nova-libvirt-container, release=1766032510, distribution-scope=public, config_id=tripleo_step3, container_name=nova_virtqemud, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, batch=17.1_20260112.1)
Feb 01 09:00:29 np0005604215.localdomain podman[109081]: nova_virtqemud
Feb 01 09:00:29 np0005604215.localdomain podman[109095]: 2026-02-01 09:00:29.638365694 +0000 UTC m=+0.056860015 container cleanup 526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, org.opencontainers.image.created=2026-01-12T23:31:49Z, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', 
'/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, build-date=2026-01-12T23:31:49Z, batch=17.1_20260112.1, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, io.openshift.expose-services=, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, version=17.1.13, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_virtqemud, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git)
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-457ad0364d778031a8ec3d2148346ff43fcb0296666a44f62af8c388a01d2e64-merged.mount: Deactivated successfully.
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70-userdata-shm.mount: Deactivated successfully.
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: libpod-conmon-526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70.scope: Deactivated successfully.
Feb 01 09:00:29 np0005604215.localdomain podman[109121]: error opening file `/run/crun/526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70/status`: No such file or directory
Feb 01 09:00:29 np0005604215.localdomain podman[109109]: 2026-02-01 09:00:29.753082713 +0000 UTC m=+0.077425646 container cleanup 526ebca495097f81b426b805eea65425bd57d645348c5ecb89d3373156d69f70 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtqemud, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_virtqemud, version=17.1.13, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, vendor=Red Hat, Inc., managed_by=tripleo_ansible, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 4, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', 
'/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtqemud.json:/var/lib/kolla/config_files/config.json:ro', '/var/log/containers/libvirt/swtpm:/var/log/swtpm:z']}, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, build-date=2026-01-12T23:31:49Z, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, com.redhat.component=openstack-nova-libvirt-container, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step3, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, architecture=x86_64, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, vcs-type=git)
Feb 01 09:00:29 np0005604215.localdomain podman[109109]: nova_virtqemud
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: tripleo_nova_virtqemud.service: Deactivated successfully.
Feb 01 09:00:29 np0005604215.localdomain systemd[1]: Stopped nova_virtqemud container.
Feb 01 09:00:29 np0005604215.localdomain sudo[109039]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54942 DF PROTO=TCP SPT=41284 DPT=9882 SEQ=3822266166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647E71C0000000001030307) 
Feb 01 09:00:30 np0005604215.localdomain sudo[109212]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olmquyqsgukhuykxpwpjknpgepjyxndn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936429.9935346-111-91042898648380/AnsiballZ_systemd_service.py
Feb 01 09:00:30 np0005604215.localdomain sudo[109212]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:00:30 np0005604215.localdomain python3.9[109214]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud_recover.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:00:30 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:00:30 np0005604215.localdomain systemd-rc-local-generator[109239]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:00:30 np0005604215.localdomain systemd-sysv-generator[109244]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:00:30 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:00:30 np0005604215.localdomain sudo[109212]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54943 DF PROTO=TCP SPT=41284 DPT=9882 SEQ=3822266166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647EB0D0000000001030307) 
Feb 01 09:00:31 np0005604215.localdomain sudo[109342]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czxwmdthgqnhextubsljvpwqssfomclp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936431.0679054-111-196487124073916/AnsiballZ_systemd_service.py
Feb 01 09:00:31 np0005604215.localdomain sudo[109342]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:00:31 np0005604215.localdomain python3.9[109344]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:00:32 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:00:32 np0005604215.localdomain systemd-sysv-generator[109373]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:00:32 np0005604215.localdomain systemd-rc-local-generator[109370]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:00:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:00:33 np0005604215.localdomain systemd[1]: Stopping nova_virtsecretd container...
Feb 01 09:00:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54944 DF PROTO=TCP SPT=41284 DPT=9882 SEQ=3822266166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA647F30D0000000001030307) 
Feb 01 09:00:33 np0005604215.localdomain systemd[1]: libpod-a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3.scope: Deactivated successfully.
Feb 01 09:00:33 np0005604215.localdomain podman[109385]: 2026-02-01 09:00:33.118885521 +0000 UTC m=+0.078350555 container died a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, batch=17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', 
'/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.5, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=nova_virtsecretd, org.opencontainers.image.created=2026-01-12T23:31:49Z, managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, release=1766032510, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, config_id=tripleo_step3, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, com.redhat.component=openstack-nova-libvirt-container, architecture=x86_64, description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, name=rhosp-rhel9/openstack-nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-type=git)
Feb 01 09:00:33 np0005604215.localdomain podman[109385]: 2026-02-01 09:00:33.152062417 +0000 UTC m=+0.111527411 container cleanup a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, io.openshift.expose-services=, maintainer=OpenStack TripleO Team, architecture=x86_64, container_name=nova_virtsecretd, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', 
'/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, distribution-scope=public, config_id=tripleo_step3, url=https://www.redhat.com, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, tcib_managed=true, vcs-type=git, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., io.buildah.version=1.41.5)
Feb 01 09:00:33 np0005604215.localdomain podman[109385]: nova_virtsecretd
Feb 01 09:00:33 np0005604215.localdomain podman[109400]: 2026-02-01 09:00:33.212238443 +0000 UTC m=+0.078092477 container cleanup a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, org.opencontainers.image.created=2026-01-12T23:31:49Z, build-date=2026-01-12T23:31:49Z, distribution-scope=public, vendor=Red Hat, Inc., vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', 
'/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.component=openstack-nova-libvirt-container, maintainer=OpenStack TripleO Team, io.openshift.expose-services=, container_name=nova_virtsecretd, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1766032510, description=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, url=https://www.redhat.com, managed_by=tripleo_ansible, version=17.1.13, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, vcs-type=git, name=rhosp-rhel9/openstack-nova-libvirt)
Feb 01 09:00:33 np0005604215.localdomain systemd[1]: libpod-conmon-a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3.scope: Deactivated successfully.
Feb 01 09:00:33 np0005604215.localdomain podman[109427]: error opening file `/run/crun/a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3/status`: No such file or directory
Feb 01 09:00:33 np0005604215.localdomain podman[109415]: 2026-02-01 09:00:33.32208195 +0000 UTC m=+0.077998343 container cleanup a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtsecretd, vcs-type=git, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, release=1766032510, name=rhosp-rhel9/openstack-nova-libvirt, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, description=Red Hat OpenStack Platform 17.1 nova-libvirt, managed_by=tripleo_ansible, build-date=2026-01-12T23:31:49Z, container_name=nova_virtsecretd, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.created=2026-01-12T23:31:49Z, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 1, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', 
'/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtsecretd.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=OpenStack TripleO Team, config_id=tripleo_step3, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, version=17.1.13, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, url=https://www.redhat.com, com.redhat.component=openstack-nova-libvirt-container)
Feb 01 09:00:33 np0005604215.localdomain podman[109415]: nova_virtsecretd
Feb 01 09:00:33 np0005604215.localdomain systemd[1]: tripleo_nova_virtsecretd.service: Deactivated successfully.
Feb 01 09:00:33 np0005604215.localdomain systemd[1]: Stopped nova_virtsecretd container.
Feb 01 09:00:33 np0005604215.localdomain sudo[109342]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:33 np0005604215.localdomain sudo[109518]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdwvtrtwjbomgmcpyprrvinyqsucftjb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936433.4878905-111-177783434996205/AnsiballZ_systemd_service.py
Feb 01 09:00:33 np0005604215.localdomain sudo[109518]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:00:34 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-abe2e37cef3553dd7ed72567236ba15185ae0f96cf280ad9def2a9cdb2b0b4c7-merged.mount: Deactivated successfully.
Feb 01 09:00:34 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a40eccdb34adf480cf82211333a94dba024795e8b0c70208a91e22bed3cf9ef3-userdata-shm.mount: Deactivated successfully.
Feb 01 09:00:34 np0005604215.localdomain python3.9[109520]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:00:35 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:00:35 np0005604215.localdomain systemd-rc-local-generator[109546]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:00:35 np0005604215.localdomain systemd-sysv-generator[109550]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:00:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:00:35 np0005604215.localdomain systemd[1]: Stopping nova_virtstoraged container...
Feb 01 09:00:35 np0005604215.localdomain systemd[1]: libpod-39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5.scope: Deactivated successfully.
Feb 01 09:00:35 np0005604215.localdomain podman[109561]: 2026-02-01 09:00:35.575175464 +0000 UTC m=+0.075509047 container died 39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtstoraged, io.openshift.expose-services=, release=1766032510, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, distribution-scope=public, com.redhat.component=openstack-nova-libvirt-container, build-date=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, maintainer=OpenStack TripleO Team, version=17.1.13, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-nova-libvirt, tcib_managed=true, cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, io.buildah.version=1.41.5)
Feb 01 09:00:35 np0005604215.localdomain systemd[1]: tmp-crun.PE60ms.mount: Deactivated successfully.
Feb 01 09:00:35 np0005604215.localdomain podman[109561]: 2026-02-01 09:00:35.623162511 +0000 UTC m=+0.123496044 container cleanup 39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, build-date=2026-01-12T23:31:49Z, name=rhosp-rhel9/openstack-nova-libvirt, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, batch=17.1_20260112.1, tcib_managed=true, distribution-scope=public, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, architecture=x86_64, url=https://www.redhat.com, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', 
'/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.buildah.version=1.41.5, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, release=1766032510, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, container_name=nova_virtstoraged, version=17.1.13, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, config_id=tripleo_step3, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, description=Red Hat OpenStack Platform 17.1 nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, io.openshift.expose-services=)
Feb 01 09:00:35 np0005604215.localdomain podman[109561]: nova_virtstoraged
Feb 01 09:00:35 np0005604215.localdomain podman[109574]: 2026-02-01 09:00:35.650986729 +0000 UTC m=+0.065739953 container cleanup 39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, vcs-type=git, version=17.1.13, release=1766032510, io.openshift.expose-services=, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.buildah.version=1.41.5, tcib_managed=true, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, config_id=tripleo_step3, org.opencontainers.image.created=2026-01-12T23:31:49Z, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, batch=17.1_20260112.1, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, cpe=cpe:/a:redhat:openstack:17.1::el9, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, container_name=nova_virtstoraged, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, konflux.additional-tags=17.1.13 17.1_20260112.1, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', 
'/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', '/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, managed_by=tripleo_ansible, com.redhat.component=openstack-nova-libvirt-container, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vendor=Red Hat, Inc., maintainer=OpenStack TripleO Team, url=https://www.redhat.com)
Feb 01 09:00:35 np0005604215.localdomain systemd[1]: libpod-conmon-39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5.scope: Deactivated successfully.
Feb 01 09:00:35 np0005604215.localdomain podman[109603]: error opening file `/run/crun/39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5/status`: No such file or directory
Feb 01 09:00:35 np0005604215.localdomain podman[109591]: 2026-02-01 09:00:35.744249028 +0000 UTC m=+0.064462752 container cleanup 39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtstoraged, tcib_managed=true, architecture=x86_64, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, config_data={'cgroupns': 'host', 'depends_on': ['tripleo_nova_virtlogd_wrapper.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '1296029e90a465a2201c8dc6f8be17e7'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1', 'net': 'host', 'pid': 'host', 'pids_limit': 65536, 'privileged': True, 'restart': 'always', 'security_opt': ['label=level:s0', 'label=type:spc_t', 'label=filetype:container_file_t'], 'start_order': 3, 'ulimit': ['nofile=131072', 'nproc=126960'], 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/libvirt:/var/log/libvirt:shared,z', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/run:/run', '/sys/fs/cgroup:/sys/fs/cgroup', '/sys/fs/selinux:/sys/fs/selinux', '/etc/selinux/config:/etc/selinux/config:ro', '/etc/libvirt:/etc/libvirt:shared', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/libvirt:/var/lib/libvirt:shared', 
'/var/cache/libvirt:/var/cache/libvirt:shared', '/var/lib/vhost_sockets:/var/lib/vhost_sockets', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/var/lib/kolla/config_files/nova_virtstoraged.json:/var/lib/kolla/config_files/config.json:ro']}, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, com.redhat.component=openstack-nova-libvirt-container, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, description=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, distribution-scope=public, release=1766032510, url=https://www.redhat.com, managed_by=tripleo_ansible, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, container_name=nova_virtstoraged, name=rhosp-rhel9/openstack-nova-libvirt, vcs-type=git, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vendor=Red Hat, Inc., io.buildah.version=1.41.5, config_id=tripleo_step3, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, version=17.1.13, build-date=2026-01-12T23:31:49Z)
Feb 01 09:00:35 np0005604215.localdomain podman[109591]: nova_virtstoraged
Feb 01 09:00:35 np0005604215.localdomain systemd[1]: tripleo_nova_virtstoraged.service: Deactivated successfully.
Feb 01 09:00:35 np0005604215.localdomain systemd[1]: Stopped nova_virtstoraged container.
Feb 01 09:00:35 np0005604215.localdomain sudo[109518]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:36 np0005604215.localdomain sudo[109694]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cschrfvhgeyusiivoypvgcctemfyfiha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936435.8934672-111-270877254029527/AnsiballZ_systemd_service.py
Feb 01 09:00:36 np0005604215.localdomain sudo[109694]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:00:36 np0005604215.localdomain python3.9[109696]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_controller.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:00:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1259 DF PROTO=TCP SPT=43364 DPT=9102 SEQ=3186521073 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64800CD0000000001030307) 
Feb 01 09:00:36 np0005604215.localdomain systemd-sysv-generator[109722]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:00:36 np0005604215.localdomain systemd-rc-local-generator[109719]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: tmp-crun.aIpB62.mount: Deactivated successfully.
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-39157644699ac29a43c584aadeae04badec11f2504b552a425c9256e9b3f3dc5-userdata-shm.mount: Deactivated successfully.
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-42956910233e56c0615893b331e8357f0bd5264eb11a7b97d46d18517d01f2f9-merged.mount: Deactivated successfully.
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: Stopping ovn_controller container...
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: libpod-e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.scope: Deactivated successfully.
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: libpod-e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.scope: Consumed 2.533s CPU time.
Feb 01 09:00:36 np0005604215.localdomain podman[109737]: 2026-02-01 09:00:36.904580918 +0000 UTC m=+0.060611321 container died e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, batch=17.1_20260112.1, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, container_name=ovn_controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, tcib_managed=true, io.buildah.version=1.41.5, summary=Red Hat OpenStack Platform 17.1 
ovn-controller, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, release=1766032510, version=17.1.13, url=https://www.redhat.com, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, distribution-scope=public)
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.timer: Deactivated successfully.
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed to open /run/systemd/transient/e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: No such file or directory
Feb 01 09:00:36 np0005604215.localdomain systemd[1]: tmp-crun.Y3MMKR.mount: Deactivated successfully.
Feb 01 09:00:37 np0005604215.localdomain podman[109737]: 2026-02-01 09:00:37.053260057 +0000 UTC m=+0.209290450 container cleanup e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, url=https://www.redhat.com, architecture=x86_64, io.openshift.expose-services=, build-date=2026-01-12T22:36:40Z, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, cpe=cpe:/a:redhat:openstack:17.1::el9, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, description=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, config_id=tripleo_step4, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, distribution-scope=public, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, release=1766032510, tcib_managed=true, 
vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, name=rhosp-rhel9/openstack-ovn-controller, com.redhat.component=openstack-ovn-controller-container, batch=17.1_20260112.1, container_name=ovn_controller)
Feb 01 09:00:37 np0005604215.localdomain podman[109737]: ovn_controller
Feb 01 09:00:37 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.timer: Failed to open /run/systemd/transient/e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.timer: No such file or directory
Feb 01 09:00:37 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed to open /run/systemd/transient/e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: No such file or directory
Feb 01 09:00:37 np0005604215.localdomain podman[109751]: 2026-02-01 09:00:37.066438488 +0000 UTC m=+0.150806276 container cleanup e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, version=17.1.13, url=https://www.redhat.com, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 ovn-controller, vendor=Red Hat, Inc., distribution-scope=public, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, summary=Red Hat OpenStack Platform 17.1 ovn-controller, org.opencontainers.image.created=2026-01-12T22:36:40Z, config_id=tripleo_step4, build-date=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.expose-services=, com.redhat.component=openstack-ovn-controller-container, vcs-type=git, io.buildah.version=1.41.5, maintainer=OpenStack TripleO Team, tcib_managed=true, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, io.k8s.description=Red Hat OpenStack 
Platform 17.1 ovn-controller, release=1766032510, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, managed_by=tripleo_ansible, batch=17.1_20260112.1, container_name=ovn_controller, cpe=cpe:/a:redhat:openstack:17.1::el9, architecture=x86_64)
Feb 01 09:00:37 np0005604215.localdomain systemd[1]: libpod-conmon-e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.scope: Deactivated successfully.
Feb 01 09:00:37 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.timer: Failed to open /run/systemd/transient/e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.timer: No such file or directory
Feb 01 09:00:37 np0005604215.localdomain systemd[1]: e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: Failed to open /run/systemd/transient/e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257.service: No such file or directory
Feb 01 09:00:37 np0005604215.localdomain podman[109764]: 2026-02-01 09:00:37.167089319 +0000 UTC m=+0.065151974 container cleanup e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://www.redhat.com, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 ovn-controller, vcs-type=git, description=Red Hat OpenStack Platform 17.1 ovn-controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, cpe=cpe:/a:redhat:openstack:17.1::el9, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, com.redhat.component=openstack-ovn-controller-container, org.opencontainers.image.created=2026-01-12T22:36:40Z, name=rhosp-rhel9/openstack-ovn-controller, managed_by=tripleo_ansible, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, io.openshift.expose-services=, release=1766032510, vendor=Red Hat, Inc., io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, maintainer=OpenStack TripleO Team, version=17.1.13, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', 
'/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, container_name=ovn_controller, architecture=x86_64, io.buildah.version=1.41.5)
Feb 01 09:00:37 np0005604215.localdomain podman[109764]: ovn_controller
Feb 01 09:00:37 np0005604215.localdomain systemd[1]: tripleo_ovn_controller.service: Deactivated successfully.
Feb 01 09:00:37 np0005604215.localdomain systemd[1]: Stopped ovn_controller container.
Feb 01 09:00:37 np0005604215.localdomain sudo[109694]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:37 np0005604215.localdomain sudo[109865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-alowuoyexsjxzugpwuoqxvoitquamzuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936437.3388438-111-141328720793111/AnsiballZ_systemd_service.py
Feb 01 09:00:37 np0005604215.localdomain sudo[109865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:00:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ae6e92d81edd57130eba0dea91809d1be824b840176ebe669287b6264f5d2d37-merged.mount: Deactivated successfully.
Feb 01 09:00:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257-userdata-shm.mount: Deactivated successfully.
Feb 01 09:00:37 np0005604215.localdomain python3.9[109867]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ovn_metadata_agent.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:00:39 np0005604215.localdomain systemd-sysv-generator[109893]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:00:39 np0005604215.localdomain systemd-rc-local-generator[109890]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: Stopping ovn_metadata_agent container...
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: tmp-crun.A0W9P4.mount: Deactivated successfully.
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: libpod-e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.scope: Deactivated successfully.
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: libpod-e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.scope: Consumed 9.254s CPU time.
Feb 01 09:00:39 np0005604215.localdomain podman[109907]: 2026-02-01 09:00:39.558887478 +0000 UTC m=+0.244774057 container stop e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, url=https://www.redhat.com, managed_by=tripleo_ansible, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.buildah.version=1.41.5, 
org.opencontainers.image.created=2026-01-12T22:56:19Z, config_id=tripleo_step4, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, release=1766032510, io.openshift.expose-services=, distribution-scope=public, batch=17.1_20260112.1, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:openstack:17.1::el9, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=17.1.13, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, build-date=2026-01-12T22:56:19Z, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, tcib_managed=true, konflux.additional-tags=17.1.13 17.1_20260112.1, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f)
Feb 01 09:00:39 np0005604215.localdomain podman[109907]: 2026-02-01 09:00:39.588363297 +0000 UTC m=+0.274249886 container died e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, distribution-scope=public, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, managed_by=tripleo_ansible, version=17.1.13, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, architecture=x86_64, url=https://www.redhat.com, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, config_id=tripleo_step4, maintainer=OpenStack TripleO Team, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.openshift.expose-services=, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, build-date=2026-01-12T22:56:19Z, tcib_managed=true, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.timer: Deactivated successfully.
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: Stopped /usr/bin/podman healthcheck run e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed to open /run/systemd/transient/e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: No such file or directory
Feb 01 09:00:39 np0005604215.localdomain podman[109907]: 2026-02-01 09:00:39.706763781 +0000 UTC m=+0.392650370 container cleanup e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:56:19Z, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.openshift.expose-services=, managed_by=tripleo_ansible, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', 
'/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vcs-type=git, release=1766032510, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, vendor=Red Hat, Inc., build-date=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, container_name=ovn_metadata_agent, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, maintainer=OpenStack TripleO Team, url=https://www.redhat.com, distribution-scope=public, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 09:00:39 np0005604215.localdomain podman[109907]: ovn_metadata_agent
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.timer: Failed to open /run/systemd/transient/e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.timer: No such file or directory
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed to open /run/systemd/transient/e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: No such file or directory
Feb 01 09:00:39 np0005604215.localdomain podman[109920]: 2026-02-01 09:00:39.731445321 +0000 UTC m=+0.156674359 container cleanup e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, build-date=2026-01-12T22:56:19Z, config_id=tripleo_step4, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, release=1766032510, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vcs-type=git, batch=17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, vendor=Red Hat, Inc., distribution-scope=public, url=https://www.redhat.com, org.opencontainers.image.created=2026-01-12T22:56:19Z, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, tcib_managed=true, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', 
'/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.expose-services=, architecture=x86_64, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, version=17.1.13, maintainer=OpenStack TripleO Team, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1, managed_by=tripleo_ansible)
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: libpod-conmon-e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.scope: Deactivated successfully.
Feb 01 09:00:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41272 DF PROTO=TCP SPT=36132 DPT=9100 SEQ=2460966367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6480D4D0000000001030307) 
Feb 01 09:00:39 np0005604215.localdomain podman[109950]: error opening file `/run/crun/e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06/status`: No such file or directory
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.timer: Failed to open /run/systemd/transient/e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.timer: No such file or directory
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: Failed to open /run/systemd/transient/e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06.service: No such file or directory
Feb 01 09:00:39 np0005604215.localdomain podman[109937]: 2026-02-01 09:00:39.842154776 +0000 UTC m=+0.074536897 container cleanup e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, release=1766032510, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, build-date=2026-01-12T22:56:19Z, tcib_managed=true, architecture=x86_64, container_name=ovn_metadata_agent, 
io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=tripleo_ansible, cpe=cpe:/a:redhat:openstack:17.1::el9, config_id=tripleo_step4, version=17.1.13, io.buildah.version=1.41.5, vcs-type=git, url=https://www.redhat.com, batch=17.1_20260112.1, maintainer=OpenStack TripleO Team, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, org.opencontainers.image.created=2026-01-12T22:56:19Z, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 09:00:39 np0005604215.localdomain podman[109937]: ovn_metadata_agent
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: tripleo_ovn_metadata_agent.service: Deactivated successfully.
Feb 01 09:00:39 np0005604215.localdomain systemd[1]: Stopped ovn_metadata_agent container.
Feb 01 09:00:39 np0005604215.localdomain sudo[109865]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:40 np0005604215.localdomain sudo[110041]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zsgdzpvkknwffumklmagdsuxpahhnprz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936440.0171888-111-93357069195028/AnsiballZ_systemd_service.py
Feb 01 09:00:40 np0005604215.localdomain sudo[110041]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:00:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-d506918155a93476a6405c9e2c98cb06d7e575d23557b96e2d10a36860f0cb4c-merged.mount: Deactivated successfully.
Feb 01 09:00:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06-userdata-shm.mount: Deactivated successfully.
Feb 01 09:00:40 np0005604215.localdomain python3.9[110043]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_rsyslog.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:00:40 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:00:40 np0005604215.localdomain systemd-sysv-generator[110073]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:00:40 np0005604215.localdomain systemd-rc-local-generator[110067]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:00:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:00:40 np0005604215.localdomain sudo[110041]: pam_unix(sudo:session): session closed for user root
Feb 01 09:00:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62776 DF PROTO=TCP SPT=49370 DPT=9101 SEQ=1282612434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6481A4D0000000001030307) 
Feb 01 09:00:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54946 DF PROTO=TCP SPT=41284 DPT=9882 SEQ=3822266166 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648230D0000000001030307) 
Feb 01 09:00:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:00:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 5433 writes, 23K keys, 5433 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5433 writes, 751 syncs, 7.23 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 09:00:46 np0005604215.localdomain sshd[110096]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:00:46 np0005604215.localdomain sshd[110096]: error: kex_exchange_identification: Connection closed by remote host
Feb 01 09:00:46 np0005604215.localdomain sshd[110096]: Connection closed by 178.128.245.186 port 40128
Feb 01 09:00:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1261 DF PROTO=TCP SPT=43364 DPT=9102 SEQ=3186521073 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648310D0000000001030307) 
Feb 01 09:00:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:00:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 4800.1 total, 600.0 interval
                                                          Cumulative writes: 5223 writes, 23K keys, 5223 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5223 writes, 658 syncs, 7.94 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 09:00:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41274 DF PROTO=TCP SPT=36132 DPT=9100 SEQ=2460966367 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6483D0E0000000001030307) 
Feb 01 09:00:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62778 DF PROTO=TCP SPT=49370 DPT=9101 SEQ=1282612434 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6484B0E0000000001030307) 
Feb 01 09:01:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51375 DF PROTO=TCP SPT=56472 DPT=9882 SEQ=2463887314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6485C4D0000000001030307) 
Feb 01 09:01:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51376 DF PROTO=TCP SPT=56472 DPT=9882 SEQ=2463887314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648604E0000000001030307) 
Feb 01 09:01:01 np0005604215.localdomain CROND[110098]: (root) CMD (run-parts /etc/cron.hourly)
Feb 01 09:01:01 np0005604215.localdomain run-parts[110101]: (/etc/cron.hourly) starting 0anacron
Feb 01 09:01:01 np0005604215.localdomain run-parts[110107]: (/etc/cron.hourly) finished 0anacron
Feb 01 09:01:01 np0005604215.localdomain CROND[110097]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 01 09:01:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51377 DF PROTO=TCP SPT=56472 DPT=9882 SEQ=2463887314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648684D0000000001030307) 
Feb 01 09:01:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1422 DF PROTO=TCP SPT=54214 DPT=9102 SEQ=492321943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64875CD0000000001030307) 
Feb 01 09:01:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2188 DF PROTO=TCP SPT=51398 DPT=9100 SEQ=3609510058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648824D0000000001030307) 
Feb 01 09:01:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11327 DF PROTO=TCP SPT=35098 DPT=9101 SEQ=1865113286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6488F8E0000000001030307) 
Feb 01 09:01:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51379 DF PROTO=TCP SPT=56472 DPT=9882 SEQ=2463887314 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648990D0000000001030307) 
Feb 01 09:01:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1424 DF PROTO=TCP SPT=54214 DPT=9102 SEQ=492321943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648A50D0000000001030307) 
Feb 01 09:01:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2190 DF PROTO=TCP SPT=51398 DPT=9100 SEQ=3609510058 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648B30D0000000001030307) 
Feb 01 09:01:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11329 DF PROTO=TCP SPT=35098 DPT=9101 SEQ=1865113286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648BF0D0000000001030307) 
Feb 01 09:01:25 np0005604215.localdomain sudo[110108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:01:25 np0005604215.localdomain sudo[110108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:01:25 np0005604215.localdomain sudo[110108]: pam_unix(sudo:session): session closed for user root
Feb 01 09:01:25 np0005604215.localdomain sudo[110123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:01:25 np0005604215.localdomain sudo[110123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:01:26 np0005604215.localdomain sudo[110123]: pam_unix(sudo:session): session closed for user root
Feb 01 09:01:28 np0005604215.localdomain sshd[110171]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:01:28 np0005604215.localdomain sudo[110173]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:01:28 np0005604215.localdomain sudo[110173]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:01:28 np0005604215.localdomain sudo[110173]: pam_unix(sudo:session): session closed for user root
Feb 01 09:01:28 np0005604215.localdomain sshd[110171]: Invalid user cloudera from 85.206.171.113 port 58842
Feb 01 09:01:29 np0005604215.localdomain sshd[110171]: Received disconnect from 85.206.171.113 port 58842:11: Bye Bye [preauth]
Feb 01 09:01:29 np0005604215.localdomain sshd[110171]: Disconnected from invalid user cloudera 85.206.171.113 port 58842 [preauth]
Feb 01 09:01:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3236 DF PROTO=TCP SPT=60732 DPT=9882 SEQ=3897439414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648D17C0000000001030307) 
Feb 01 09:01:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3237 DF PROTO=TCP SPT=60732 DPT=9882 SEQ=3897439414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648D58E0000000001030307) 
Feb 01 09:01:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3238 DF PROTO=TCP SPT=60732 DPT=9882 SEQ=3897439414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648DD8D0000000001030307) 
Feb 01 09:01:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24227 DF PROTO=TCP SPT=53696 DPT=9102 SEQ=853338594 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648EB0D0000000001030307) 
Feb 01 09:01:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10533 DF PROTO=TCP SPT=56770 DPT=9100 SEQ=2294429146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA648F78D0000000001030307) 
Feb 01 09:01:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16210 DF PROTO=TCP SPT=53240 DPT=9101 SEQ=1019892307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64904CD0000000001030307) 
Feb 01 09:01:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3240 DF PROTO=TCP SPT=60732 DPT=9882 SEQ=3897439414 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6490D0D0000000001030307) 
Feb 01 09:01:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24229 DF PROTO=TCP SPT=53696 DPT=9102 SEQ=853338594 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6491B0D0000000001030307) 
Feb 01 09:01:51 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10535 DF PROTO=TCP SPT=56770 DPT=9100 SEQ=2294429146 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649270E0000000001030307) 
Feb 01 09:01:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16212 DF PROTO=TCP SPT=53240 DPT=9101 SEQ=1019892307 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649350D0000000001030307) 
Feb 01 09:02:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25937 DF PROTO=TCP SPT=58880 DPT=9882 SEQ=1834266595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64946AD0000000001030307) 
Feb 01 09:02:01 np0005604215.localdomain sshd[104959]: Received disconnect from 192.168.122.31 port 47500:11: disconnected by user
Feb 01 09:02:01 np0005604215.localdomain sshd[104959]: Disconnected from user zuul 192.168.122.31 port 47500
Feb 01 09:02:01 np0005604215.localdomain sshd[104956]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:02:01 np0005604215.localdomain systemd[1]: session-36.scope: Deactivated successfully.
Feb 01 09:02:01 np0005604215.localdomain systemd[1]: session-36.scope: Consumed 18.168s CPU time.
Feb 01 09:02:01 np0005604215.localdomain systemd-logind[761]: Session 36 logged out. Waiting for processes to exit.
Feb 01 09:02:01 np0005604215.localdomain systemd-logind[761]: Removed session 36.
Feb 01 09:02:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25938 DF PROTO=TCP SPT=58880 DPT=9882 SEQ=1834266595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6494ACE0000000001030307) 
Feb 01 09:02:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25939 DF PROTO=TCP SPT=58880 DPT=9882 SEQ=1834266595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64952CE0000000001030307) 
Feb 01 09:02:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30153 DF PROTO=TCP SPT=44568 DPT=9102 SEQ=2928275220 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649604D0000000001030307) 
Feb 01 09:02:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16626 DF PROTO=TCP SPT=48068 DPT=9100 SEQ=500505732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6496CCD0000000001030307) 
Feb 01 09:02:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12342 DF PROTO=TCP SPT=52708 DPT=9101 SEQ=2462273802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64979CD0000000001030307) 
Feb 01 09:02:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25941 DF PROTO=TCP SPT=58880 DPT=9882 SEQ=1834266595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649830E0000000001030307) 
Feb 01 09:02:19 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30155 DF PROTO=TCP SPT=44568 DPT=9102 SEQ=2928275220 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649910E0000000001030307) 
Feb 01 09:02:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16628 DF PROTO=TCP SPT=48068 DPT=9100 SEQ=500505732 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6499D0D0000000001030307) 
Feb 01 09:02:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12344 DF PROTO=TCP SPT=52708 DPT=9101 SEQ=2462273802 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649A90D0000000001030307) 
Feb 01 09:02:28 np0005604215.localdomain sudo[110189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:02:28 np0005604215.localdomain sudo[110189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:02:28 np0005604215.localdomain sudo[110189]: pam_unix(sudo:session): session closed for user root
Feb 01 09:02:28 np0005604215.localdomain sudo[110204]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:02:28 np0005604215.localdomain sudo[110204]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:02:29 np0005604215.localdomain sudo[110204]: pam_unix(sudo:session): session closed for user root
Feb 01 09:02:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25234 DF PROTO=TCP SPT=38550 DPT=9882 SEQ=290365811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649BDC50000000001030307) 
Feb 01 09:02:30 np0005604215.localdomain sudo[110251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:02:30 np0005604215.localdomain sudo[110251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:02:30 np0005604215.localdomain sudo[110251]: pam_unix(sudo:session): session closed for user root
Feb 01 09:02:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25235 DF PROTO=TCP SPT=38550 DPT=9882 SEQ=290365811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649C1CD0000000001030307) 
Feb 01 09:02:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5317 DF PROTO=TCP SPT=33866 DPT=9105 SEQ=2217420361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649C90D0000000001030307) 
Feb 01 09:02:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47192 DF PROTO=TCP SPT=55370 DPT=9102 SEQ=929202725 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649D5A60000000001030307) 
Feb 01 09:02:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41193 DF PROTO=TCP SPT=55428 DPT=9100 SEQ=1921553124 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649E20D0000000001030307) 
Feb 01 09:02:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63908 DF PROTO=TCP SPT=56862 DPT=9101 SEQ=3142567501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649EF0E0000000001030307) 
Feb 01 09:02:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25238 DF PROTO=TCP SPT=38550 DPT=9882 SEQ=290365811 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA649F90D0000000001030307) 
Feb 01 09:02:48 np0005604215.localdomain sshd[110266]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:02:48 np0005604215.localdomain sshd[110266]: Invalid user backupuser from 85.206.171.113 port 57920
Feb 01 09:02:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47194 DF PROTO=TCP SPT=55370 DPT=9102 SEQ=929202725 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A050D0000000001030307) 
Feb 01 09:02:48 np0005604215.localdomain sshd[110266]: Received disconnect from 85.206.171.113 port 57920:11: Bye Bye [preauth]
Feb 01 09:02:48 np0005604215.localdomain sshd[110266]: Disconnected from invalid user backupuser 85.206.171.113 port 57920 [preauth]
Feb 01 09:02:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41195 DF PROTO=TCP SPT=55428 DPT=9100 SEQ=1921553124 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A130D0000000001030307) 
Feb 01 09:02:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63910 DF PROTO=TCP SPT=56862 DPT=9101 SEQ=3142567501 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A1F0D0000000001030307) 
Feb 01 09:03:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27631 DF PROTO=TCP SPT=38690 DPT=9882 SEQ=102954606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A310E0000000001030307) 
Feb 01 09:03:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27632 DF PROTO=TCP SPT=38690 DPT=9882 SEQ=102954606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A350E0000000001030307) 
Feb 01 09:03:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27633 DF PROTO=TCP SPT=38690 DPT=9882 SEQ=102954606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A3D0D0000000001030307) 
Feb 01 09:03:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39714 DF PROTO=TCP SPT=53930 DPT=9102 SEQ=2356406653 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A4A8E0000000001030307) 
Feb 01 09:03:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23682 DF PROTO=TCP SPT=50332 DPT=9100 SEQ=2193449943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A570D0000000001030307) 
Feb 01 09:03:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42105 DF PROTO=TCP SPT=34016 DPT=9101 SEQ=2965222399 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A644E0000000001030307) 
Feb 01 09:03:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27635 DF PROTO=TCP SPT=38690 DPT=9882 SEQ=102954606 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A6D0D0000000001030307) 
Feb 01 09:03:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39716 DF PROTO=TCP SPT=53930 DPT=9102 SEQ=2356406653 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A7B0D0000000001030307) 
Feb 01 09:03:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23684 DF PROTO=TCP SPT=50332 DPT=9100 SEQ=2193449943 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A870D0000000001030307) 
Feb 01 09:03:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42107 DF PROTO=TCP SPT=34016 DPT=9101 SEQ=2965222399 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64A950D0000000001030307) 
Feb 01 09:03:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23711 DF PROTO=TCP SPT=60510 DPT=9882 SEQ=4061105213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64AA63C0000000001030307) 
Feb 01 09:03:30 np0005604215.localdomain sudo[110268]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:03:30 np0005604215.localdomain sudo[110268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:03:30 np0005604215.localdomain sudo[110268]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:30 np0005604215.localdomain sudo[110283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 09:03:30 np0005604215.localdomain sudo[110283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:03:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23712 DF PROTO=TCP SPT=60510 DPT=9882 SEQ=4061105213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64AAA4D0000000001030307) 
Feb 01 09:03:31 np0005604215.localdomain sshd[110310]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:03:31 np0005604215.localdomain sudo[110283]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:31 np0005604215.localdomain sshd[110310]: Accepted publickey for zuul from 192.168.122.31 port 50888 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:03:31 np0005604215.localdomain systemd-logind[761]: New session 37 of user zuul.
Feb 01 09:03:31 np0005604215.localdomain systemd[1]: Started Session 37 of User zuul.
Feb 01 09:03:31 np0005604215.localdomain sshd[110310]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:03:31 np0005604215.localdomain sudo[110321]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:03:31 np0005604215.localdomain sudo[110321]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:03:31 np0005604215.localdomain sudo[110321]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:31 np0005604215.localdomain sudo[110350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:03:31 np0005604215.localdomain sudo[110350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:03:31 np0005604215.localdomain sudo[110427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqzbbphkoytmwyffxuplvvjbtkztiett ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936611.3813336-560-21395327344926/AnsiballZ_file.py
Feb 01 09:03:31 np0005604215.localdomain sudo[110427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:31 np0005604215.localdomain python3.9[110429]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:31 np0005604215.localdomain sudo[110427]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:31 np0005604215.localdomain sudo[110350]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:32 np0005604215.localdomain sudo[110549]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cgmblrfyxeihonygffbfuhkgfpzhtyrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936611.9634037-560-261950658667161/AnsiballZ_file.py
Feb 01 09:03:32 np0005604215.localdomain sudo[110549]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:32 np0005604215.localdomain python3.9[110551]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:32 np0005604215.localdomain sudo[110549]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:32 np0005604215.localdomain sudo[110617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:03:32 np0005604215.localdomain sudo[110617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:03:32 np0005604215.localdomain sudo[110617]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:32 np0005604215.localdomain sudo[110656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwlgmbolyjrzqyymnfuflwyhvezrrnjg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936612.5297735-560-66042992403096/AnsiballZ_file.py
Feb 01 09:03:32 np0005604215.localdomain sudo[110656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:32 np0005604215.localdomain python3.9[110658]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:32 np0005604215.localdomain sudo[110656]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23713 DF PROTO=TCP SPT=60510 DPT=9882 SEQ=4061105213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64AB24D0000000001030307) 
Feb 01 09:03:33 np0005604215.localdomain sudo[110748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oecunzpqttuhhtgvwdhnmnysgenklfdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936613.0818427-560-89189416529072/AnsiballZ_file.py
Feb 01 09:03:33 np0005604215.localdomain sudo[110748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:33 np0005604215.localdomain python3.9[110750]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:33 np0005604215.localdomain sudo[110748]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:33 np0005604215.localdomain sudo[110840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ppbcozysywvnyjrmpiqwztsbpnyxjwvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936613.6226115-560-48265650409211/AnsiballZ_file.py
Feb 01 09:03:33 np0005604215.localdomain sudo[110840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:34 np0005604215.localdomain python3.9[110842]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:34 np0005604215.localdomain sudo[110840]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:34 np0005604215.localdomain sudo[110932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fgezchkpujcifnuigwdeorqebhobttfv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936614.2342048-560-126917044854159/AnsiballZ_file.py
Feb 01 09:03:34 np0005604215.localdomain sudo[110932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:34 np0005604215.localdomain python3.9[110934]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:34 np0005604215.localdomain sudo[110932]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:35 np0005604215.localdomain sudo[111024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-irkruirbiyyziqiclldxqlnmzfztkiql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936614.828637-560-143218848890133/AnsiballZ_file.py
Feb 01 09:03:35 np0005604215.localdomain sudo[111024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:35 np0005604215.localdomain python3.9[111026]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:35 np0005604215.localdomain sudo[111024]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:35 np0005604215.localdomain sudo[111116]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjnvwkdoslkdgdmppnuiuyoirlonbaxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936615.421903-560-34183837642474/AnsiballZ_file.py
Feb 01 09:03:35 np0005604215.localdomain sudo[111116]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:35 np0005604215.localdomain python3.9[111118]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:35 np0005604215.localdomain sudo[111116]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:36 np0005604215.localdomain sudo[111208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hmyyufzdcuwnlhoaiphuedjlmdkcbdmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936616.0032926-560-190154137234669/AnsiballZ_file.py
Feb 01 09:03:36 np0005604215.localdomain sudo[111208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:36 np0005604215.localdomain python3.9[111210]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:36 np0005604215.localdomain sudo[111208]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36929 DF PROTO=TCP SPT=44138 DPT=9102 SEQ=283164502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64ABFCD0000000001030307) 
Feb 01 09:03:36 np0005604215.localdomain sudo[111300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyjkvccgpcnkeisdqxfrjavyyjhakhvq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936616.5777228-560-268282076980374/AnsiballZ_file.py
Feb 01 09:03:36 np0005604215.localdomain sudo[111300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:37 np0005604215.localdomain python3.9[111302]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:37 np0005604215.localdomain sudo[111300]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:37 np0005604215.localdomain sudo[111392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kevpnnpbwzgbovdyqtryrtickwtcbuhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936617.1730988-560-108483341246324/AnsiballZ_file.py
Feb 01 09:03:37 np0005604215.localdomain sudo[111392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:37 np0005604215.localdomain python3.9[111394]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:37 np0005604215.localdomain sudo[111392]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:37 np0005604215.localdomain sudo[111484]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxdvytrwqabbuhbmvjjyhibwqplzxqrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936617.7500346-560-31129215456888/AnsiballZ_file.py
Feb 01 09:03:37 np0005604215.localdomain sudo[111484]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:38 np0005604215.localdomain python3.9[111486]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:38 np0005604215.localdomain sudo[111484]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:38 np0005604215.localdomain sudo[111576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvibkoalrsronmekmecuyvnmwjayusax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936618.432842-560-100146840357715/AnsiballZ_file.py
Feb 01 09:03:38 np0005604215.localdomain sudo[111576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:38 np0005604215.localdomain python3.9[111578]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:38 np0005604215.localdomain sudo[111576]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:39 np0005604215.localdomain sudo[111668]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfeiakixvjxexmwbsysnmhrrhbceoepy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936619.007927-560-140831735017476/AnsiballZ_file.py
Feb 01 09:03:39 np0005604215.localdomain sudo[111668]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:39 np0005604215.localdomain python3.9[111670]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:39 np0005604215.localdomain sudo[111668]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38530 DF PROTO=TCP SPT=49588 DPT=9100 SEQ=2192356510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64ACC4D0000000001030307) 
Feb 01 09:03:39 np0005604215.localdomain sudo[111760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kilcfhyfodavayjlwcdocuznhfwtmgif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936619.602568-560-186715409747375/AnsiballZ_file.py
Feb 01 09:03:39 np0005604215.localdomain sudo[111760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:40 np0005604215.localdomain python3.9[111762]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:40 np0005604215.localdomain sudo[111760]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:40 np0005604215.localdomain sudo[111852]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bxwplkcdgkmryfpkjdouppdbgzyhabav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936620.1480575-560-29950346911692/AnsiballZ_file.py
Feb 01 09:03:40 np0005604215.localdomain sudo[111852]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:40 np0005604215.localdomain python3.9[111854]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:40 np0005604215.localdomain sudo[111852]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:40 np0005604215.localdomain sudo[111944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xiocozrkccjygtxuolsvkfhrtyazgese ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936620.6845045-560-134989395489545/AnsiballZ_file.py
Feb 01 09:03:40 np0005604215.localdomain sudo[111944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:41 np0005604215.localdomain python3.9[111946]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:41 np0005604215.localdomain sudo[111944]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:41 np0005604215.localdomain sudo[112036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqkfjrxtdrevjwyfgzquhnxpwvuckrhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936621.2586253-560-172895931562221/AnsiballZ_file.py
Feb 01 09:03:41 np0005604215.localdomain sudo[112036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:41 np0005604215.localdomain python3.9[112038]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:41 np0005604215.localdomain sudo[112036]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:42 np0005604215.localdomain sudo[112128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhlylkhpuiazbtpesnowzvuwegdpucdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936621.784387-560-238437144629379/AnsiballZ_file.py
Feb 01 09:03:42 np0005604215.localdomain sudo[112128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:42 np0005604215.localdomain python3.9[112130]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:42 np0005604215.localdomain sudo[112128]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:42 np0005604215.localdomain sudo[112220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czntzfwiubzzxduaopqgokocskmuwlyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936622.330114-560-139546178538614/AnsiballZ_file.py
Feb 01 09:03:42 np0005604215.localdomain sudo[112220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:42 np0005604215.localdomain python3.9[112222]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:42 np0005604215.localdomain sudo[112220]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:43 np0005604215.localdomain sudo[112312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ieahlbjtuciohjthbxzrliohejwrwlfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936622.8967347-560-228707797855898/AnsiballZ_file.py
Feb 01 09:03:43 np0005604215.localdomain sudo[112312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7876 DF PROTO=TCP SPT=59982 DPT=9101 SEQ=769046800 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64AD98D0000000001030307) 
Feb 01 09:03:43 np0005604215.localdomain python3.9[112314]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:43 np0005604215.localdomain sudo[112312]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:44 np0005604215.localdomain sudo[112404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhbubvooritxgkebkssomayroajtodvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936624.0658596-1010-207542741283931/AnsiballZ_file.py
Feb 01 09:03:44 np0005604215.localdomain sudo[112404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:44 np0005604215.localdomain python3.9[112406]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:44 np0005604215.localdomain sudo[112404]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:44 np0005604215.localdomain sudo[112496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jklgwdrkcyecbehkilpuenbtfqfzyhmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936624.6773899-1010-97784114215090/AnsiballZ_file.py
Feb 01 09:03:44 np0005604215.localdomain sudo[112496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:45 np0005604215.localdomain python3.9[112498]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_ipmi.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:45 np0005604215.localdomain sudo[112496]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:45 np0005604215.localdomain sudo[112588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aujcaribujfgqshqydwgvegzeiiswhtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936625.229927-1010-95948659805606/AnsiballZ_file.py
Feb 01 09:03:45 np0005604215.localdomain sudo[112588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23715 DF PROTO=TCP SPT=60510 DPT=9882 SEQ=4061105213 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64AE30D0000000001030307) 
Feb 01 09:03:45 np0005604215.localdomain python3.9[112590]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_collectd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:45 np0005604215.localdomain sudo[112588]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:46 np0005604215.localdomain sudo[112680]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xpcpxlbrjfjgqendbxowqrftvidwzhea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936625.7783613-1010-277616673389256/AnsiballZ_file.py
Feb 01 09:03:46 np0005604215.localdomain sudo[112680]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:46 np0005604215.localdomain python3.9[112682]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_iscsid.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:46 np0005604215.localdomain sudo[112680]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:46 np0005604215.localdomain sudo[112772]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hbnymlkqnhbaeiyiadhosheitcexjvef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936626.3320706-1010-164865780787043/AnsiballZ_file.py
Feb 01 09:03:46 np0005604215.localdomain sudo[112772]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:46 np0005604215.localdomain python3.9[112774]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_logrotate_crond.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:46 np0005604215.localdomain sudo[112772]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:47 np0005604215.localdomain sudo[112864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqppccdhjscdddiiwoqbtnzmcbhmcvky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936626.953788-1010-238877890428099/AnsiballZ_file.py
Feb 01 09:03:47 np0005604215.localdomain sudo[112864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:47 np0005604215.localdomain python3.9[112866]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_metrics_qdr.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:47 np0005604215.localdomain sudo[112864]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:47 np0005604215.localdomain sudo[112956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qlkpavzvroiayhnekmjvxxczaxdsiohu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936627.6084762-1010-65932026768041/AnsiballZ_file.py
Feb 01 09:03:47 np0005604215.localdomain sudo[112956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:48 np0005604215.localdomain python3.9[112958]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_dhcp.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:48 np0005604215.localdomain sudo[112956]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:48 np0005604215.localdomain sudo[113048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-facrbkytttmyxxsusqaewlnbggffehcd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936628.231548-1010-53780193656536/AnsiballZ_file.py
Feb 01 09:03:48 np0005604215.localdomain sudo[113048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36931 DF PROTO=TCP SPT=44138 DPT=9102 SEQ=283164502 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64AEF0D0000000001030307) 
Feb 01 09:03:48 np0005604215.localdomain python3.9[113050]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_l3_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:48 np0005604215.localdomain sudo[113048]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:49 np0005604215.localdomain sudo[113140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-moiakuyeqixlnrbldipddicsypihespp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936628.8112352-1010-98232367077072/AnsiballZ_file.py
Feb 01 09:03:49 np0005604215.localdomain sudo[113140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:49 np0005604215.localdomain python3.9[113142]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_neutron_ovs_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:49 np0005604215.localdomain sudo[113140]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:49 np0005604215.localdomain sudo[113232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xntiszqjgxqibkjxwbgvsdjogqtezddb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936629.4124513-1010-58003539113219/AnsiballZ_file.py
Feb 01 09:03:49 np0005604215.localdomain sudo[113232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:49 np0005604215.localdomain python3.9[113234]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:49 np0005604215.localdomain sudo[113232]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:50 np0005604215.localdomain sudo[113324]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rkaasrcufjqejexleemnfjdliwpjvcjd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936630.0360465-1010-13085548888781/AnsiballZ_file.py
Feb 01 09:03:50 np0005604215.localdomain sudo[113324]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:50 np0005604215.localdomain python3.9[113326]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:50 np0005604215.localdomain sudo[113324]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:50 np0005604215.localdomain sudo[113416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vipokamgtrlstsrweclwgyjhlobwbrqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936630.5604484-1010-240275383967062/AnsiballZ_file.py
Feb 01 09:03:50 np0005604215.localdomain sudo[113416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:51 np0005604215.localdomain python3.9[113418]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:51 np0005604215.localdomain sudo[113416]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:51 np0005604215.localdomain sudo[113508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opfyprlqjpwhdqtkwsmaedsiaopooyci ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936631.1584709-1010-106095534712027/AnsiballZ_file.py
Feb 01 09:03:51 np0005604215.localdomain sudo[113508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:51 np0005604215.localdomain python3.9[113510]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:51 np0005604215.localdomain sudo[113508]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:52 np0005604215.localdomain sudo[113600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxwmdjbfrcldebavusnupquaskxmkijx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936631.7603621-1010-218771564641252/AnsiballZ_file.py
Feb 01 09:03:52 np0005604215.localdomain sudo[113600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:52 np0005604215.localdomain python3.9[113602]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38532 DF PROTO=TCP SPT=49588 DPT=9100 SEQ=2192356510 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64AFD0E0000000001030307) 
Feb 01 09:03:52 np0005604215.localdomain sudo[113600]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:52 np0005604215.localdomain sudo[113692]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqbarskhpvuanvpvrxxeauvglrshsrze ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936632.3667433-1010-72069793321588/AnsiballZ_file.py
Feb 01 09:03:52 np0005604215.localdomain sudo[113692]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:52 np0005604215.localdomain python3.9[113694]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:52 np0005604215.localdomain sudo[113692]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:53 np0005604215.localdomain sudo[113784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-esidgpcgegehnvpwyyjeuhuisbpzhzjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936632.939608-1010-86045745069894/AnsiballZ_file.py
Feb 01 09:03:53 np0005604215.localdomain sudo[113784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:53 np0005604215.localdomain python3.9[113786]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud_recover.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:53 np0005604215.localdomain sudo[113784]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:53 np0005604215.localdomain sudo[113876]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvtflzvnlsukxrtlmmtfgmxltfjjilxh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936633.5290782-1010-11283956272785/AnsiballZ_file.py
Feb 01 09:03:53 np0005604215.localdomain sudo[113876]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:54 np0005604215.localdomain python3.9[113878]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:54 np0005604215.localdomain sudo[113876]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:54 np0005604215.localdomain sudo[113968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gubirellgclxzzjafigrmtajbbxkvcqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936634.1519904-1010-70263600379316/AnsiballZ_file.py
Feb 01 09:03:54 np0005604215.localdomain sudo[113968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:54 np0005604215.localdomain python3.9[113970]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:54 np0005604215.localdomain sudo[113968]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:55 np0005604215.localdomain sudo[114060]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcxhbdmquhyddoimtrlwrjgsezsjpptt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936634.8415375-1010-53373499788840/AnsiballZ_file.py
Feb 01 09:03:55 np0005604215.localdomain sudo[114060]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7878 DF PROTO=TCP SPT=59982 DPT=9101 SEQ=769046800 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B090D0000000001030307) 
Feb 01 09:03:55 np0005604215.localdomain python3.9[114062]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_controller.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:55 np0005604215.localdomain sudo[114060]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:55 np0005604215.localdomain sudo[114152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hepckqgjuxohnkawbhxwhexcpbpjxrko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936635.4514053-1010-36922150379600/AnsiballZ_file.py
Feb 01 09:03:55 np0005604215.localdomain sudo[114152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:55 np0005604215.localdomain python3.9[114154]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ovn_metadata_agent.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:55 np0005604215.localdomain sudo[114152]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:56 np0005604215.localdomain sudo[114244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtjkjhheagyxhilzecntldvnvjwhcwnd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936636.1461945-1010-64007982650464/AnsiballZ_file.py
Feb 01 09:03:56 np0005604215.localdomain sudo[114244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:56 np0005604215.localdomain python3.9[114246]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_rsyslog.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:03:56 np0005604215.localdomain sudo[114244]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:57 np0005604215.localdomain sudo[114336]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ghxtzyzaqwwebzknxilclasgnobzsqee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936637.305208-1458-275915901992117/AnsiballZ_command.py
Feb 01 09:03:57 np0005604215.localdomain sudo[114336]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:57 np0005604215.localdomain python3.9[114338]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:03:57 np0005604215.localdomain sudo[114336]: pam_unix(sudo:session): session closed for user root
Feb 01 09:03:58 np0005604215.localdomain python3.9[114430]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 01 09:03:59 np0005604215.localdomain sudo[114520]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptezqqgsjptcquxtulyiswvgtuznzhpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936639.159289-1512-260404653917665/AnsiballZ_systemd_service.py
Feb 01 09:03:59 np0005604215.localdomain sudo[114520]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:03:59 np0005604215.localdomain python3.9[114522]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:03:59 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:03:59 np0005604215.localdomain systemd-rc-local-generator[114546]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:03:59 np0005604215.localdomain systemd-sysv-generator[114551]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:03:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:04:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12253 DF PROTO=TCP SPT=38286 DPT=9882 SEQ=3581918519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B1B6D0000000001030307) 
Feb 01 09:04:00 np0005604215.localdomain sudo[114520]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:00 np0005604215.localdomain sudo[114647]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evhhjxiscjpabeemwqyznrvjjjddallo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936640.286377-1536-200023686748411/AnsiballZ_command.py
Feb 01 09:04:00 np0005604215.localdomain sudo[114647]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:00 np0005604215.localdomain python3.9[114649]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:00 np0005604215.localdomain sudo[114647]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12254 DF PROTO=TCP SPT=38286 DPT=9882 SEQ=3581918519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B1F8E0000000001030307) 
Feb 01 09:04:01 np0005604215.localdomain sudo[114740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gyrqrcxmnlebmxzrueiupascunnwkmsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936640.8918686-1536-81405835155269/AnsiballZ_command.py
Feb 01 09:04:01 np0005604215.localdomain sudo[114740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:01 np0005604215.localdomain python3.9[114742]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_ipmi.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:01 np0005604215.localdomain sudo[114740]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:01 np0005604215.localdomain sudo[114833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bgslftccteagwtwsllonyucezqizzrqk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936641.4720986-1536-7524639736633/AnsiballZ_command.py
Feb 01 09:04:01 np0005604215.localdomain sudo[114833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:01 np0005604215.localdomain python3.9[114835]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_collectd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:02 np0005604215.localdomain sudo[114833]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:02 np0005604215.localdomain sudo[114926]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clrdnryepjqxqtepfeirvonbtwelrmgd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936642.0966113-1536-263718715538270/AnsiballZ_command.py
Feb 01 09:04:02 np0005604215.localdomain sudo[114926]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:02 np0005604215.localdomain python3.9[114928]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_iscsid.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:02 np0005604215.localdomain sudo[114926]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:02 np0005604215.localdomain sudo[115019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofzyinsehiexiqcfogluezutferumbkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936642.6219096-1536-98172845961428/AnsiballZ_command.py
Feb 01 09:04:02 np0005604215.localdomain sudo[115019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:03 np0005604215.localdomain python3.9[115021]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_logrotate_crond.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:03 np0005604215.localdomain sudo[115019]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12255 DF PROTO=TCP SPT=38286 DPT=9882 SEQ=3581918519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B278E0000000001030307) 
Feb 01 09:04:03 np0005604215.localdomain sudo[115112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocojkkdiittwtxclvubrcijlydxvcabp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936643.143975-1536-81012733279964/AnsiballZ_command.py
Feb 01 09:04:03 np0005604215.localdomain sudo[115112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:03 np0005604215.localdomain python3.9[115114]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_metrics_qdr.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:03 np0005604215.localdomain sudo[115112]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:03 np0005604215.localdomain sudo[115205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpsnonxnpzfsedayabwuhaoglxnuzvbr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936643.688394-1536-32261926872007/AnsiballZ_command.py
Feb 01 09:04:03 np0005604215.localdomain sudo[115205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:04 np0005604215.localdomain python3.9[115207]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_dhcp.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:04 np0005604215.localdomain sudo[115205]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:04 np0005604215.localdomain sudo[115298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kkzfursmrbaousykaqpkzrhcymgpgbee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936644.3020298-1536-269807969984182/AnsiballZ_command.py
Feb 01 09:04:04 np0005604215.localdomain sudo[115298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:04 np0005604215.localdomain python3.9[115300]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_l3_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:04 np0005604215.localdomain sudo[115298]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:05 np0005604215.localdomain sshd[115391]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:04:05 np0005604215.localdomain sudo[115392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvbypvrddlavnwpetgtvtlcczeckmjlc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936644.8911529-1536-21789044723544/AnsiballZ_command.py
Feb 01 09:04:05 np0005604215.localdomain sudo[115392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:05 np0005604215.localdomain python3.9[115395]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_neutron_ovs_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:05 np0005604215.localdomain sudo[115392]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:05 np0005604215.localdomain sudo[115486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-toicjmyhhiaavsvtaizwblokxgzksmxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936645.472629-1536-76316710336823/AnsiballZ_command.py
Feb 01 09:04:05 np0005604215.localdomain sudo[115486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:05 np0005604215.localdomain sshd[115391]: Invalid user lol from 85.206.171.113 port 46118
Feb 01 09:04:05 np0005604215.localdomain sshd[115391]: Received disconnect from 85.206.171.113 port 46118:11: Bye Bye [preauth]
Feb 01 09:04:05 np0005604215.localdomain sshd[115391]: Disconnected from invalid user lol 85.206.171.113 port 46118 [preauth]
Feb 01 09:04:05 np0005604215.localdomain python3.9[115488]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:05 np0005604215.localdomain sudo[115486]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:06 np0005604215.localdomain sudo[115579]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etxnaovcwidvwyspmpsoqaofvpilfffe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936646.1131608-1536-165321899028831/AnsiballZ_command.py
Feb 01 09:04:06 np0005604215.localdomain sudo[115579]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42072 DF PROTO=TCP SPT=36072 DPT=9102 SEQ=2451486445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B350E0000000001030307) 
Feb 01 09:04:06 np0005604215.localdomain python3.9[115581]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:06 np0005604215.localdomain sudo[115579]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:07 np0005604215.localdomain sudo[115672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-buturggxmsjpfrcnzjgqjqwlnwkzyybm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936646.741382-1536-89643218494139/AnsiballZ_command.py
Feb 01 09:04:07 np0005604215.localdomain sudo[115672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:07 np0005604215.localdomain python3.9[115674]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:07 np0005604215.localdomain sudo[115672]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:07 np0005604215.localdomain sudo[115765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jshmlamrtajhqxjutjbrglntrwnzbmce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936647.3762336-1536-30324676162175/AnsiballZ_command.py
Feb 01 09:04:07 np0005604215.localdomain sudo[115765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:07 np0005604215.localdomain python3.9[115767]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:07 np0005604215.localdomain sudo[115765]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:08 np0005604215.localdomain sudo[115858]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhaibhoshahzmlqnbaeebrptuelvdyab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936648.0286088-1536-42878250245146/AnsiballZ_command.py
Feb 01 09:04:08 np0005604215.localdomain sudo[115858]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:08 np0005604215.localdomain python3.9[115860]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:08 np0005604215.localdomain sudo[115858]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:08 np0005604215.localdomain sudo[115951]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjsseryoskjlyzcxqioewnnetcxqxqqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936648.6152425-1536-241647604737024/AnsiballZ_command.py
Feb 01 09:04:08 np0005604215.localdomain sudo[115951]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:09 np0005604215.localdomain python3.9[115953]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:09 np0005604215.localdomain sudo[115951]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:09 np0005604215.localdomain sudo[116044]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytgwbxuiuoqxdwyvccvvigftuciuifpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936649.2267125-1536-273751917823707/AnsiballZ_command.py
Feb 01 09:04:09 np0005604215.localdomain sudo[116044]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:09 np0005604215.localdomain python3.9[116046]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud_recover.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:09 np0005604215.localdomain sudo[116044]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55303 DF PROTO=TCP SPT=59064 DPT=9100 SEQ=4002704535 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B418D0000000001030307) 
Feb 01 09:04:10 np0005604215.localdomain sudo[116137]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkpvkzoxlkxfzqmplvjangghqfidrouh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936649.7999988-1536-227500797251740/AnsiballZ_command.py
Feb 01 09:04:10 np0005604215.localdomain sudo[116137]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:10 np0005604215.localdomain python3.9[116139]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:10 np0005604215.localdomain sudo[116137]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:10 np0005604215.localdomain sudo[116230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztwwoktymtscmdhcpgwfbmmxaqbkrswb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936650.4430926-1536-48300527915838/AnsiballZ_command.py
Feb 01 09:04:10 np0005604215.localdomain sudo[116230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:10 np0005604215.localdomain python3.9[116232]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:10 np0005604215.localdomain sudo[116230]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:11 np0005604215.localdomain sudo[116323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjbymwourrxmppneebuxkqoqjjkaccxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936651.0469415-1536-118259567402855/AnsiballZ_command.py
Feb 01 09:04:11 np0005604215.localdomain sudo[116323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:11 np0005604215.localdomain python3.9[116325]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_controller.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:11 np0005604215.localdomain sudo[116323]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:12 np0005604215.localdomain sudo[116416]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojgdjzmoclmzakclyigndjwcbdrovehg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936651.7359464-1536-63223477783817/AnsiballZ_command.py
Feb 01 09:04:12 np0005604215.localdomain sudo[116416]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:12 np0005604215.localdomain python3.9[116418]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ovn_metadata_agent.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:12 np0005604215.localdomain sudo[116416]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:12 np0005604215.localdomain sudo[116509]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrmwnyuefalcwpzixgfwqrrivukldkgb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936652.3537014-1536-23053255473795/AnsiballZ_command.py
Feb 01 09:04:12 np0005604215.localdomain sudo[116509]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:12 np0005604215.localdomain python3.9[116511]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_rsyslog.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:12 np0005604215.localdomain sudo[116509]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45990 DF PROTO=TCP SPT=51644 DPT=9101 SEQ=74014858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B4E8D0000000001030307) 
Feb 01 09:04:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12257 DF PROTO=TCP SPT=38286 DPT=9882 SEQ=3581918519 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B570D0000000001030307) 
Feb 01 09:04:16 np0005604215.localdomain sshd[110310]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:04:16 np0005604215.localdomain systemd[1]: session-37.scope: Deactivated successfully.
Feb 01 09:04:16 np0005604215.localdomain systemd[1]: session-37.scope: Consumed 29.424s CPU time.
Feb 01 09:04:16 np0005604215.localdomain systemd-logind[761]: Session 37 logged out. Waiting for processes to exit.
Feb 01 09:04:16 np0005604215.localdomain systemd-logind[761]: Removed session 37.
Feb 01 09:04:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42074 DF PROTO=TCP SPT=36072 DPT=9102 SEQ=2451486445 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B650D0000000001030307) 
Feb 01 09:04:21 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55305 DF PROTO=TCP SPT=59064 DPT=9100 SEQ=4002704535 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B710E0000000001030307) 
Feb 01 09:04:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45992 DF PROTO=TCP SPT=51644 DPT=9101 SEQ=74014858 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B7F0D0000000001030307) 
Feb 01 09:04:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16590 DF PROTO=TCP SPT=34300 DPT=9882 SEQ=1886277604 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B909D0000000001030307) 
Feb 01 09:04:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16591 DF PROTO=TCP SPT=34300 DPT=9882 SEQ=1886277604 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B948D0000000001030307) 
Feb 01 09:04:32 np0005604215.localdomain sudo[116527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:04:32 np0005604215.localdomain sudo[116527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:04:32 np0005604215.localdomain sudo[116527]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:32 np0005604215.localdomain sudo[116542]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:04:32 np0005604215.localdomain sudo[116542]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:04:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16592 DF PROTO=TCP SPT=34300 DPT=9882 SEQ=1886277604 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64B9C8D0000000001030307) 
Feb 01 09:04:33 np0005604215.localdomain sudo[116542]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:34 np0005604215.localdomain sudo[116589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:04:34 np0005604215.localdomain sudo[116589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:04:34 np0005604215.localdomain sudo[116589]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:35 np0005604215.localdomain sshd[116604]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:04:35 np0005604215.localdomain sshd[116604]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 09:04:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3960 DF PROTO=TCP SPT=36150 DPT=9102 SEQ=4223255335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64BAA4D0000000001030307) 
Feb 01 09:04:37 np0005604215.localdomain sshd[116606]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:04:37 np0005604215.localdomain sshd[116606]: Accepted publickey for zuul from 192.168.122.31 port 35918 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:04:37 np0005604215.localdomain systemd-logind[761]: New session 38 of user zuul.
Feb 01 09:04:37 np0005604215.localdomain systemd[1]: Started Session 38 of User zuul.
Feb 01 09:04:37 np0005604215.localdomain sshd[116606]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:04:38 np0005604215.localdomain python3.9[116699]: ansible-ansible.legacy.ping Invoked with data=pong
Feb 01 09:04:39 np0005604215.localdomain python3.9[116803]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:04:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58342 DF PROTO=TCP SPT=44692 DPT=9100 SEQ=2904528710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64BB6CD0000000001030307) 
Feb 01 09:04:39 np0005604215.localdomain sudo[116893]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwuafpqxntssdmgpudfoyeioazculixz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936679.4790661-90-271598311427376/AnsiballZ_command.py
Feb 01 09:04:39 np0005604215.localdomain sudo[116893]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:40 np0005604215.localdomain python3.9[116895]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:40 np0005604215.localdomain sudo[116893]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:41 np0005604215.localdomain sudo[116986]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llnbiubdsyppmvczawythnclnefbimsv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936680.7144275-126-19164067767793/AnsiballZ_stat.py
Feb 01 09:04:41 np0005604215.localdomain sudo[116986]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:41 np0005604215.localdomain python3.9[116988]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:04:41 np0005604215.localdomain sudo[116986]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:41 np0005604215.localdomain sudo[117078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byxaujkpgdcyhiqyyyyvheifpqppbxka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936681.550414-151-126528774276793/AnsiballZ_file.py
Feb 01 09:04:41 np0005604215.localdomain sudo[117078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:42 np0005604215.localdomain python3.9[117080]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:04:42 np0005604215.localdomain sudo[117078]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:42 np0005604215.localdomain sudo[117170]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbvsruxwmwazmkuzxfetpsufjiejvwwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936682.4372227-174-246883270507201/AnsiballZ_stat.py
Feb 01 09:04:42 np0005604215.localdomain sudo[117170]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:42 np0005604215.localdomain python3.9[117172]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:04:42 np0005604215.localdomain sudo[117170]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32535 DF PROTO=TCP SPT=43816 DPT=9101 SEQ=2743141383 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64BC3CD0000000001030307) 
Feb 01 09:04:43 np0005604215.localdomain sudo[117243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sylunkybuylbokhxmtsrgsgcfqbotych ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936682.4372227-174-246883270507201/AnsiballZ_copy.py
Feb 01 09:04:43 np0005604215.localdomain sudo[117243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:43 np0005604215.localdomain python3.9[117245]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936682.4372227-174-246883270507201/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:04:43 np0005604215.localdomain sudo[117243]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:44 np0005604215.localdomain sudo[117335]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-widqyfjkqhntfzmbmiyogrujhwtfqoxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936683.8411598-219-114771716081489/AnsiballZ_setup.py
Feb 01 09:04:44 np0005604215.localdomain sudo[117335]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:44 np0005604215.localdomain python3.9[117337]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:04:44 np0005604215.localdomain sudo[117335]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:45 np0005604215.localdomain sudo[117431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sqvowvacoxqkpgnkxercvqbvgjymvdxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936684.9058871-244-156090657276387/AnsiballZ_file.py
Feb 01 09:04:45 np0005604215.localdomain sudo[117431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:45 np0005604215.localdomain python3.9[117433]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:04:45 np0005604215.localdomain sudo[117431]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16594 DF PROTO=TCP SPT=34300 DPT=9882 SEQ=1886277604 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64BCD0D0000000001030307) 
Feb 01 09:04:46 np0005604215.localdomain sudo[117523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxdblsxbjkletvvybznuraofdxwptkbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936685.6749923-271-5159663684451/AnsiballZ_file.py
Feb 01 09:04:46 np0005604215.localdomain sudo[117523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:46 np0005604215.localdomain python3.9[117525]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:04:46 np0005604215.localdomain sudo[117523]: pam_unix(sudo:session): session closed for user root
Feb 01 09:04:46 np0005604215.localdomain python3.9[117615]: ansible-ansible.builtin.service_facts Invoked
Feb 01 09:04:47 np0005604215.localdomain network[117632]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 09:04:47 np0005604215.localdomain network[117633]: 'network-scripts' will be removed from distribution in near future.
Feb 01 09:04:47 np0005604215.localdomain network[117634]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 09:04:48 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:04:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3962 DF PROTO=TCP SPT=36150 DPT=9102 SEQ=4223255335 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64BDB0D0000000001030307) 
Feb 01 09:04:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58344 DF PROTO=TCP SPT=44692 DPT=9100 SEQ=2904528710 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64BE70D0000000001030307) 
Feb 01 09:04:52 np0005604215.localdomain python3.9[117832]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:04:53 np0005604215.localdomain python3.9[117922]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:04:54 np0005604215.localdomain sudo[118016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orgiamsunvjnvvvxuacvdelqywmonfzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936693.8540785-372-11650998666704/AnsiballZ_command.py
Feb 01 09:04:54 np0005604215.localdomain sudo[118016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:04:54 np0005604215.localdomain python3.9[118018]: ansible-ansible.legacy.command Invoked with _raw_params=# This is a hack to deploy RDO Delorean repos to RHEL as if it were Centos 9 Stream
                                                            set -euxo pipefail
                                                            curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz
                                                            python3 -m venv ./venv
                                                            PBR_VERSION=0.0.0 ./venv/bin/pip install ./repo-setup-main
                                                            # This is required for FIPS enabled until trunk.rdoproject.org
                                                            # is not being served from a centos7 host, tracked by
                                                            # https://issues.redhat.com/browse/RHOSZUUL-1517
                                                            dnf -y install crypto-policies
                                                            update-crypto-policies --set FIPS:NO-ENFORCE-EMS
                                                            ./venv/bin/repo-setup current-podified -b antelope -d centos9 --stream
                                                            
                                                            # Exclude ceph-common-18.2.7 as it's pulling newer openssl not compatible
                                                            # with rhel 9.2 openssh
                                                            dnf config-manager --setopt centos9-storage.exclude="ceph-common-18.2.7" --save
                                                            # FIXME: perform dnf upgrade for other packages in EDPM ansible
                                                            # here we only ensuring that decontainerized libvirt can start
                                                            dnf -y upgrade openstack-selinux
                                                            rm -f /run/virtlogd.pid
                                                            
                                                            rm -rf repo-setup-main
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:04:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32537 DF PROTO=TCP SPT=43816 DPT=9101 SEQ=2743141383 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64BF30D0000000001030307) 
Feb 01 09:05:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56940 DF PROTO=TCP SPT=42620 DPT=9882 SEQ=2023601156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C05CE0000000001030307) 
Feb 01 09:05:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56941 DF PROTO=TCP SPT=42620 DPT=9882 SEQ=2023601156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C09CD0000000001030307) 
Feb 01 09:05:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56942 DF PROTO=TCP SPT=42620 DPT=9882 SEQ=2023601156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C11CE0000000001030307) 
Feb 01 09:05:03 np0005604215.localdomain sshd[45582]: Received signal 15; terminating.
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 01 09:05:03 np0005604215.localdomain sshd[118061]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:05:03 np0005604215.localdomain sshd[118061]: Server listening on 0.0.0.0 port 22.
Feb 01 09:05:03 np0005604215.localdomain sshd[118061]: Server listening on :: port 22.
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 01 09:05:03 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 09:05:04 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 09:05:04 np0005604215.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 01 09:05:04 np0005604215.localdomain systemd[1]: run-r0b77db35b6b54652bcf9ed79e522294c.service: Deactivated successfully.
Feb 01 09:05:04 np0005604215.localdomain systemd[1]: run-r7cf8884a8f1545099072d587b650fd91.service: Deactivated successfully.
Feb 01 09:05:05 np0005604215.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 01 09:05:05 np0005604215.localdomain sshd[118061]: Received signal 15; terminating.
Feb 01 09:05:05 np0005604215.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 01 09:05:05 np0005604215.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 01 09:05:05 np0005604215.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 01 09:05:05 np0005604215.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 01 09:05:05 np0005604215.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 09:05:05 np0005604215.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 09:05:05 np0005604215.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 09:05:05 np0005604215.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 01 09:05:05 np0005604215.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 01 09:05:05 np0005604215.localdomain sshd[118325]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:05:05 np0005604215.localdomain sshd[118325]: Server listening on 0.0.0.0 port 22.
Feb 01 09:05:05 np0005604215.localdomain sshd[118325]: Server listening on :: port 22.
Feb 01 09:05:05 np0005604215.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 01 09:05:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21264 DF PROTO=TCP SPT=56142 DPT=9102 SEQ=1489006701 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C1F8E0000000001030307) 
Feb 01 09:05:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51670 DF PROTO=TCP SPT=58496 DPT=9100 SEQ=1825872090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C2BCD0000000001030307) 
Feb 01 09:05:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32158 DF PROTO=TCP SPT=38530 DPT=9101 SEQ=3852027846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C390E0000000001030307) 
Feb 01 09:05:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56944 DF PROTO=TCP SPT=42620 DPT=9882 SEQ=2023601156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C410D0000000001030307) 
Feb 01 09:05:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21266 DF PROTO=TCP SPT=56142 DPT=9102 SEQ=1489006701 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C4F0D0000000001030307) 
Feb 01 09:05:21 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51672 DF PROTO=TCP SPT=58496 DPT=9100 SEQ=1825872090 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C5B0D0000000001030307) 
Feb 01 09:05:25 np0005604215.localdomain sshd[118434]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:05:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32160 DF PROTO=TCP SPT=38530 DPT=9101 SEQ=3852027846 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C690E0000000001030307) 
Feb 01 09:05:26 np0005604215.localdomain sshd[118434]: Invalid user amssys from 85.206.171.113 port 40682
Feb 01 09:05:26 np0005604215.localdomain sshd[118434]: Received disconnect from 85.206.171.113 port 40682:11: Bye Bye [preauth]
Feb 01 09:05:26 np0005604215.localdomain sshd[118434]: Disconnected from invalid user amssys 85.206.171.113 port 40682 [preauth]
Feb 01 09:05:27 np0005604215.localdomain sshd[118436]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:05:27 np0005604215.localdomain sshd[118436]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 09:05:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29287 DF PROTO=TCP SPT=55806 DPT=9882 SEQ=2753578153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C7AFC0000000001030307) 
Feb 01 09:05:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29288 DF PROTO=TCP SPT=55806 DPT=9882 SEQ=2753578153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C7F0D0000000001030307) 
Feb 01 09:05:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49132 DF PROTO=TCP SPT=53468 DPT=9105 SEQ=429852356 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C870E0000000001030307) 
Feb 01 09:05:34 np0005604215.localdomain sudo[118467]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:05:34 np0005604215.localdomain sudo[118467]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:05:34 np0005604215.localdomain sudo[118467]: pam_unix(sudo:session): session closed for user root
Feb 01 09:05:34 np0005604215.localdomain sudo[118482]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:05:34 np0005604215.localdomain sudo[118482]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:05:35 np0005604215.localdomain systemd[1]: tmp-crun.985hQx.mount: Deactivated successfully.
Feb 01 09:05:35 np0005604215.localdomain podman[118567]: 2026-02-01 09:05:35.322399256 +0000 UTC m=+0.096449940 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, io.buildah.version=1.41.4, vcs-type=git, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, version=7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:05:35 np0005604215.localdomain podman[118567]: 2026-02-01 09:05:35.430680134 +0000 UTC m=+0.204730788 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, name=rhceph, ceph=True, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, distribution-scope=public, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, release=1764794109, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:05:35 np0005604215.localdomain sudo[118482]: pam_unix(sudo:session): session closed for user root
Feb 01 09:05:35 np0005604215.localdomain sudo[118631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:05:35 np0005604215.localdomain sudo[118631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:05:35 np0005604215.localdomain sudo[118631]: pam_unix(sudo:session): session closed for user root
Feb 01 09:05:35 np0005604215.localdomain sudo[118646]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:05:35 np0005604215.localdomain sudo[118646]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:05:36 np0005604215.localdomain sudo[118646]: pam_unix(sudo:session): session closed for user root
Feb 01 09:05:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1889 DF PROTO=TCP SPT=57058 DPT=9102 SEQ=104127017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64C948D0000000001030307) 
Feb 01 09:05:37 np0005604215.localdomain sudo[118693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:05:37 np0005604215.localdomain sudo[118693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:05:37 np0005604215.localdomain sudo[118693]: pam_unix(sudo:session): session closed for user root
Feb 01 09:05:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23926 DF PROTO=TCP SPT=45922 DPT=9100 SEQ=368008998 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64CA10D0000000001030307) 
Feb 01 09:05:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14825 DF PROTO=TCP SPT=36950 DPT=9101 SEQ=2660027807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64CAE4E0000000001030307) 
Feb 01 09:05:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29291 DF PROTO=TCP SPT=55806 DPT=9882 SEQ=2753578153 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64CB70D0000000001030307) 
Feb 01 09:05:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1891 DF PROTO=TCP SPT=57058 DPT=9102 SEQ=104127017 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64CC50E0000000001030307) 
Feb 01 09:05:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23928 DF PROTO=TCP SPT=45922 DPT=9100 SEQ=368008998 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64CD10D0000000001030307) 
Feb 01 09:05:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14827 DF PROTO=TCP SPT=36950 DPT=9101 SEQ=2660027807 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64CDF0D0000000001030307) 
Feb 01 09:06:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=514 DF PROTO=TCP SPT=47956 DPT=9882 SEQ=3399857180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64CF02D0000000001030307) 
Feb 01 09:06:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=515 DF PROTO=TCP SPT=47956 DPT=9882 SEQ=3399857180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64CF44E0000000001030307) 
Feb 01 09:06:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=516 DF PROTO=TCP SPT=47956 DPT=9882 SEQ=3399857180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64CFC4E0000000001030307) 
Feb 01 09:06:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11811 DF PROTO=TCP SPT=58740 DPT=9102 SEQ=3271871497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D09CD0000000001030307) 
Feb 01 09:06:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34807 DF PROTO=TCP SPT=54414 DPT=9100 SEQ=1532626364 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D164E0000000001030307) 
Feb 01 09:06:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57432 DF PROTO=TCP SPT=52170 DPT=9101 SEQ=2258476920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D234E0000000001030307) 
Feb 01 09:06:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=518 DF PROTO=TCP SPT=47956 DPT=9882 SEQ=3399857180 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D2D0D0000000001030307) 
Feb 01 09:06:16 np0005604215.localdomain kernel: SELinux:  Converting 2739 SID table entries...
Feb 01 09:06:16 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 09:06:16 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 09:06:16 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 09:06:16 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 09:06:16 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 09:06:16 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 09:06:16 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 09:06:17 np0005604215.localdomain sudo[118016]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:18 np0005604215.localdomain sshd[119074]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:06:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=11813 DF PROTO=TCP SPT=58740 DPT=9102 SEQ=3271871497 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D390D0000000001030307) 
Feb 01 09:06:18 np0005604215.localdomain sshd[119074]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 09:06:19 np0005604215.localdomain sudo[119151]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwxukgywcramcnrrspsvuucbyatnyysh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936779.6334922-399-243377443119921/AnsiballZ_file.py
Feb 01 09:06:19 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=17 res=1
Feb 01 09:06:19 np0005604215.localdomain sudo[119151]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:20 np0005604215.localdomain python3.9[119153]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:06:20 np0005604215.localdomain sudo[119151]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:20 np0005604215.localdomain sudo[119243]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nncvsnoecppuxxzuwctfmvxroeumolsp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936780.3313675-424-107940076111585/AnsiballZ_stat.py
Feb 01 09:06:20 np0005604215.localdomain sudo[119243]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:20 np0005604215.localdomain python3.9[119245]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/edpm.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:06:20 np0005604215.localdomain sudo[119243]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:21 np0005604215.localdomain sudo[119316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbipwyaixqpnyunwfjjlotbklphzkfbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936780.3313675-424-107940076111585/AnsiballZ_copy.py
Feb 01 09:06:21 np0005604215.localdomain sudo[119316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:21 np0005604215.localdomain python3.9[119318]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/edpm.fact mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936780.3313675-424-107940076111585/.source.fact _original_basename=._zpp9x14 follow=False checksum=03aee63dcf9b49b0ac4473b2f1a1b5d3783aa639 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:06:21 np0005604215.localdomain sudo[119316]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34809 DF PROTO=TCP SPT=54414 DPT=9100 SEQ=1532626364 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D470E0000000001030307) 
Feb 01 09:06:22 np0005604215.localdomain python3.9[119408]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:06:23 np0005604215.localdomain sudo[119504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urgvpnlhfqnznjnqvletkbfucglxnaxk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936782.842889-498-207734238013450/AnsiballZ_setup.py
Feb 01 09:06:23 np0005604215.localdomain sudo[119504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:23 np0005604215.localdomain python3.9[119506]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:06:23 np0005604215.localdomain sudo[119504]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:24 np0005604215.localdomain sudo[119558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tiyywrkuyhvaobnkuqkqlmsjijkioaju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936782.842889-498-207734238013450/AnsiballZ_dnf.py
Feb 01 09:06:24 np0005604215.localdomain sudo[119558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:24 np0005604215.localdomain python3.9[119560]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:06:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57434 DF PROTO=TCP SPT=52170 DPT=9101 SEQ=2258476920 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D530D0000000001030307) 
Feb 01 09:06:27 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:06:27 np0005604215.localdomain systemd-rc-local-generator[119597]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:06:27 np0005604215.localdomain systemd-sysv-generator[119600]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:06:28 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:06:28 np0005604215.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 09:06:29 np0005604215.localdomain sudo[119558]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:29 np0005604215.localdomain sudo[119697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcdnibbpzsmsikdnejpzzpquuzthlvas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936789.5700643-534-176236871690409/AnsiballZ_command.py
Feb 01 09:06:29 np0005604215.localdomain sudo[119697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7941 DF PROTO=TCP SPT=57284 DPT=9882 SEQ=2716998411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D655C0000000001030307) 
Feb 01 09:06:30 np0005604215.localdomain python3.9[119699]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:06:30 np0005604215.localdomain sudo[119697]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7942 DF PROTO=TCP SPT=57284 DPT=9882 SEQ=2716998411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D694D0000000001030307) 
Feb 01 09:06:31 np0005604215.localdomain sudo[119936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubqawkuhhhgvqsmzzuqpnltslnddwqic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936791.0477045-559-149744594759999/AnsiballZ_selinux.py
Feb 01 09:06:31 np0005604215.localdomain sudo[119936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:31 np0005604215.localdomain python3.9[119938]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Feb 01 09:06:31 np0005604215.localdomain sudo[119936]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:32 np0005604215.localdomain sudo[120028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oefziqzzpcmeiqoskjbktzatuyiqtjet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936792.4533322-591-77890852436065/AnsiballZ_command.py
Feb 01 09:06:32 np0005604215.localdomain sudo[120028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:32 np0005604215.localdomain python3.9[120030]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Feb 01 09:06:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7943 DF PROTO=TCP SPT=57284 DPT=9882 SEQ=2716998411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D714E0000000001030307) 
Feb 01 09:06:33 np0005604215.localdomain sudo[120028]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:33 np0005604215.localdomain sudo[120121]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-icmfbhseowjpfnsedmmhbqnefkktxtbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936793.6354537-616-165010063875881/AnsiballZ_file.py
Feb 01 09:06:33 np0005604215.localdomain sudo[120121]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:34 np0005604215.localdomain python3.9[120123]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:06:34 np0005604215.localdomain sudo[120121]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:34 np0005604215.localdomain sudo[120213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jopqxlhkwnekmlyjiwkwatddbilzhiju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936794.3067663-640-38566414998414/AnsiballZ_mount.py
Feb 01 09:06:34 np0005604215.localdomain sudo[120213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:34 np0005604215.localdomain python3.9[120215]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Feb 01 09:06:34 np0005604215.localdomain sudo[120213]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:36 np0005604215.localdomain sudo[120305]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykqywbulvkyaepxjioqfypvzvagbzanx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936796.0459332-724-32975069935936/AnsiballZ_file.py
Feb 01 09:06:36 np0005604215.localdomain sudo[120305]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:36 np0005604215.localdomain python3.9[120307]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:06:36 np0005604215.localdomain sudo[120305]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47078 DF PROTO=TCP SPT=43750 DPT=9102 SEQ=3647426648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D7F0D0000000001030307) 
Feb 01 09:06:36 np0005604215.localdomain sudo[120397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ikseybktbpeigobstygyexbulrjxwtfy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936796.7140834-747-260046596405157/AnsiballZ_stat.py
Feb 01 09:06:36 np0005604215.localdomain sudo[120397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:37 np0005604215.localdomain python3.9[120399]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:06:37 np0005604215.localdomain sudo[120397]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:37 np0005604215.localdomain sudo[120400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:06:37 np0005604215.localdomain sudo[120400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:06:37 np0005604215.localdomain sudo[120400]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:37 np0005604215.localdomain sudo[120415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:06:37 np0005604215.localdomain sudo[120415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:06:37 np0005604215.localdomain sudo[120500]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swmawlquajyvtrnbggmffagklrchbcch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936796.7140834-747-260046596405157/AnsiballZ_copy.py
Feb 01 09:06:37 np0005604215.localdomain sudo[120500]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:37 np0005604215.localdomain python3.9[120509]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769936796.7140834-747-260046596405157/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:06:37 np0005604215.localdomain sudo[120500]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:37 np0005604215.localdomain sudo[120415]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:38 np0005604215.localdomain sudo[120549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:06:38 np0005604215.localdomain sudo[120549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:06:38 np0005604215.localdomain sudo[120549]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:38 np0005604215.localdomain sudo[120638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iurpjynndracllyucyzqzqxhnyefqdbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936798.5464332-820-217692227487826/AnsiballZ_stat.py
Feb 01 09:06:38 np0005604215.localdomain sudo[120638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:39 np0005604215.localdomain python3.9[120640]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:06:39 np0005604215.localdomain sudo[120638]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49117 DF PROTO=TCP SPT=44728 DPT=9100 SEQ=3852835967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D8B8E0000000001030307) 
Feb 01 09:06:40 np0005604215.localdomain sudo[120732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-muxlyfbcnygdlsoiizfdxlsgfecgjzgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936799.6355634-859-238669241984141/AnsiballZ_getent.py
Feb 01 09:06:40 np0005604215.localdomain sudo[120732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:40 np0005604215.localdomain python3.9[120734]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Feb 01 09:06:40 np0005604215.localdomain sudo[120732]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:40 np0005604215.localdomain sudo[120825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuxvaiorjxxroxrukfqyoibknwzflbvb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936800.6967673-889-62238100529360/AnsiballZ_getent.py
Feb 01 09:06:40 np0005604215.localdomain sudo[120825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:41 np0005604215.localdomain python3.9[120827]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Feb 01 09:06:41 np0005604215.localdomain sudo[120825]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:41 np0005604215.localdomain sudo[120918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cejwcrtgenpssqgahausmnfxzowtuvgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936801.3955197-912-40642560070722/AnsiballZ_group.py
Feb 01 09:06:41 np0005604215.localdomain sudo[120918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:41 np0005604215.localdomain python3.9[120920]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 01 09:06:41 np0005604215.localdomain groupmod[120921]: group changed in /etc/group (group hugetlbfs/985, new gid: 42477)
Feb 01 09:06:41 np0005604215.localdomain groupmod[120921]: group changed in /etc/passwd (group hugetlbfs/985, new gid: 42477)
Feb 01 09:06:42 np0005604215.localdomain sudo[120918]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:42 np0005604215.localdomain sudo[121016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hukerumneojmpygvtbrhtxivazsndtef ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936802.3721795-939-170695314247261/AnsiballZ_file.py
Feb 01 09:06:42 np0005604215.localdomain sudo[121016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:42 np0005604215.localdomain python3.9[121018]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Feb 01 09:06:42 np0005604215.localdomain sudo[121016]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46249 DF PROTO=TCP SPT=59974 DPT=9101 SEQ=4245491564 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64D988E0000000001030307) 
Feb 01 09:06:43 np0005604215.localdomain sudo[121108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jzdkbxcrohlxlkvfbwpxtgvgjekedulq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936803.3258557-973-147070711474253/AnsiballZ_dnf.py
Feb 01 09:06:43 np0005604215.localdomain sudo[121108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:43 np0005604215.localdomain python3.9[121110]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:06:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7945 DF PROTO=TCP SPT=57284 DPT=9882 SEQ=2716998411 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64DA10E0000000001030307) 
Feb 01 09:06:46 np0005604215.localdomain sshd[121113]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:06:46 np0005604215.localdomain sshd[121113]: Invalid user shared from 85.206.171.113 port 56284
Feb 01 09:06:46 np0005604215.localdomain sshd[121113]: Received disconnect from 85.206.171.113 port 56284:11: Bye Bye [preauth]
Feb 01 09:06:46 np0005604215.localdomain sshd[121113]: Disconnected from invalid user shared 85.206.171.113 port 56284 [preauth]
Feb 01 09:06:46 np0005604215.localdomain sudo[121108]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:47 np0005604215.localdomain sudo[121204]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rwtdoujffenmgoczkuylrtdgnmmghfcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936807.3690739-997-50697244388451/AnsiballZ_file.py
Feb 01 09:06:47 np0005604215.localdomain sudo[121204]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47080 DF PROTO=TCP SPT=43750 DPT=9102 SEQ=3647426648 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64DAF0E0000000001030307) 
Feb 01 09:06:51 np0005604215.localdomain python3.9[121206]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:06:51 np0005604215.localdomain sudo[121204]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:51 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49119 DF PROTO=TCP SPT=44728 DPT=9100 SEQ=3852835967 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64DBB0D0000000001030307) 
Feb 01 09:06:52 np0005604215.localdomain sudo[121296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ongfygwqwhplxmuvfcxqjiuavwcwllmt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936811.7612963-1021-12669778384064/AnsiballZ_stat.py
Feb 01 09:06:52 np0005604215.localdomain sudo[121296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:52 np0005604215.localdomain python3.9[121298]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:06:52 np0005604215.localdomain sudo[121296]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:52 np0005604215.localdomain sudo[121369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hskrfqxqzqbfwcbtaokqymsygeojmfku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936811.7612963-1021-12669778384064/AnsiballZ_copy.py
Feb 01 09:06:52 np0005604215.localdomain sudo[121369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:52 np0005604215.localdomain python3.9[121371]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769936811.7612963-1021-12669778384064/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:06:52 np0005604215.localdomain sudo[121369]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46251 DF PROTO=TCP SPT=59974 DPT=9101 SEQ=4245491564 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64DC90E0000000001030307) 
Feb 01 09:06:57 np0005604215.localdomain sudo[121461]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-downebbtevawqwyfienrpvmxbdpnmszj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936817.4025915-1066-203684877585956/AnsiballZ_systemd.py
Feb 01 09:06:57 np0005604215.localdomain sudo[121461]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:58 np0005604215.localdomain python3.9[121463]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:06:58 np0005604215.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 01 09:06:58 np0005604215.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 01 09:06:58 np0005604215.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 01 09:06:58 np0005604215.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 01 09:06:58 np0005604215.localdomain systemd-modules-load[121467]: Module 'msr' is built in
Feb 01 09:06:58 np0005604215.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 01 09:06:58 np0005604215.localdomain sudo[121461]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:58 np0005604215.localdomain sudo[121558]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqjufkrkmuegqoqnabadzfslhcuoylrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936818.611207-1090-125100225835693/AnsiballZ_stat.py
Feb 01 09:06:58 np0005604215.localdomain sudo[121558]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:59 np0005604215.localdomain python3.9[121560]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:06:59 np0005604215.localdomain sudo[121558]: pam_unix(sudo:session): session closed for user root
Feb 01 09:06:59 np0005604215.localdomain sudo[121631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vopnqauknkpfgjfxebyrapqeyjcbmwuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936818.611207-1090-125100225835693/AnsiballZ_copy.py
Feb 01 09:06:59 np0005604215.localdomain sudo[121631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:06:59 np0005604215.localdomain python3.9[121633]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769936818.611207-1090-125100225835693/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:06:59 np0005604215.localdomain sudo[121631]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1286 DF PROTO=TCP SPT=52774 DPT=9882 SEQ=819140579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64DDA8D0000000001030307) 
Feb 01 09:07:00 np0005604215.localdomain sudo[121723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dssaoricetsvmicvqcjjylzppltajiyy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936820.3572283-1144-192652976418359/AnsiballZ_dnf.py
Feb 01 09:07:00 np0005604215.localdomain sudo[121723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:00 np0005604215.localdomain python3.9[121725]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:07:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1287 DF PROTO=TCP SPT=52774 DPT=9882 SEQ=819140579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64DDE8D0000000001030307) 
Feb 01 09:07:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1288 DF PROTO=TCP SPT=52774 DPT=9882 SEQ=819140579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64DE68D0000000001030307) 
Feb 01 09:07:04 np0005604215.localdomain sudo[121723]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:05 np0005604215.localdomain python3.9[121817]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:07:06 np0005604215.localdomain python3.9[121909]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Feb 01 09:07:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1049 DF PROTO=TCP SPT=60934 DPT=9102 SEQ=3083332741 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64DF40E0000000001030307) 
Feb 01 09:07:06 np0005604215.localdomain python3.9[121999]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:07:07 np0005604215.localdomain sudo[122089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prtktpetnsbspjotvbxjmfujqoxdgbtl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936827.3605616-1266-104359301693126/AnsiballZ_systemd.py
Feb 01 09:07:07 np0005604215.localdomain sudo[122089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:07 np0005604215.localdomain python3.9[122091]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:07:07 np0005604215.localdomain systemd[1]: Stopping Dynamic System Tuning Daemon...
Feb 01 09:07:08 np0005604215.localdomain systemd[1]: tuned.service: Deactivated successfully.
Feb 01 09:07:08 np0005604215.localdomain systemd[1]: Stopped Dynamic System Tuning Daemon.
Feb 01 09:07:08 np0005604215.localdomain systemd[1]: tuned.service: Consumed 1.736s CPU time, no IO.
Feb 01 09:07:08 np0005604215.localdomain systemd[1]: Starting Dynamic System Tuning Daemon...
Feb 01 09:07:09 np0005604215.localdomain systemd[1]: Started Dynamic System Tuning Daemon.
Feb 01 09:07:09 np0005604215.localdomain sudo[122089]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55600 DF PROTO=TCP SPT=38318 DPT=9100 SEQ=3734527054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E008D0000000001030307) 
Feb 01 09:07:10 np0005604215.localdomain python3.9[122193]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Feb 01 09:07:11 np0005604215.localdomain sshd[122208]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:07:11 np0005604215.localdomain sshd[122208]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 09:07:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47596 DF PROTO=TCP SPT=59356 DPT=9101 SEQ=1807572945 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E0DCD0000000001030307) 
Feb 01 09:07:13 np0005604215.localdomain sudo[122285]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxrdwmnqxtysftushzposodgbcgnfahi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936833.164819-1438-52139354108259/AnsiballZ_systemd.py
Feb 01 09:07:13 np0005604215.localdomain sudo[122285]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:13 np0005604215.localdomain python3.9[122287]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:07:13 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:07:13 np0005604215.localdomain systemd-sysv-generator[122320]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:07:13 np0005604215.localdomain systemd-rc-local-generator[122315]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:07:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:07:14 np0005604215.localdomain sudo[122285]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:14 np0005604215.localdomain sudo[122415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zdrwzwnrmnlddflcmtxojdhfqatzrrje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936834.197859-1438-61849167829465/AnsiballZ_systemd.py
Feb 01 09:07:14 np0005604215.localdomain sudo[122415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:14 np0005604215.localdomain python3.9[122417]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:07:14 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:07:14 np0005604215.localdomain systemd-sysv-generator[122450]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:07:14 np0005604215.localdomain systemd-rc-local-generator[122444]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:07:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:07:15 np0005604215.localdomain sudo[122415]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1290 DF PROTO=TCP SPT=52774 DPT=9882 SEQ=819140579 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E170D0000000001030307) 
Feb 01 09:07:15 np0005604215.localdomain sudo[122546]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fjloecdynyjhlyobquhjghzdorlehnqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936835.54222-1485-64466815285018/AnsiballZ_command.py
Feb 01 09:07:15 np0005604215.localdomain sudo[122546]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:16 np0005604215.localdomain python3.9[122548]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:07:16 np0005604215.localdomain sudo[122546]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:16 np0005604215.localdomain sudo[122639]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brfmplmnwdvoummjkjkhjhzajbzaabls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936836.2403271-1510-249935602608970/AnsiballZ_command.py
Feb 01 09:07:16 np0005604215.localdomain sudo[122639]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:16 np0005604215.localdomain python3.9[122641]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:07:16 np0005604215.localdomain kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k FS
Feb 01 09:07:16 np0005604215.localdomain sudo[122639]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:17 np0005604215.localdomain sudo[122732]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhmvutcvgzfksmjpytgsgjybmbzsrqro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936836.9517074-1534-88632760878872/AnsiballZ_command.py
Feb 01 09:07:17 np0005604215.localdomain sudo[122732]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:17 np0005604215.localdomain python3.9[122734]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:07:18 np0005604215.localdomain sudo[122732]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:19 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1051 DF PROTO=TCP SPT=60934 DPT=9102 SEQ=3083332741 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E250E0000000001030307) 
Feb 01 09:07:19 np0005604215.localdomain sudo[122831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fzdirdfdyaitrfpkwvipuvpppjuieffn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936838.7572198-1558-135627802358737/AnsiballZ_command.py
Feb 01 09:07:19 np0005604215.localdomain sudo[122831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:19 np0005604215.localdomain python3.9[122833]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:07:19 np0005604215.localdomain sudo[122831]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:19 np0005604215.localdomain sudo[122924]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdiojivvgxsunrgacnfwqedskwjhisgi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936839.5246928-1582-13883718863006/AnsiballZ_systemd.py
Feb 01 09:07:19 np0005604215.localdomain sudo[122924]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:20 np0005604215.localdomain python3.9[122926]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:07:20 np0005604215.localdomain systemd[1]: systemd-sysctl.service: Deactivated successfully.
Feb 01 09:07:20 np0005604215.localdomain systemd[1]: Stopped Apply Kernel Variables.
Feb 01 09:07:20 np0005604215.localdomain systemd[1]: Stopping Apply Kernel Variables...
Feb 01 09:07:20 np0005604215.localdomain systemd[1]: Starting Apply Kernel Variables...
Feb 01 09:07:20 np0005604215.localdomain systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Feb 01 09:07:20 np0005604215.localdomain systemd[1]: Finished Apply Kernel Variables.
Feb 01 09:07:20 np0005604215.localdomain sudo[122924]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:20 np0005604215.localdomain sshd[116606]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:07:20 np0005604215.localdomain systemd[1]: session-38.scope: Deactivated successfully.
Feb 01 09:07:20 np0005604215.localdomain systemd[1]: session-38.scope: Consumed 1min 56.046s CPU time.
Feb 01 09:07:20 np0005604215.localdomain systemd-logind[761]: Session 38 logged out. Waiting for processes to exit.
Feb 01 09:07:20 np0005604215.localdomain systemd-logind[761]: Removed session 38.
Feb 01 09:07:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55602 DF PROTO=TCP SPT=38318 DPT=9100 SEQ=3734527054 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E310E0000000001030307) 
Feb 01 09:07:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47598 DF PROTO=TCP SPT=59356 DPT=9101 SEQ=1807572945 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E3D0D0000000001030307) 
Feb 01 09:07:25 np0005604215.localdomain sshd[122946]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:07:25 np0005604215.localdomain sshd[122946]: Accepted publickey for zuul from 192.168.122.31 port 48594 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:07:25 np0005604215.localdomain systemd-logind[761]: New session 39 of user zuul.
Feb 01 09:07:25 np0005604215.localdomain systemd[1]: Started Session 39 of User zuul.
Feb 01 09:07:25 np0005604215.localdomain sshd[122946]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:07:26 np0005604215.localdomain python3.9[123039]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:07:27 np0005604215.localdomain python3.9[123133]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:07:29 np0005604215.localdomain sudo[123227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bumqnloajnnxiwobpnfqpngelkmwmgdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936848.7529912-107-20938056580526/AnsiballZ_command.py
Feb 01 09:07:29 np0005604215.localdomain sudo[123227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:29 np0005604215.localdomain python3.9[123229]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:07:29 np0005604215.localdomain sudo[123227]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9113 DF PROTO=TCP SPT=50094 DPT=9882 SEQ=3801450267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E4FBC0000000001030307) 
Feb 01 09:07:30 np0005604215.localdomain python3.9[123320]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:07:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9114 DF PROTO=TCP SPT=50094 DPT=9882 SEQ=3801450267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E53CD0000000001030307) 
Feb 01 09:07:31 np0005604215.localdomain sudo[123414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlxgqgmcsjoeysjhnppbuqvwahbvsgeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936850.8859673-167-155076223179201/AnsiballZ_setup.py
Feb 01 09:07:31 np0005604215.localdomain sudo[123414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:31 np0005604215.localdomain python3.9[123416]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:07:31 np0005604215.localdomain sudo[123414]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:32 np0005604215.localdomain sudo[123468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpidowzeaxqvouvdwoejmelkfxnooyqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936850.8859673-167-155076223179201/AnsiballZ_dnf.py
Feb 01 09:07:32 np0005604215.localdomain sudo[123468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:32 np0005604215.localdomain python3.9[123470]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:07:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9115 DF PROTO=TCP SPT=50094 DPT=9882 SEQ=3801450267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E5BCD0000000001030307) 
Feb 01 09:07:35 np0005604215.localdomain sudo[123468]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:36 np0005604215.localdomain sudo[123562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orcbbbznlqzotaxtqiyrgrtpvkmqczog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936855.8538132-204-137407743966594/AnsiballZ_setup.py
Feb 01 09:07:36 np0005604215.localdomain sudo[123562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:36 np0005604215.localdomain python3.9[123564]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:07:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32025 DF PROTO=TCP SPT=50502 DPT=9102 SEQ=972267169 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E694D0000000001030307) 
Feb 01 09:07:36 np0005604215.localdomain sudo[123562]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:37 np0005604215.localdomain sudo[123709]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prudatyukwnfoqyjwlynwjdfoetdgmng ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936857.1661148-236-172247512287517/AnsiballZ_file.py
Feb 01 09:07:37 np0005604215.localdomain sudo[123709]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:37 np0005604215.localdomain python3.9[123711]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:07:37 np0005604215.localdomain sudo[123709]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:38 np0005604215.localdomain sudo[123801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bniwefiexcsbljmbsavkgqddsorjecfe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936858.0045583-260-241287384954396/AnsiballZ_command.py
Feb 01 09:07:38 np0005604215.localdomain sudo[123801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:38 np0005604215.localdomain python3.9[123803]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:07:38 np0005604215.localdomain sudo[123801]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:38 np0005604215.localdomain sudo[123831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:07:38 np0005604215.localdomain sudo[123831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:07:38 np0005604215.localdomain sudo[123831]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:38 np0005604215.localdomain sudo[123846]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:07:38 np0005604215.localdomain sudo[123846]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:07:39 np0005604215.localdomain sudo[123955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgyhqvzemlgmdtkbkhdppiguoicllvvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936858.7724168-284-96199543653664/AnsiballZ_stat.py
Feb 01 09:07:39 np0005604215.localdomain sudo[123955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:39 np0005604215.localdomain sudo[123846]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:39 np0005604215.localdomain python3.9[123959]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:07:39 np0005604215.localdomain sudo[123955]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:39 np0005604215.localdomain sudo[124017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzekmoymmyfqbsshrzlzfxudtoguielg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936858.7724168-284-96199543653664/AnsiballZ_file.py
Feb 01 09:07:39 np0005604215.localdomain sudo[124017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58560 DF PROTO=TCP SPT=58548 DPT=9100 SEQ=1039894359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E75CD0000000001030307) 
Feb 01 09:07:39 np0005604215.localdomain python3.9[124019]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:07:39 np0005604215.localdomain sudo[124017]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:40 np0005604215.localdomain sudo[124034]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:07:40 np0005604215.localdomain sudo[124034]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:07:40 np0005604215.localdomain sudo[124034]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:40 np0005604215.localdomain sudo[124124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xqqwifhatozmcukfoigzekhvcloguvbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936860.164123-320-202167324695930/AnsiballZ_stat.py
Feb 01 09:07:40 np0005604215.localdomain sudo[124124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:40 np0005604215.localdomain python3.9[124126]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:07:40 np0005604215.localdomain sudo[124124]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:41 np0005604215.localdomain sudo[124197]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-beijxscmpqciixxvspiwrtamhcqpacql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936860.164123-320-202167324695930/AnsiballZ_copy.py
Feb 01 09:07:41 np0005604215.localdomain sudo[124197]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:41 np0005604215.localdomain python3.9[124199]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769936860.164123-320-202167324695930/.source.conf follow=False _original_basename=registries.conf.j2 checksum=804a0d01b832e60d20f779a331306df708c87b02 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:07:41 np0005604215.localdomain sudo[124197]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:42 np0005604215.localdomain sudo[124289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqjwnpqzdvdedazcgfqhzcvwqlsaoozs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936861.6834931-368-192720010770169/AnsiballZ_ini_file.py
Feb 01 09:07:42 np0005604215.localdomain sudo[124289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:42 np0005604215.localdomain python3.9[124291]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:07:42 np0005604215.localdomain sudo[124289]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:42 np0005604215.localdomain sudo[124381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txnqeavlfmoweykeqjjerynhddvstiim ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936862.4552975-368-248205780056213/AnsiballZ_ini_file.py
Feb 01 09:07:42 np0005604215.localdomain sudo[124381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:42 np0005604215.localdomain python3.9[124383]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:07:42 np0005604215.localdomain sudo[124381]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61490 DF PROTO=TCP SPT=41400 DPT=9101 SEQ=282903222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E830D0000000001030307) 
Feb 01 09:07:43 np0005604215.localdomain sudo[124473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zzfwyxjtiurupnemasibdlanaphsizmu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936863.0965126-368-190518127859884/AnsiballZ_ini_file.py
Feb 01 09:07:43 np0005604215.localdomain sudo[124473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:43 np0005604215.localdomain python3.9[124475]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:07:43 np0005604215.localdomain sudo[124473]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:43 np0005604215.localdomain sudo[124565]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpbdzjwscgawgecbzptppyprwcrvnhgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936863.6946945-368-108937624477834/AnsiballZ_ini_file.py
Feb 01 09:07:43 np0005604215.localdomain sudo[124565]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:44 np0005604215.localdomain python3.9[124567]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:07:44 np0005604215.localdomain sudo[124565]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9117 DF PROTO=TCP SPT=50094 DPT=9882 SEQ=3801450267 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E8B0D0000000001030307) 
Feb 01 09:07:45 np0005604215.localdomain python3.9[124657]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:07:45 np0005604215.localdomain sudo[124749]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tffvdxajdarlafkmhekmzfpoowpaolqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936865.5526767-488-80548926698929/AnsiballZ_dnf.py
Feb 01 09:07:45 np0005604215.localdomain sudo[124749]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:46 np0005604215.localdomain python3.9[124751]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 01 09:07:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32027 DF PROTO=TCP SPT=50502 DPT=9102 SEQ=972267169 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64E990D0000000001030307) 
Feb 01 09:07:49 np0005604215.localdomain sudo[124749]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:49 np0005604215.localdomain sudo[124843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxpqfghgapttihxkgoxymfdkypipqxai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936869.5311506-512-121450289179017/AnsiballZ_dnf.py
Feb 01 09:07:49 np0005604215.localdomain sudo[124843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:50 np0005604215.localdomain python3.9[124845]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 01 09:07:51 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58562 DF PROTO=TCP SPT=58548 DPT=9100 SEQ=1039894359 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64EA50E0000000001030307) 
Feb 01 09:07:53 np0005604215.localdomain sudo[124843]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:53 np0005604215.localdomain sudo[124937]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgjvbxjdouusmzhmehkggzgdrvwaupdy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936873.694982-542-174758922429591/AnsiballZ_dnf.py
Feb 01 09:07:53 np0005604215.localdomain sudo[124937]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:54 np0005604215.localdomain python3.9[124939]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 01 09:07:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61492 DF PROTO=TCP SPT=41400 DPT=9101 SEQ=282903222 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64EB30D0000000001030307) 
Feb 01 09:07:57 np0005604215.localdomain sudo[124937]: pam_unix(sudo:session): session closed for user root
Feb 01 09:07:58 np0005604215.localdomain sudo[125037]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wgsviudedabtxoyhffpqgtkwklpdaeah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936877.8631275-569-125272645054695/AnsiballZ_dnf.py
Feb 01 09:07:58 np0005604215.localdomain sudo[125037]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:07:58 np0005604215.localdomain python3.9[125039]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 01 09:08:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20840 DF PROTO=TCP SPT=42994 DPT=9882 SEQ=3843553390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64EC4ED0000000001030307) 
Feb 01 09:08:00 np0005604215.localdomain sshd[125042]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:08:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20841 DF PROTO=TCP SPT=42994 DPT=9882 SEQ=3843553390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64EC90D0000000001030307) 
Feb 01 09:08:01 np0005604215.localdomain sshd[125042]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 09:08:01 np0005604215.localdomain sudo[125037]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:02 np0005604215.localdomain sudo[125133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qcmowbuwepfxuursivgkrjpxvquomavc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936882.28135-605-270687661330984/AnsiballZ_dnf.py
Feb 01 09:08:02 np0005604215.localdomain sudo[125133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:02 np0005604215.localdomain python3.9[125135]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 01 09:08:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20842 DF PROTO=TCP SPT=42994 DPT=9882 SEQ=3843553390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64ED10D0000000001030307) 
Feb 01 09:08:05 np0005604215.localdomain sudo[125133]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:06 np0005604215.localdomain sudo[125227]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vlsbjdzbuosidyybpdcrygoyrpswsyhk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936886.2794206-632-58130915962034/AnsiballZ_dnf.py
Feb 01 09:08:06 np0005604215.localdomain sudo[125227]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43412 DF PROTO=TCP SPT=33952 DPT=9102 SEQ=2493936316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64EDE8D0000000001030307) 
Feb 01 09:08:06 np0005604215.localdomain python3.9[125229]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 01 09:08:08 np0005604215.localdomain sshd[125232]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:08:09 np0005604215.localdomain sshd[125232]: Received disconnect from 85.206.171.113 port 35742:11: Bye Bye [preauth]
Feb 01 09:08:09 np0005604215.localdomain sshd[125232]: Disconnected from authenticating user root 85.206.171.113 port 35742 [preauth]
Feb 01 09:08:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3615 DF PROTO=TCP SPT=50860 DPT=9100 SEQ=3681374626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64EEB0D0000000001030307) 
Feb 01 09:08:09 np0005604215.localdomain sudo[125227]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:10 np0005604215.localdomain sudo[125323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zkctrqxttaownqjatqopzypynaqbzalf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936890.3365054-659-180816008832348/AnsiballZ_dnf.py
Feb 01 09:08:10 np0005604215.localdomain sudo[125323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:10 np0005604215.localdomain python3.9[125325]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 01 09:08:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36417 DF PROTO=TCP SPT=49922 DPT=9101 SEQ=2955079836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64EF80D0000000001030307) 
Feb 01 09:08:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20844 DF PROTO=TCP SPT=42994 DPT=9882 SEQ=3843553390 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F010D0000000001030307) 
Feb 01 09:08:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43414 DF PROTO=TCP SPT=33952 DPT=9102 SEQ=2493936316 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F0F0D0000000001030307) 
Feb 01 09:08:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3617 DF PROTO=TCP SPT=50860 DPT=9100 SEQ=3681374626 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F1B0E0000000001030307) 
Feb 01 09:08:22 np0005604215.localdomain sudo[125323]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:23 np0005604215.localdomain sudo[125491]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-riarcbkazwstonsestbefnsuvtmlblua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936903.3407066-701-192076661729561/AnsiballZ_file.py
Feb 01 09:08:23 np0005604215.localdomain sudo[125491]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:23 np0005604215.localdomain python3.9[125493]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:08:23 np0005604215.localdomain sudo[125491]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:24 np0005604215.localdomain sudo[125596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xetoicymwbekbbjyuamiqtjbekouaucp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936904.05408-725-8276830412287/AnsiballZ_stat.py
Feb 01 09:08:24 np0005604215.localdomain sudo[125596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:24 np0005604215.localdomain python3.9[125598]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:08:24 np0005604215.localdomain sudo[125596]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:24 np0005604215.localdomain sudo[125669]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qubmcdbsjssfqghrwoxrckiqkewaedpb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936904.05408-725-8276830412287/AnsiballZ_copy.py
Feb 01 09:08:24 np0005604215.localdomain sudo[125669]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:25 np0005604215.localdomain python3.9[125671]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1769936904.05408-725-8276830412287/.source.json _original_basename=._dg90376 follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:08:25 np0005604215.localdomain sudo[125669]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36419 DF PROTO=TCP SPT=49922 DPT=9101 SEQ=2955079836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F290D0000000001030307) 
Feb 01 09:08:25 np0005604215.localdomain sudo[125761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-twzhkstibglupvlwzbywfkjtryfgbikw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936905.4830737-779-83562066826488/AnsiballZ_podman_image.py
Feb 01 09:08:25 np0005604215.localdomain sudo[125761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:26 np0005604215.localdomain python3.9[125763]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 01 09:08:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51976 DF PROTO=TCP SPT=54358 DPT=9882 SEQ=3336363457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F3A1D0000000001030307) 
Feb 01 09:08:30 np0005604215.localdomain systemd-journald[47940]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 77.5 (258 of 333 items), suggesting rotation.
Feb 01 09:08:30 np0005604215.localdomain systemd-journald[47940]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 01 09:08:30 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 09:08:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51977 DF PROTO=TCP SPT=54358 DPT=9882 SEQ=3336363457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F3E0D0000000001030307) 
Feb 01 09:08:31 np0005604215.localdomain podman[125775]: 2026-02-01 09:08:26.196099361 +0000 UTC m=+0.039095555 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 01 09:08:32 np0005604215.localdomain sudo[125761]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51978 DF PROTO=TCP SPT=54358 DPT=9882 SEQ=3336363457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F460D0000000001030307) 
Feb 01 09:08:33 np0005604215.localdomain sudo[125974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nbdcvexgghraoafulkklntlasqzkdgck ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936912.7308564-812-130467681354020/AnsiballZ_podman_image.py
Feb 01 09:08:33 np0005604215.localdomain sudo[125974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:33 np0005604215.localdomain python3.9[125976]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 01 09:08:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51559 DF PROTO=TCP SPT=42946 DPT=9102 SEQ=2849249556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F53CD0000000001030307) 
Feb 01 09:08:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58046 DF PROTO=TCP SPT=59062 DPT=9100 SEQ=2362106755 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F604D0000000001030307) 
Feb 01 09:08:40 np0005604215.localdomain sudo[126040]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:08:40 np0005604215.localdomain sudo[126040]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:08:40 np0005604215.localdomain sudo[126040]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:40 np0005604215.localdomain sudo[126055]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:08:40 np0005604215.localdomain sudo[126055]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:08:41 np0005604215.localdomain podman[125991]: 2026-02-01 09:08:33.497796694 +0000 UTC m=+0.021282301 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 01 09:08:41 np0005604215.localdomain sudo[126055]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:41 np0005604215.localdomain sudo[125974]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:42 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36420 DF PROTO=TCP SPT=49922 DPT=9101 SEQ=2955079836 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F690D0000000001030307) 
Feb 01 09:08:42 np0005604215.localdomain sudo[126177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:08:42 np0005604215.localdomain sudo[126177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:08:42 np0005604215.localdomain sudo[126177]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:42 np0005604215.localdomain sudo[126267]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfceastxutimiqlbeytyipjvuruudnrz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936922.0511098-845-19937138228274/AnsiballZ_podman_image.py
Feb 01 09:08:42 np0005604215.localdomain sudo[126267]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:42 np0005604215.localdomain python3.9[126269]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 01 09:08:44 np0005604215.localdomain podman[126281]: 2026-02-01 09:08:42.678041594 +0000 UTC m=+0.047298129 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 01 09:08:44 np0005604215.localdomain sudo[126267]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:45 np0005604215.localdomain sudo[126442]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dsfbglrurfxbabiwubvqiiybtkkxniba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936924.887408-872-165590190424938/AnsiballZ_podman_image.py
Feb 01 09:08:45 np0005604215.localdomain sudo[126442]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:45 np0005604215.localdomain python3.9[126444]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 01 09:08:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51980 DF PROTO=TCP SPT=54358 DPT=9882 SEQ=3336363457 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F770D0000000001030307) 
Feb 01 09:08:46 np0005604215.localdomain podman[126457]: 2026-02-01 09:08:45.537695802 +0000 UTC m=+0.055149394 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:08:46 np0005604215.localdomain sudo[126442]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:48 np0005604215.localdomain sudo[126620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcjvkufnnbbuwztinsfwgggbqbkqvqqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936928.007302-899-107646675683649/AnsiballZ_podman_image.py
Feb 01 09:08:48 np0005604215.localdomain sudo[126620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:48 np0005604215.localdomain python3.9[126622]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 01 09:08:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51561 DF PROTO=TCP SPT=42946 DPT=9102 SEQ=2849249556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F830D0000000001030307) 
Feb 01 09:08:50 np0005604215.localdomain sshd[126662]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:08:50 np0005604215.localdomain sshd[126662]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 09:08:51 np0005604215.localdomain podman[126636]: 2026-02-01 09:08:48.602813549 +0000 UTC m=+0.044055049 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 01 09:08:52 np0005604215.localdomain sudo[126620]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58048 DF PROTO=TCP SPT=59062 DPT=9100 SEQ=2362106755 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F910D0000000001030307) 
Feb 01 09:08:52 np0005604215.localdomain sudo[126814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-viirwmhtnashrcsbvsbshqrauvkxihnm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936932.246626-899-123145426927590/AnsiballZ_podman_image.py
Feb 01 09:08:52 np0005604215.localdomain sudo[126814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:08:52 np0005604215.localdomain python3.9[126816]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Feb 01 09:08:54 np0005604215.localdomain podman[126828]: 2026-02-01 09:08:52.8694345 +0000 UTC m=+0.039621071 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Feb 01 09:08:54 np0005604215.localdomain sudo[126814]: pam_unix(sudo:session): session closed for user root
Feb 01 09:08:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29482 DF PROTO=TCP SPT=37232 DPT=9101 SEQ=3755765723 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64F9D0D0000000001030307) 
Feb 01 09:08:56 np0005604215.localdomain sshd[122946]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:08:56 np0005604215.localdomain systemd[1]: session-39.scope: Deactivated successfully.
Feb 01 09:08:56 np0005604215.localdomain systemd[1]: session-39.scope: Consumed 1min 28.140s CPU time.
Feb 01 09:08:56 np0005604215.localdomain systemd-logind[761]: Session 39 logged out. Waiting for processes to exit.
Feb 01 09:08:56 np0005604215.localdomain systemd-logind[761]: Removed session 39.
Feb 01 09:09:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33598 DF PROTO=TCP SPT=56150 DPT=9882 SEQ=2685807276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64FAF4D0000000001030307) 
Feb 01 09:09:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33599 DF PROTO=TCP SPT=56150 DPT=9882 SEQ=2685807276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64FB34D0000000001030307) 
Feb 01 09:09:01 np0005604215.localdomain sshd[126935]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:09:01 np0005604215.localdomain sshd[126935]: Accepted publickey for zuul from 192.168.122.31 port 37206 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:09:01 np0005604215.localdomain systemd-logind[761]: New session 40 of user zuul.
Feb 01 09:09:01 np0005604215.localdomain systemd[1]: Started Session 40 of User zuul.
Feb 01 09:09:01 np0005604215.localdomain sshd[126935]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:09:02 np0005604215.localdomain python3.9[127028]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:09:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33600 DF PROTO=TCP SPT=56150 DPT=9882 SEQ=2685807276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64FBB4D0000000001030307) 
Feb 01 09:09:04 np0005604215.localdomain sudo[127179]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kdexktbvzqpqawgigyatzsxfjnuuqegk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936943.7429714-65-40059344429074/AnsiballZ_getent.py
Feb 01 09:09:04 np0005604215.localdomain sudo[127179]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:04 np0005604215.localdomain python3.9[127181]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Feb 01 09:09:04 np0005604215.localdomain sudo[127179]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:05 np0005604215.localdomain sudo[127272]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gwiacczzufpjpyqhmgrqkqonjclnmgua ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936945.333541-101-74739333444342/AnsiballZ_setup.py
Feb 01 09:09:05 np0005604215.localdomain sudo[127272]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:05 np0005604215.localdomain python3.9[127274]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:09:06 np0005604215.localdomain sudo[127272]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24183 DF PROTO=TCP SPT=38264 DPT=9102 SEQ=2378727652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64FC8CE0000000001030307) 
Feb 01 09:09:06 np0005604215.localdomain sudo[127326]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvfyqwevpgshyxewcpaucoahuiwwsvnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936945.333541-101-74739333444342/AnsiballZ_dnf.py
Feb 01 09:09:06 np0005604215.localdomain sudo[127326]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:06 np0005604215.localdomain python3.9[127328]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch3.3'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 01 09:09:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42797 DF PROTO=TCP SPT=50738 DPT=9100 SEQ=80647556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64FD54D0000000001030307) 
Feb 01 09:09:09 np0005604215.localdomain sudo[127326]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:11 np0005604215.localdomain sudo[127619]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqvufyrbpmkpxksktmpvzsupttaxhukl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936951.274907-143-189036498484950/AnsiballZ_dnf.py
Feb 01 09:09:11 np0005604215.localdomain sudo[127619]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:11 np0005604215.localdomain python3.9[127621]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:09:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10121 DF PROTO=TCP SPT=41238 DPT=9101 SEQ=3118595880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64FE28E0000000001030307) 
Feb 01 09:09:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33602 DF PROTO=TCP SPT=56150 DPT=9882 SEQ=2685807276 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64FEB0D0000000001030307) 
Feb 01 09:09:15 np0005604215.localdomain sudo[127619]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:16 np0005604215.localdomain sudo[127713]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndfbdtwecpankrnopntvoboknjlswtto ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936956.1564603-168-252095658486978/AnsiballZ_systemd.py
Feb 01 09:09:16 np0005604215.localdomain sudo[127713]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:16 np0005604215.localdomain python3.9[127715]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 01 09:09:17 np0005604215.localdomain sudo[127713]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:18 np0005604215.localdomain python3.9[127808]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:09:18 np0005604215.localdomain sudo[127898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ktojecesuprxsafvshmaffjfalxwacdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936958.3275065-221-162875402208367/AnsiballZ_sefcontext.py
Feb 01 09:09:18 np0005604215.localdomain sudo[127898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=24185 DF PROTO=TCP SPT=38264 DPT=9102 SEQ=2378727652 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA64FF90D0000000001030307) 
Feb 01 09:09:19 np0005604215.localdomain python3.9[127900]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Feb 01 09:09:21 np0005604215.localdomain kernel: SELinux:  Converting 2741 SID table entries...
Feb 01 09:09:21 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 09:09:21 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 09:09:21 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 09:09:21 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 09:09:21 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 09:09:21 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 09:09:21 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 09:09:21 np0005604215.localdomain sudo[127898]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:21 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42799 DF PROTO=TCP SPT=50738 DPT=9100 SEQ=80647556 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650050E0000000001030307) 
Feb 01 09:09:25 np0005604215.localdomain python3.9[127996]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:09:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10123 DF PROTO=TCP SPT=41238 DPT=9101 SEQ=3118595880 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650130D0000000001030307) 
Feb 01 09:09:25 np0005604215.localdomain sudo[128092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tmcmqsdnnpfjnlnzhpcwxkwyfbecsphb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936965.6494224-275-221297008900383/AnsiballZ_dnf.py
Feb 01 09:09:25 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=18 res=1
Feb 01 09:09:25 np0005604215.localdomain sudo[128092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:26 np0005604215.localdomain python3.9[128094]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:09:29 np0005604215.localdomain sshd[128097]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:09:29 np0005604215.localdomain sudo[128092]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:29 np0005604215.localdomain sshd[128097]: Invalid user julie from 85.206.171.113 port 49444
Feb 01 09:09:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49239 DF PROTO=TCP SPT=48082 DPT=9882 SEQ=2390964984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650247C0000000001030307) 
Feb 01 09:09:30 np0005604215.localdomain sshd[128097]: Received disconnect from 85.206.171.113 port 49444:11: Bye Bye [preauth]
Feb 01 09:09:30 np0005604215.localdomain sshd[128097]: Disconnected from invalid user julie 85.206.171.113 port 49444 [preauth]
Feb 01 09:09:30 np0005604215.localdomain sudo[128188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vihgcexawzgvrjzthptmydsdgfnfqbvw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936969.85551-301-256625401713264/AnsiballZ_command.py
Feb 01 09:09:30 np0005604215.localdomain sudo[128188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:30 np0005604215.localdomain python3.9[128190]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:09:31 np0005604215.localdomain sudo[128188]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49240 DF PROTO=TCP SPT=48082 DPT=9882 SEQ=2390964984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6502A810000000001030307) 
Feb 01 09:09:32 np0005604215.localdomain sudo[128433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vfussoomyediwxudcgtynmsiwjveaell ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936971.7884815-324-9552113932084/AnsiballZ_file.py
Feb 01 09:09:32 np0005604215.localdomain sudo[128433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:32 np0005604215.localdomain python3.9[128435]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None attributes=None
Feb 01 09:09:32 np0005604215.localdomain sudo[128433]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:33 np0005604215.localdomain python3.9[128525]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:09:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53840 DF PROTO=TCP SPT=60910 DPT=9105 SEQ=2581934111 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650310D0000000001030307) 
Feb 01 09:09:33 np0005604215.localdomain sudo[128617]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pfszsedwkvdvunuarkdbngfdlhubqrpa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936973.5377243-377-60775684727826/AnsiballZ_dnf.py
Feb 01 09:09:33 np0005604215.localdomain sudo[128617]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:34 np0005604215.localdomain python3.9[128619]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:09:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33349 DF PROTO=TCP SPT=60830 DPT=9102 SEQ=1861018260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6503E0E0000000001030307) 
Feb 01 09:09:37 np0005604215.localdomain sudo[128617]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:37 np0005604215.localdomain sudo[128711]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlyttmffpkpyuzgediknqzszfooplkwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936977.5416741-401-24773969718245/AnsiballZ_dnf.py
Feb 01 09:09:37 np0005604215.localdomain sudo[128711]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:38 np0005604215.localdomain python3.9[128713]: ansible-ansible.legacy.dnf Invoked with name=['openstack-network-scripts'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:09:38 np0005604215.localdomain sshd[128715]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:09:38 np0005604215.localdomain sshd[128715]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 09:09:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48745 DF PROTO=TCP SPT=33710 DPT=9100 SEQ=962843596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6504A8D0000000001030307) 
Feb 01 09:09:41 np0005604215.localdomain sudo[128711]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:41 np0005604215.localdomain sudo[128807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrmngedrrmtcoflqkmzhogyszcvbztjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936981.6632426-425-137814368633571/AnsiballZ_systemd.py
Feb 01 09:09:41 np0005604215.localdomain sudo[128807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:42 np0005604215.localdomain sudo[128810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:09:42 np0005604215.localdomain sudo[128810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:09:42 np0005604215.localdomain sudo[128810]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:42 np0005604215.localdomain sudo[128825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:09:42 np0005604215.localdomain sudo[128825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:09:42 np0005604215.localdomain python3.9[128809]: ansible-ansible.builtin.systemd Invoked with enabled=True name=network daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 01 09:09:42 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:09:42 np0005604215.localdomain systemd-rc-local-generator[128865]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:09:42 np0005604215.localdomain systemd-sysv-generator[128868]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:09:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:09:42 np0005604215.localdomain sudo[128807]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:42 np0005604215.localdomain sudo[128825]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39016 DF PROTO=TCP SPT=38034 DPT=9101 SEQ=30187069 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65057CD0000000001030307) 
Feb 01 09:09:44 np0005604215.localdomain sudo[129001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcqnftbetrbltsvzcdulybjfyqdckroq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936983.843206-455-91174131638834/AnsiballZ_stat.py
Feb 01 09:09:44 np0005604215.localdomain sudo[129001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:44 np0005604215.localdomain python3.9[129003]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:09:44 np0005604215.localdomain sudo[129001]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:44 np0005604215.localdomain sudo[129093]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-laaubrqtoascctsaofdttexjrjknfecl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936984.5849047-482-106266322973229/AnsiballZ_ini_file.py
Feb 01 09:09:44 np0005604215.localdomain sudo[129093]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:45 np0005604215.localdomain python3.9[129095]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:09:45 np0005604215.localdomain sudo[129093]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:45 np0005604215.localdomain sudo[129187]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oalwtonhhmkafuowdmwserabrsifxrdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936985.3742957-506-128815913201076/AnsiballZ_ini_file.py
Feb 01 09:09:45 np0005604215.localdomain sudo[129187]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:45 np0005604215.localdomain python3.9[129189]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:09:45 np0005604215.localdomain sudo[129187]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:46 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49243 DF PROTO=TCP SPT=48082 DPT=9882 SEQ=2390964984 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650630D0000000001030307) 
Feb 01 09:09:46 np0005604215.localdomain sudo[129279]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wvmgwszffaajscmhpczirtlliighjnov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936986.0360866-530-137980677024125/AnsiballZ_ini_file.py
Feb 01 09:09:46 np0005604215.localdomain sudo[129279]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:46 np0005604215.localdomain python3.9[129281]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:09:46 np0005604215.localdomain sudo[129279]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:46 np0005604215.localdomain sudo[129296]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:09:46 np0005604215.localdomain sudo[129296]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:09:46 np0005604215.localdomain sudo[129296]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:47 np0005604215.localdomain sudo[129386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oldwpjxgkfzeclvrhvgbsunsrjjxiegf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936986.9668894-560-141302530002294/AnsiballZ_stat.py
Feb 01 09:09:47 np0005604215.localdomain sudo[129386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:47 np0005604215.localdomain python3.9[129388]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:09:47 np0005604215.localdomain sudo[129386]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:47 np0005604215.localdomain sudo[129459]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfjpvtjwdnvurpbphhejmehvzoohhjjz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936986.9668894-560-141302530002294/AnsiballZ_copy.py
Feb 01 09:09:47 np0005604215.localdomain sudo[129459]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:48 np0005604215.localdomain python3.9[129461]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936986.9668894-560-141302530002294/.source _original_basename=.k56qqdux follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:09:48 np0005604215.localdomain sudo[129459]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:48 np0005604215.localdomain sudo[129551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yfrudwsxhvutxcckfthchyihlngxtpdn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936988.409842-605-107231828913147/AnsiballZ_file.py
Feb 01 09:09:48 np0005604215.localdomain sudo[129551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:48 np0005604215.localdomain python3.9[129553]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:09:48 np0005604215.localdomain sudo[129551]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33351 DF PROTO=TCP SPT=60830 DPT=9102 SEQ=1861018260 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6506F0D0000000001030307) 
Feb 01 09:09:49 np0005604215.localdomain sudo[129643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gtgocdlulxsjvakkhynvbhalcvzshvyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936989.08226-629-197554041019836/AnsiballZ_edpm_os_net_config_mappings.py
Feb 01 09:09:49 np0005604215.localdomain sudo[129643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:49 np0005604215.localdomain python3.9[129645]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Feb 01 09:09:49 np0005604215.localdomain sudo[129643]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:50 np0005604215.localdomain sudo[129735]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utigggwmroxwkchniluparngtwgeqfar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936989.9754257-656-185598502684493/AnsiballZ_file.py
Feb 01 09:09:50 np0005604215.localdomain sudo[129735]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:50 np0005604215.localdomain python3.9[129737]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:09:50 np0005604215.localdomain sudo[129735]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:51 np0005604215.localdomain sudo[129827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxxvipdjxvmigahfrqynpcqdrovxakpy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936990.8912172-686-230972531386785/AnsiballZ_stat.py
Feb 01 09:09:51 np0005604215.localdomain sudo[129827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:51 np0005604215.localdomain python3.9[129829]: ansible-ansible.legacy.stat Invoked with path=/etc/os-net-config/config.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:09:51 np0005604215.localdomain sudo[129827]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:51 np0005604215.localdomain sudo[129900]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qianrascxgzuowtqmhfhcutsgznoqsie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936990.8912172-686-230972531386785/AnsiballZ_copy.py
Feb 01 09:09:51 np0005604215.localdomain sudo[129900]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:51 np0005604215.localdomain python3.9[129902]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/os-net-config/config.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936990.8912172-686-230972531386785/.source.yaml _original_basename=.xhys5tp5 follow=False checksum=4c28d1662755c608a6ffaa942e27a2488c0a78a3 force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:09:51 np0005604215.localdomain sudo[129900]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48747 DF PROTO=TCP SPT=33710 DPT=9100 SEQ=962843596 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6507B0D0000000001030307) 
Feb 01 09:09:52 np0005604215.localdomain sudo[129992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmnfvjtbctwqjzkzplcdjollxmburggk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936992.140479-731-111648971417832/AnsiballZ_slurp.py
Feb 01 09:09:52 np0005604215.localdomain sudo[129992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:52 np0005604215.localdomain python3.9[129994]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Feb 01 09:09:52 np0005604215.localdomain sudo[129992]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:53 np0005604215.localdomain sudo[130098]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tydbckczpxzmjxmezjlffhfamhlemoyy ; ANSIBLE_ASYNC_DIR='~/.ansible_async' /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936993.2521608-758-269011292602205/async_wrapper.py j107560936814 300 /home/zuul/.ansible/tmp/ansible-tmp-1769936993.2521608-758-269011292602205/AnsiballZ_edpm_os_net_config.py _
Feb 01 09:09:53 np0005604215.localdomain sudo[130098]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:54 np0005604215.localdomain ansible-async_wrapper.py[130100]: Invoked with j107560936814 300 /home/zuul/.ansible/tmp/ansible-tmp-1769936993.2521608-758-269011292602205/AnsiballZ_edpm_os_net_config.py _
Feb 01 09:09:54 np0005604215.localdomain ansible-async_wrapper.py[130103]: Starting module and watcher
Feb 01 09:09:54 np0005604215.localdomain ansible-async_wrapper.py[130103]: Start watching 130104 (300)
Feb 01 09:09:54 np0005604215.localdomain ansible-async_wrapper.py[130104]: Start module (130104)
Feb 01 09:09:54 np0005604215.localdomain ansible-async_wrapper.py[130100]: Return async_wrapper task started.
Feb 01 09:09:54 np0005604215.localdomain sudo[130098]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:54 np0005604215.localdomain python3.9[130105]: ansible-edpm_os_net_config Invoked with cleanup=False config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=False
Feb 01 09:09:54 np0005604215.localdomain ansible-async_wrapper.py[130104]: Module complete (130104)
Feb 01 09:09:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39018 DF PROTO=TCP SPT=38034 DPT=9101 SEQ=30187069 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650870D0000000001030307) 
Feb 01 09:09:57 np0005604215.localdomain sudo[130207]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-teglwxsdeabevilbuxztwwxtopnntufx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936997.1739013-758-56352087067705/AnsiballZ_async_status.py
Feb 01 09:09:57 np0005604215.localdomain sudo[130207]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:57 np0005604215.localdomain python3.9[130209]: ansible-ansible.legacy.async_status Invoked with jid=j107560936814.130100 mode=status _async_dir=/root/.ansible_async
Feb 01 09:09:57 np0005604215.localdomain sudo[130207]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:58 np0005604215.localdomain sudo[130266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuenffavdltepltdqagrbctvxrlhxzrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936997.1739013-758-56352087067705/AnsiballZ_async_status.py
Feb 01 09:09:58 np0005604215.localdomain sudo[130266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:58 np0005604215.localdomain python3.9[130268]: ansible-ansible.legacy.async_status Invoked with jid=j107560936814.130100 mode=cleanup _async_dir=/root/.ansible_async
Feb 01 09:09:58 np0005604215.localdomain sudo[130266]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:58 np0005604215.localdomain sudo[130358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tezxwpowyglarfanezcwoymucbhzetzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936998.6342192-824-114211570957403/AnsiballZ_stat.py
Feb 01 09:09:58 np0005604215.localdomain sudo[130358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:59 np0005604215.localdomain ansible-async_wrapper.py[130103]: Done in kid B.
Feb 01 09:09:59 np0005604215.localdomain python3.9[130360]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:09:59 np0005604215.localdomain sudo[130358]: pam_unix(sudo:session): session closed for user root
Feb 01 09:09:59 np0005604215.localdomain sudo[130431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qncxetpqaobbxlcsmpqrzxfwdomqoknf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936998.6342192-824-114211570957403/AnsiballZ_copy.py
Feb 01 09:09:59 np0005604215.localdomain sudo[130431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:09:59 np0005604215.localdomain python3.9[130433]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936998.6342192-824-114211570957403/.source.returncode _original_basename=.gichcccz follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:09:59 np0005604215.localdomain sudo[130431]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47624 DF PROTO=TCP SPT=53862 DPT=9882 SEQ=2007175808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65099AD0000000001030307) 
Feb 01 09:10:00 np0005604215.localdomain sudo[130523]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgdfjmiwhaidztrrtaagqlnzaqecrsqe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936999.9405298-873-245219907110846/AnsiballZ_stat.py
Feb 01 09:10:00 np0005604215.localdomain sudo[130523]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:00 np0005604215.localdomain python3.9[130525]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:10:00 np0005604215.localdomain sudo[130523]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:00 np0005604215.localdomain sudo[130596]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-baduxluxcagbpmyzpkcnpmfqhnkjynhp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769936999.9405298-873-245219907110846/AnsiballZ_copy.py
Feb 01 09:10:00 np0005604215.localdomain sudo[130596]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:00 np0005604215.localdomain python3.9[130598]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769936999.9405298-873-245219907110846/.source.cfg _original_basename=.stgtcv9p follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:10:00 np0005604215.localdomain sudo[130596]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47625 DF PROTO=TCP SPT=53862 DPT=9882 SEQ=2007175808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6509DCE0000000001030307) 
Feb 01 09:10:01 np0005604215.localdomain sudo[130688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qigvhssutrgtleonbvyhfkbxozwbcinx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937001.1925318-918-26804203856348/AnsiballZ_systemd.py
Feb 01 09:10:01 np0005604215.localdomain sudo[130688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:01 np0005604215.localdomain python3.9[130690]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:10:01 np0005604215.localdomain systemd[1]: Reloading Network Manager...
Feb 01 09:10:01 np0005604215.localdomain NetworkManager[5972]: <info>  [1769937001.8844] audit: op="reload" arg="0" pid=130694 uid=0 result="success"
Feb 01 09:10:01 np0005604215.localdomain NetworkManager[5972]: <info>  [1769937001.8851] config: signal: SIGHUP (no changes from disk)
Feb 01 09:10:01 np0005604215.localdomain systemd[1]: Reloaded Network Manager.
Feb 01 09:10:01 np0005604215.localdomain sudo[130688]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:02 np0005604215.localdomain sshd[126935]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:10:02 np0005604215.localdomain systemd[1]: session-40.scope: Deactivated successfully.
Feb 01 09:10:02 np0005604215.localdomain systemd[1]: session-40.scope: Consumed 35.353s CPU time.
Feb 01 09:10:02 np0005604215.localdomain systemd-logind[761]: Session 40 logged out. Waiting for processes to exit.
Feb 01 09:10:02 np0005604215.localdomain systemd-logind[761]: Removed session 40.
Feb 01 09:10:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47626 DF PROTO=TCP SPT=53862 DPT=9882 SEQ=2007175808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650A5CD0000000001030307) 
Feb 01 09:10:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30302 DF PROTO=TCP SPT=49992 DPT=9102 SEQ=2081088930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650B34D0000000001030307) 
Feb 01 09:10:07 np0005604215.localdomain sshd[130709]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:10:07 np0005604215.localdomain sshd[130709]: Accepted publickey for zuul from 192.168.122.31 port 34916 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:10:07 np0005604215.localdomain systemd-logind[761]: New session 41 of user zuul.
Feb 01 09:10:07 np0005604215.localdomain systemd[1]: Started Session 41 of User zuul.
Feb 01 09:10:07 np0005604215.localdomain sshd[130709]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:10:08 np0005604215.localdomain python3.9[130802]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:10:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62741 DF PROTO=TCP SPT=35366 DPT=9100 SEQ=29216029 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650BFCE0000000001030307) 
Feb 01 09:10:09 np0005604215.localdomain python3.9[130896]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:10:11 np0005604215.localdomain python3.9[131041]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:10:11 np0005604215.localdomain sshd[130709]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:10:11 np0005604215.localdomain systemd[1]: session-41.scope: Deactivated successfully.
Feb 01 09:10:11 np0005604215.localdomain systemd[1]: session-41.scope: Consumed 1.944s CPU time.
Feb 01 09:10:11 np0005604215.localdomain systemd-logind[761]: Session 41 logged out. Waiting for processes to exit.
Feb 01 09:10:11 np0005604215.localdomain systemd-logind[761]: Removed session 41.
Feb 01 09:10:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26417 DF PROTO=TCP SPT=35542 DPT=9101 SEQ=3011396480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650CCCD0000000001030307) 
Feb 01 09:10:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47628 DF PROTO=TCP SPT=53862 DPT=9882 SEQ=2007175808 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650D50D0000000001030307) 
Feb 01 09:10:17 np0005604215.localdomain sshd[131057]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:10:17 np0005604215.localdomain sshd[131057]: Accepted publickey for zuul from 192.168.122.31 port 40814 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:10:17 np0005604215.localdomain systemd-logind[761]: New session 42 of user zuul.
Feb 01 09:10:17 np0005604215.localdomain systemd[1]: Started Session 42 of User zuul.
Feb 01 09:10:17 np0005604215.localdomain sshd[131057]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:10:18 np0005604215.localdomain python3.9[131150]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:10:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30304 DF PROTO=TCP SPT=49992 DPT=9102 SEQ=2081088930 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650E30D0000000001030307) 
Feb 01 09:10:19 np0005604215.localdomain python3.9[131244]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:10:20 np0005604215.localdomain sudo[131338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaamlhwhykgmtvrqkawzmofvwqsimupu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937019.65783-77-150596111062594/AnsiballZ_setup.py
Feb 01 09:10:20 np0005604215.localdomain sudo[131338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:20 np0005604215.localdomain python3.9[131340]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:10:20 np0005604215.localdomain sudo[131338]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:21 np0005604215.localdomain sudo[131392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuikvnhdzjkrpmqmvhnhprxnxncqslji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937019.65783-77-150596111062594/AnsiballZ_dnf.py
Feb 01 09:10:21 np0005604215.localdomain sudo[131392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:21 np0005604215.localdomain python3.9[131394]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:10:21 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62743 DF PROTO=TCP SPT=35366 DPT=9100 SEQ=29216029 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650EF0E0000000001030307) 
Feb 01 09:10:24 np0005604215.localdomain sudo[131392]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:24 np0005604215.localdomain sshd[131411]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:10:24 np0005604215.localdomain sshd[131411]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 09:10:24 np0005604215.localdomain sudo[131488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yweaqfsulmambcaxibvaragmnemnbjqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937024.6507633-114-107786422976813/AnsiballZ_setup.py
Feb 01 09:10:24 np0005604215.localdomain sudo[131488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:25 np0005604215.localdomain python3.9[131490]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:10:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26419 DF PROTO=TCP SPT=35542 DPT=9101 SEQ=3011396480 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA650FD0D0000000001030307) 
Feb 01 09:10:25 np0005604215.localdomain sudo[131488]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:26 np0005604215.localdomain sudo[131635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxhunmislffijbiblzhxvbdfvejfqebs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937025.8817124-146-24163238370113/AnsiballZ_file.py
Feb 01 09:10:26 np0005604215.localdomain sudo[131635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:26 np0005604215.localdomain python3.9[131637]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:10:26 np0005604215.localdomain sudo[131635]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:27 np0005604215.localdomain sudo[131727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pywycbcqkfihoydibsniydskordrxnlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937026.6377776-170-209360937846871/AnsiballZ_command.py
Feb 01 09:10:27 np0005604215.localdomain sudo[131727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:27 np0005604215.localdomain python3.9[131729]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:10:27 np0005604215.localdomain sudo[131727]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:27 np0005604215.localdomain sudo[131831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smjnqwpgkwjfihlmziyupjbvfpcyhqgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937027.5360777-194-243652538970648/AnsiballZ_stat.py
Feb 01 09:10:27 np0005604215.localdomain sudo[131831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:28 np0005604215.localdomain python3.9[131833]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:10:28 np0005604215.localdomain sudo[131831]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:28 np0005604215.localdomain sudo[131879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hvojjzzicpzeufyskyhtjfseegixtjuf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937027.5360777-194-243652538970648/AnsiballZ_file.py
Feb 01 09:10:28 np0005604215.localdomain sudo[131879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:28 np0005604215.localdomain python3.9[131881]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:10:28 np0005604215.localdomain sudo[131879]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:29 np0005604215.localdomain sudo[131971]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urqjyrrbubufkadxyoxniqwpywaehtzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937028.7624066-231-180191158446029/AnsiballZ_stat.py
Feb 01 09:10:29 np0005604215.localdomain sudo[131971]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:29 np0005604215.localdomain python3.9[131973]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:10:29 np0005604215.localdomain sudo[131971]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:29 np0005604215.localdomain sudo[132019]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwwegoyawjbhxvrmkouepwkvwfssnipf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937028.7624066-231-180191158446029/AnsiballZ_file.py
Feb 01 09:10:29 np0005604215.localdomain sudo[132019]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:29 np0005604215.localdomain python3.9[132021]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:10:29 np0005604215.localdomain sudo[132019]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36375 DF PROTO=TCP SPT=43050 DPT=9882 SEQ=2122265823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6510EDC0000000001030307) 
Feb 01 09:10:30 np0005604215.localdomain sudo[132111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrhjeiosqavoosdaodgmzzrwjlbbdebw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937029.947931-269-33158369242382/AnsiballZ_ini_file.py
Feb 01 09:10:30 np0005604215.localdomain sudo[132111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:30 np0005604215.localdomain python3.9[132113]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:10:30 np0005604215.localdomain sudo[132111]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:30 np0005604215.localdomain sudo[132203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrimqgfpcwqovsgeajocijhduvnecsea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937030.6766548-269-220176245866903/AnsiballZ_ini_file.py
Feb 01 09:10:30 np0005604215.localdomain sudo[132203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36376 DF PROTO=TCP SPT=43050 DPT=9882 SEQ=2122265823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65112CE0000000001030307) 
Feb 01 09:10:31 np0005604215.localdomain python3.9[132205]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:10:31 np0005604215.localdomain sudo[132203]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:31 np0005604215.localdomain sudo[132295]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmaobyighooyvsdovcvgbhoqpzycisld ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937031.239297-269-50876611206430/AnsiballZ_ini_file.py
Feb 01 09:10:31 np0005604215.localdomain sudo[132295]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:31 np0005604215.localdomain python3.9[132297]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:10:31 np0005604215.localdomain sudo[132295]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:32 np0005604215.localdomain sudo[132387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiopehqzqmqqdyhkqrxetdyelkxzglyb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937031.820484-269-205581339588929/AnsiballZ_ini_file.py
Feb 01 09:10:32 np0005604215.localdomain sudo[132387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:32 np0005604215.localdomain python3.9[132389]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:10:32 np0005604215.localdomain sudo[132387]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36377 DF PROTO=TCP SPT=43050 DPT=9882 SEQ=2122265823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6511ACD0000000001030307) 
Feb 01 09:10:33 np0005604215.localdomain sudo[132479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kgewynaojswioxmemlvkgoxmqppseejb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937033.026696-362-181112713323903/AnsiballZ_dnf.py
Feb 01 09:10:33 np0005604215.localdomain sudo[132479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:33 np0005604215.localdomain python3.9[132481]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:10:36 np0005604215.localdomain sudo[132479]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7917 DF PROTO=TCP SPT=54122 DPT=9102 SEQ=2889597690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651288D0000000001030307) 
Feb 01 09:10:37 np0005604215.localdomain sudo[132573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onehvjzxtauyezrhsrcyfcyjvhuxptat ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937037.3222158-395-261929163385065/AnsiballZ_setup.py
Feb 01 09:10:37 np0005604215.localdomain sudo[132573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:37 np0005604215.localdomain python3.9[132575]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:10:37 np0005604215.localdomain sudo[132573]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:38 np0005604215.localdomain sudo[132667]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bedebczihboguqiktuhddpenecikznjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937038.140026-419-133679412309180/AnsiballZ_stat.py
Feb 01 09:10:38 np0005604215.localdomain sudo[132667]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:38 np0005604215.localdomain python3.9[132669]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:10:38 np0005604215.localdomain sudo[132667]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:39 np0005604215.localdomain sudo[132759]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkyejljfkolnpfupimmhtsfytvmlcocp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937038.9510493-446-174855971987205/AnsiballZ_stat.py
Feb 01 09:10:39 np0005604215.localdomain sudo[132759]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:39 np0005604215.localdomain python3.9[132761]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:10:39 np0005604215.localdomain sudo[132759]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17746 DF PROTO=TCP SPT=49134 DPT=9100 SEQ=892466814 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651350D0000000001030307) 
Feb 01 09:10:40 np0005604215.localdomain sudo[132851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqaboplyretsfcquubwcfrwbcqbzyvcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937039.7689667-476-22993318346168/AnsiballZ_command.py
Feb 01 09:10:40 np0005604215.localdomain sudo[132851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:40 np0005604215.localdomain python3.9[132853]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:10:40 np0005604215.localdomain sudo[132851]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:40 np0005604215.localdomain sudo[132944]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahxsswkchzisudenlswyanjjisbyaqwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937040.5575407-506-141193268945112/AnsiballZ_service_facts.py
Feb 01 09:10:40 np0005604215.localdomain sudo[132944]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:41 np0005604215.localdomain python3.9[132946]: ansible-service_facts Invoked
Feb 01 09:10:41 np0005604215.localdomain network[132963]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 09:10:41 np0005604215.localdomain network[132964]: 'network-scripts' will be removed from distribution in near future.
Feb 01 09:10:41 np0005604215.localdomain network[132965]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 09:10:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:10:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3934 DF PROTO=TCP SPT=50832 DPT=9101 SEQ=1103186030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651420D0000000001030307) 
Feb 01 09:10:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36379 DF PROTO=TCP SPT=43050 DPT=9882 SEQ=2122265823 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6514B0D0000000001030307) 
Feb 01 09:10:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:10:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 5433 writes, 23K keys, 5433 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5433 writes, 751 syncs, 7.23 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 09:10:45 np0005604215.localdomain sudo[132944]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:46 np0005604215.localdomain sudo[133149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:10:46 np0005604215.localdomain sudo[133149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:10:46 np0005604215.localdomain sudo[133149]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:46 np0005604215.localdomain sudo[133194]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkipzrhewbwrnjxazhqmbmgvgvozbflv ; /bin/bash /home/zuul/.ansible/tmp/ansible-tmp-1769937046.7081997-552-180081795389744/AnsiballZ_timesync_provider.sh /home/zuul/.ansible/tmp/ansible-tmp-1769937046.7081997-552-180081795389744/args
Feb 01 09:10:46 np0005604215.localdomain sudo[133194]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:46 np0005604215.localdomain sudo[133193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:10:46 np0005604215.localdomain sudo[133193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:10:47 np0005604215.localdomain sudo[133194]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:47 np0005604215.localdomain sudo[133341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jsbvvjtxmzqthrlyggoedybfxpufyuro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937047.3266525-585-256424184008243/AnsiballZ_dnf.py
Feb 01 09:10:47 np0005604215.localdomain sudo[133341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:47 np0005604215.localdomain sudo[133193]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:47 np0005604215.localdomain python3.9[133348]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:10:47 np0005604215.localdomain sudo[133350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:10:47 np0005604215.localdomain sudo[133350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:10:47 np0005604215.localdomain sudo[133350]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:48 np0005604215.localdomain sudo[133365]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:10:48 np0005604215.localdomain sudo[133365]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:10:48 np0005604215.localdomain sshd[133381]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:10:48 np0005604215.localdomain sudo[133365]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:48 np0005604215.localdomain sshd[133381]: Invalid user hts from 85.206.171.113 port 38022
Feb 01 09:10:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7919 DF PROTO=TCP SPT=54122 DPT=9102 SEQ=2889597690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651590E0000000001030307) 
Feb 01 09:10:49 np0005604215.localdomain sshd[133381]: Received disconnect from 85.206.171.113 port 38022:11: Bye Bye [preauth]
Feb 01 09:10:49 np0005604215.localdomain sshd[133381]: Disconnected from invalid user hts 85.206.171.113 port 38022 [preauth]
Feb 01 09:10:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:10:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 5400.1 total, 600.0 interval
                                                          Cumulative writes: 5223 writes, 23K keys, 5223 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5223 writes, 658 syncs, 7.94 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 09:10:50 np0005604215.localdomain sudo[133341]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:51 np0005604215.localdomain sudo[133416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:10:51 np0005604215.localdomain sudo[133416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:10:51 np0005604215.localdomain sudo[133416]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17748 DF PROTO=TCP SPT=49134 DPT=9100 SEQ=892466814 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651650D0000000001030307) 
Feb 01 09:10:52 np0005604215.localdomain sudo[133506]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-frxvzeapjnzvttaxlundriuehoatfayd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937051.775088-624-190597664374203/AnsiballZ_package_facts.py
Feb 01 09:10:52 np0005604215.localdomain sudo[133506]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:52 np0005604215.localdomain python3.9[133508]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Feb 01 09:10:52 np0005604215.localdomain sudo[133506]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:54 np0005604215.localdomain sudo[133598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjmsthuaxvpcufnszavbnpqvatpopqod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937053.6857133-654-257954298262455/AnsiballZ_stat.py
Feb 01 09:10:54 np0005604215.localdomain sudo[133598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:54 np0005604215.localdomain python3.9[133600]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:10:54 np0005604215.localdomain sudo[133598]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:54 np0005604215.localdomain sudo[133673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqykdzuhaqadzdsxuhbeugbpkydnhfdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937053.6857133-654-257954298262455/AnsiballZ_copy.py
Feb 01 09:10:54 np0005604215.localdomain sudo[133673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:54 np0005604215.localdomain python3.9[133675]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937053.6857133-654-257954298262455/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:10:54 np0005604215.localdomain sudo[133673]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:55 np0005604215.localdomain sudo[133767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-awtylbbllwkvebkebunmozkoxttrekup ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937055.346552-699-202321833157605/AnsiballZ_stat.py
Feb 01 09:10:55 np0005604215.localdomain sudo[133767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3936 DF PROTO=TCP SPT=50832 DPT=9101 SEQ=1103186030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651730D0000000001030307) 
Feb 01 09:10:55 np0005604215.localdomain python3.9[133769]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:10:55 np0005604215.localdomain sudo[133767]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:56 np0005604215.localdomain sudo[133842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yajybqskhvxektshqzxvpzjmmgrkjpnc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937055.346552-699-202321833157605/AnsiballZ_copy.py
Feb 01 09:10:56 np0005604215.localdomain sudo[133842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:56 np0005604215.localdomain python3.9[133844]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937055.346552-699-202321833157605/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:10:56 np0005604215.localdomain sudo[133842]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:57 np0005604215.localdomain sudo[133936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nefpunubrfchampcfelvobexrpgnerfz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937057.4584086-763-278553427548374/AnsiballZ_lineinfile.py
Feb 01 09:10:57 np0005604215.localdomain sudo[133936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:58 np0005604215.localdomain python3.9[133938]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:10:58 np0005604215.localdomain sudo[133936]: pam_unix(sudo:session): session closed for user root
Feb 01 09:10:59 np0005604215.localdomain sudo[134030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-outzyhrxxiydehloeyoajfwdudkftlej ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937059.287426-807-185647460089365/AnsiballZ_setup.py
Feb 01 09:10:59 np0005604215.localdomain sudo[134030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:10:59 np0005604215.localdomain python3.9[134032]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:11:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48920 DF PROTO=TCP SPT=42058 DPT=9882 SEQ=3624156792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651840C0000000001030307) 
Feb 01 09:11:00 np0005604215.localdomain sudo[134030]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:00 np0005604215.localdomain sudo[134084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdkzcypulxkdcpniodfggrxvidbrhdgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937059.287426-807-185647460089365/AnsiballZ_systemd.py
Feb 01 09:11:00 np0005604215.localdomain sudo[134084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48921 DF PROTO=TCP SPT=42058 DPT=9882 SEQ=3624156792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651880D0000000001030307) 
Feb 01 09:11:01 np0005604215.localdomain python3.9[134086]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:11:01 np0005604215.localdomain sudo[134084]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:02 np0005604215.localdomain sudo[134178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojwcraoihnpbnkvaefschipcixfwaliw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937062.2126522-855-22385862446847/AnsiballZ_setup.py
Feb 01 09:11:02 np0005604215.localdomain sudo[134178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:02 np0005604215.localdomain python3.9[134180]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:11:03 np0005604215.localdomain sudo[134178]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48922 DF PROTO=TCP SPT=42058 DPT=9882 SEQ=3624156792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651900D0000000001030307) 
Feb 01 09:11:03 np0005604215.localdomain sudo[134232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trsxhrltvumnunublofngmjmeruxdaie ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937062.2126522-855-22385862446847/AnsiballZ_systemd.py
Feb 01 09:11:03 np0005604215.localdomain sudo[134232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:03 np0005604215.localdomain python3.9[134234]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:11:03 np0005604215.localdomain chronyd[25933]: chronyd exiting
Feb 01 09:11:03 np0005604215.localdomain systemd[1]: Stopping NTP client/server...
Feb 01 09:11:03 np0005604215.localdomain systemd[1]: chronyd.service: Deactivated successfully.
Feb 01 09:11:03 np0005604215.localdomain systemd[1]: Stopped NTP client/server.
Feb 01 09:11:03 np0005604215.localdomain systemd[1]: Starting NTP client/server...
Feb 01 09:11:03 np0005604215.localdomain chronyd[134242]: chronyd version 4.3 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Feb 01 09:11:03 np0005604215.localdomain chronyd[134242]: Frequency -30.790 +/- 0.505 ppm read from /var/lib/chrony/drift
Feb 01 09:11:03 np0005604215.localdomain chronyd[134242]: Loaded seccomp filter (level 2)
Feb 01 09:11:03 np0005604215.localdomain systemd[1]: Started NTP client/server.
Feb 01 09:11:03 np0005604215.localdomain sudo[134232]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:04 np0005604215.localdomain sshd[131057]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:11:04 np0005604215.localdomain systemd[1]: session-42.scope: Deactivated successfully.
Feb 01 09:11:04 np0005604215.localdomain systemd[1]: session-42.scope: Consumed 27.517s CPU time.
Feb 01 09:11:04 np0005604215.localdomain systemd-logind[761]: Session 42 logged out. Waiting for processes to exit.
Feb 01 09:11:04 np0005604215.localdomain systemd-logind[761]: Removed session 42.
Feb 01 09:11:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51770 DF PROTO=TCP SPT=44996 DPT=9102 SEQ=2616082020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6519D8D0000000001030307) 
Feb 01 09:11:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8569 DF PROTO=TCP SPT=42590 DPT=9100 SEQ=4037927389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651AA0D0000000001030307) 
Feb 01 09:11:09 np0005604215.localdomain sshd[134258]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:11:09 np0005604215.localdomain sshd[134258]: Accepted publickey for zuul from 192.168.122.31 port 60000 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:11:10 np0005604215.localdomain systemd-logind[761]: New session 43 of user zuul.
Feb 01 09:11:10 np0005604215.localdomain systemd[1]: Started Session 43 of User zuul.
Feb 01 09:11:10 np0005604215.localdomain sshd[134258]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:11:10 np0005604215.localdomain sshd[134352]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:11:11 np0005604215.localdomain python3.9[134351]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:11:11 np0005604215.localdomain sshd[134352]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 09:11:12 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3937 DF PROTO=TCP SPT=50832 DPT=9101 SEQ=1103186030 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651B30D0000000001030307) 
Feb 01 09:11:12 np0005604215.localdomain sudo[134447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-litfpavniluqcmhylcunsxcpxedwsqnk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937072.0269554-56-265526101681040/AnsiballZ_file.py
Feb 01 09:11:12 np0005604215.localdomain sudo[134447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:12 np0005604215.localdomain python3.9[134449]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:12 np0005604215.localdomain sudo[134447]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:13 np0005604215.localdomain sudo[134553]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nmmajunbcojfodlambjkgsbbtyngqnhe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937072.8467436-80-92153513566013/AnsiballZ_stat.py
Feb 01 09:11:13 np0005604215.localdomain sudo[134553]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:13 np0005604215.localdomain python3.9[134555]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:13 np0005604215.localdomain sudo[134553]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:13 np0005604215.localdomain sudo[134601]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjnqffqqakccheijmvsxriypzoomtrlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937072.8467436-80-92153513566013/AnsiballZ_file.py
Feb 01 09:11:13 np0005604215.localdomain sudo[134601]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:13 np0005604215.localdomain python3.9[134603]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.xmvua4r2 recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:14 np0005604215.localdomain sudo[134601]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:14 np0005604215.localdomain sudo[134693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-njvfrojwzjeghdxaezubixcmsxvcjmao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937074.5351927-140-265104662460362/AnsiballZ_stat.py
Feb 01 09:11:14 np0005604215.localdomain sudo[134693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:15 np0005604215.localdomain python3.9[134695]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:15 np0005604215.localdomain sudo[134693]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:15 np0005604215.localdomain sudo[134768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijkkgqrbbxdebwsaiyzuraqlzzhbbdrc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937074.5351927-140-265104662460362/AnsiballZ_copy.py
Feb 01 09:11:15 np0005604215.localdomain sudo[134768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48924 DF PROTO=TCP SPT=42058 DPT=9882 SEQ=3624156792 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651C10D0000000001030307) 
Feb 01 09:11:15 np0005604215.localdomain python3.9[134770]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937074.5351927-140-265104662460362/.source _original_basename=.08twytty follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:15 np0005604215.localdomain sudo[134768]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:16 np0005604215.localdomain sudo[134860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tntrhdidbwgqzkrjkvkmwhqxqsmnuuce ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937076.006132-188-153832667041669/AnsiballZ_file.py
Feb 01 09:11:16 np0005604215.localdomain sudo[134860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:16 np0005604215.localdomain python3.9[134862]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:11:16 np0005604215.localdomain sudo[134860]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:16 np0005604215.localdomain auditd[727]: Audit daemon rotating log files
Feb 01 09:11:17 np0005604215.localdomain sudo[134952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scnfaknaqmruuphxsonbyiznxrkbsdqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937076.7347267-213-197154267585525/AnsiballZ_stat.py
Feb 01 09:11:17 np0005604215.localdomain sudo[134952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:17 np0005604215.localdomain python3.9[134954]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:17 np0005604215.localdomain sudo[134952]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:17 np0005604215.localdomain sudo[135025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pqwhokootfhmykgdbabqkxlnnmcqhdgs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937076.7347267-213-197154267585525/AnsiballZ_copy.py
Feb 01 09:11:17 np0005604215.localdomain sudo[135025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:17 np0005604215.localdomain python3.9[135027]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937076.7347267-213-197154267585525/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:11:17 np0005604215.localdomain sudo[135025]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:18 np0005604215.localdomain sudo[135117]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kzhdjlvlzvtoemaxmyuymbnrzdegqhrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937077.8888273-213-276460387304321/AnsiballZ_stat.py
Feb 01 09:11:18 np0005604215.localdomain sudo[135117]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:18 np0005604215.localdomain python3.9[135119]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:18 np0005604215.localdomain sudo[135117]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:18 np0005604215.localdomain sudo[135190]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edzuvqoclamawhdkizixripfvmgcpnxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937077.8888273-213-276460387304321/AnsiballZ_copy.py
Feb 01 09:11:18 np0005604215.localdomain sudo[135190]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51772 DF PROTO=TCP SPT=44996 DPT=9102 SEQ=2616082020 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651CD0D0000000001030307) 
Feb 01 09:11:18 np0005604215.localdomain python3.9[135192]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937077.8888273-213-276460387304321/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:11:18 np0005604215.localdomain sudo[135190]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:19 np0005604215.localdomain sudo[135282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iztoqdwbmvzlgmgazobvmoljbpqlkrfr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937079.3076706-300-176684936438906/AnsiballZ_file.py
Feb 01 09:11:19 np0005604215.localdomain sudo[135282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:19 np0005604215.localdomain python3.9[135284]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:19 np0005604215.localdomain sudo[135282]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:20 np0005604215.localdomain sudo[135374]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdbozoxrzqndivbvwbqradmlzedxlmdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937080.0196364-323-96863578069649/AnsiballZ_stat.py
Feb 01 09:11:20 np0005604215.localdomain sudo[135374]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:20 np0005604215.localdomain python3.9[135376]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:20 np0005604215.localdomain sudo[135374]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:20 np0005604215.localdomain sudo[135447]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etvvuwmpjghzetzfvfhvdxdotmfsochz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937080.0196364-323-96863578069649/AnsiballZ_copy.py
Feb 01 09:11:20 np0005604215.localdomain sudo[135447]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:21 np0005604215.localdomain python3.9[135449]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937080.0196364-323-96863578069649/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:21 np0005604215.localdomain sudo[135447]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:21 np0005604215.localdomain sudo[135539]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhtptaiisotuaxbmgebylfukfwremfzp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937081.3297405-368-7600610106807/AnsiballZ_stat.py
Feb 01 09:11:21 np0005604215.localdomain sudo[135539]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:21 np0005604215.localdomain python3.9[135541]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:21 np0005604215.localdomain sudo[135539]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:22 np0005604215.localdomain sudo[135612]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-poqhufsisekjmpjxemewvyqarjpgyeoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937081.3297405-368-7600610106807/AnsiballZ_copy.py
Feb 01 09:11:22 np0005604215.localdomain sudo[135612]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8571 DF PROTO=TCP SPT=42590 DPT=9100 SEQ=4037927389 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651DB0D0000000001030307) 
Feb 01 09:11:22 np0005604215.localdomain python3.9[135614]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937081.3297405-368-7600610106807/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:22 np0005604215.localdomain sudo[135612]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:23 np0005604215.localdomain sudo[135704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kxbjoeztauqsqupbcdmhhrcqgllbbngq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937082.6608455-413-23163916214563/AnsiballZ_systemd.py
Feb 01 09:11:23 np0005604215.localdomain sudo[135704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:23 np0005604215.localdomain python3.9[135706]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:11:23 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:11:23 np0005604215.localdomain systemd-rc-local-generator[135727]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:11:23 np0005604215.localdomain systemd-sysv-generator[135732]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:11:23 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:11:23 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:11:23 np0005604215.localdomain systemd-rc-local-generator[135769]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:11:23 np0005604215.localdomain systemd-sysv-generator[135772]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:11:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:11:24 np0005604215.localdomain systemd[1]: Starting EDPM Container Shutdown...
Feb 01 09:11:24 np0005604215.localdomain systemd[1]: Finished EDPM Container Shutdown.
Feb 01 09:11:24 np0005604215.localdomain sudo[135704]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30269 DF PROTO=TCP SPT=33402 DPT=9101 SEQ=2116717942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651E70D0000000001030307) 
Feb 01 09:11:25 np0005604215.localdomain sudo[135873]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnwekneimbaaucdofuupaocrdcihdtqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937085.5569978-438-185206347603244/AnsiballZ_stat.py
Feb 01 09:11:25 np0005604215.localdomain sudo[135873]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:26 np0005604215.localdomain python3.9[135875]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:26 np0005604215.localdomain sudo[135873]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:26 np0005604215.localdomain sudo[135946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zitbijvzaoxgkpvfqxyycxxrtjkgerlu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937085.5569978-438-185206347603244/AnsiballZ_copy.py
Feb 01 09:11:26 np0005604215.localdomain sudo[135946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:26 np0005604215.localdomain python3.9[135948]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937085.5569978-438-185206347603244/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:26 np0005604215.localdomain sudo[135946]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:27 np0005604215.localdomain sudo[136038]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xtyyxajdueqylozwusubummdnsypqkev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937086.8911963-482-10817982704719/AnsiballZ_stat.py
Feb 01 09:11:27 np0005604215.localdomain sudo[136038]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:27 np0005604215.localdomain python3.9[136040]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:27 np0005604215.localdomain sudo[136038]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:27 np0005604215.localdomain sudo[136111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubutrsldpzemtbxszylkbmujgkgksuik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937086.8911963-482-10817982704719/AnsiballZ_copy.py
Feb 01 09:11:27 np0005604215.localdomain sudo[136111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:27 np0005604215.localdomain python3.9[136113]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937086.8911963-482-10817982704719/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:28 np0005604215.localdomain sudo[136111]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:28 np0005604215.localdomain sudo[136203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rfdhqbkxymvmjioxtcrlrmchfpvombzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937088.1869473-528-106739856012254/AnsiballZ_systemd.py
Feb 01 09:11:28 np0005604215.localdomain sudo[136203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:28 np0005604215.localdomain python3.9[136205]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:11:28 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:11:28 np0005604215.localdomain systemd-rc-local-generator[136226]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:11:28 np0005604215.localdomain systemd-sysv-generator[136234]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:11:28 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:11:29 np0005604215.localdomain systemd[1]: Starting Create netns directory...
Feb 01 09:11:29 np0005604215.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 01 09:11:29 np0005604215.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 01 09:11:29 np0005604215.localdomain systemd[1]: Finished Create netns directory.
Feb 01 09:11:29 np0005604215.localdomain sudo[136203]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36903 DF PROTO=TCP SPT=51088 DPT=9882 SEQ=53325184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651F93C0000000001030307) 
Feb 01 09:11:30 np0005604215.localdomain python3.9[136337]: ansible-ansible.builtin.service_facts Invoked
Feb 01 09:11:30 np0005604215.localdomain network[136354]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 09:11:30 np0005604215.localdomain network[136355]: 'network-scripts' will be removed from distribution in near future.
Feb 01 09:11:30 np0005604215.localdomain network[136356]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 09:11:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36904 DF PROTO=TCP SPT=51088 DPT=9882 SEQ=53325184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA651FD4D0000000001030307) 
Feb 01 09:11:31 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:11:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36905 DF PROTO=TCP SPT=51088 DPT=9882 SEQ=53325184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652054E0000000001030307) 
Feb 01 09:11:35 np0005604215.localdomain sudo[136556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnqikttqqryaurluhmdxybnyzcpfvczk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937095.7038262-606-275940723110126/AnsiballZ_stat.py
Feb 01 09:11:35 np0005604215.localdomain sudo[136556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:36 np0005604215.localdomain python3.9[136558]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:36 np0005604215.localdomain sudo[136556]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44201 DF PROTO=TCP SPT=38496 DPT=9102 SEQ=86592264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65212CD0000000001030307) 
Feb 01 09:11:36 np0005604215.localdomain sudo[136631]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lflmxietodzczmfpirslwvgythimdywl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937095.7038262-606-275940723110126/AnsiballZ_copy.py
Feb 01 09:11:36 np0005604215.localdomain sudo[136631]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:36 np0005604215.localdomain python3.9[136633]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937095.7038262-606-275940723110126/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:36 np0005604215.localdomain sudo[136631]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:37 np0005604215.localdomain sudo[136724]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjolftpgklpbwkjuiewaosfiweyjkpou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937097.1356957-650-267845231757401/AnsiballZ_systemd.py
Feb 01 09:11:37 np0005604215.localdomain sudo[136724]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:37 np0005604215.localdomain python3.9[136726]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:11:37 np0005604215.localdomain systemd[1]: Reloading OpenSSH server daemon...
Feb 01 09:11:37 np0005604215.localdomain sshd[118325]: Received SIGHUP; restarting.
Feb 01 09:11:37 np0005604215.localdomain systemd[1]: Reloaded OpenSSH server daemon.
Feb 01 09:11:37 np0005604215.localdomain sshd[118325]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:11:37 np0005604215.localdomain sshd[118325]: Server listening on 0.0.0.0 port 22.
Feb 01 09:11:37 np0005604215.localdomain sshd[118325]: Server listening on :: port 22.
Feb 01 09:11:37 np0005604215.localdomain sudo[136724]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:38 np0005604215.localdomain sudo[136820]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wlufcyaavpkpveofyxvauxikzsiocpmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937098.0908546-675-79589100317481/AnsiballZ_file.py
Feb 01 09:11:38 np0005604215.localdomain sudo[136820]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:38 np0005604215.localdomain python3.9[136822]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:38 np0005604215.localdomain sudo[136820]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:39 np0005604215.localdomain sudo[136912]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctissuhhfwgqozcpwnrdyzuezaqsxnyp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937098.8293376-699-2733109829542/AnsiballZ_stat.py
Feb 01 09:11:39 np0005604215.localdomain sudo[136912]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:39 np0005604215.localdomain python3.9[136914]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:39 np0005604215.localdomain sudo[136912]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:39 np0005604215.localdomain sudo[136985]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjrkxvxwglmizerpnywieiscvynmghvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937098.8293376-699-2733109829542/AnsiballZ_copy.py
Feb 01 09:11:39 np0005604215.localdomain sudo[136985]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28144 DF PROTO=TCP SPT=55938 DPT=9100 SEQ=2373809073 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6521F4D0000000001030307) 
Feb 01 09:11:39 np0005604215.localdomain python3.9[136987]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937098.8293376-699-2733109829542/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:39 np0005604215.localdomain sudo[136985]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:40 np0005604215.localdomain sudo[137077]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wiuhfzwruluhtmldvvmwfqzmkcmrddoq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937100.3487585-752-279985049372645/AnsiballZ_timezone.py
Feb 01 09:11:40 np0005604215.localdomain sudo[137077]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:40 np0005604215.localdomain python3.9[137079]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Feb 01 09:11:40 np0005604215.localdomain systemd[1]: Starting Time & Date Service...
Feb 01 09:11:41 np0005604215.localdomain systemd[1]: Started Time & Date Service.
Feb 01 09:11:41 np0005604215.localdomain sudo[137077]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:41 np0005604215.localdomain sudo[137173]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etflqmiboavjdcnpnmwuydvbiinjnaym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937101.4762342-779-121465907387075/AnsiballZ_file.py
Feb 01 09:11:41 np0005604215.localdomain sudo[137173]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:41 np0005604215.localdomain python3.9[137175]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:41 np0005604215.localdomain sudo[137173]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:42 np0005604215.localdomain sudo[137265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nzlnopcsmzykeuoqrijxlkvmsqxpramr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937102.1820183-804-173387805585602/AnsiballZ_stat.py
Feb 01 09:11:42 np0005604215.localdomain sudo[137265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:42 np0005604215.localdomain python3.9[137267]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:42 np0005604215.localdomain sudo[137265]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:43 np0005604215.localdomain sudo[137338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kunnankaeqbwhcgnoamakhsceiemhips ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937102.1820183-804-173387805585602/AnsiballZ_copy.py
Feb 01 09:11:43 np0005604215.localdomain sudo[137338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18998 DF PROTO=TCP SPT=46222 DPT=9101 SEQ=1641614223 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6522C8D0000000001030307) 
Feb 01 09:11:43 np0005604215.localdomain python3.9[137340]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937102.1820183-804-173387805585602/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:43 np0005604215.localdomain sudo[137338]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:43 np0005604215.localdomain sudo[137430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jwnszffvwzqmdgblvohtxmplbazplbek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937103.5909505-849-137157334630153/AnsiballZ_stat.py
Feb 01 09:11:43 np0005604215.localdomain sudo[137430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:44 np0005604215.localdomain python3.9[137432]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:44 np0005604215.localdomain sudo[137430]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:44 np0005604215.localdomain sudo[137503]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dtbtjusilabihjcgbvgdimnibikdiabw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937103.5909505-849-137157334630153/AnsiballZ_copy.py
Feb 01 09:11:44 np0005604215.localdomain sudo[137503]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:44 np0005604215.localdomain python3.9[137505]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937103.5909505-849-137157334630153/.source.yaml _original_basename=.pngzmdvh follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:44 np0005604215.localdomain sudo[137503]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:45 np0005604215.localdomain sudo[137595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dqflxrteuqvcgwctutmzqxnmpwvqcqdq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937104.8498-894-263492183878018/AnsiballZ_stat.py
Feb 01 09:11:45 np0005604215.localdomain sudo[137595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:45 np0005604215.localdomain python3.9[137597]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36907 DF PROTO=TCP SPT=51088 DPT=9882 SEQ=53325184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652350E0000000001030307) 
Feb 01 09:11:45 np0005604215.localdomain sudo[137595]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:45 np0005604215.localdomain sudo[137670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tilnfrsuwktmokpuggagsuofusvcehfk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937104.8498-894-263492183878018/AnsiballZ_copy.py
Feb 01 09:11:45 np0005604215.localdomain sudo[137670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:45 np0005604215.localdomain python3.9[137672]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937104.8498-894-263492183878018/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:45 np0005604215.localdomain sudo[137670]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:46 np0005604215.localdomain sudo[137762]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymavnwtihyyuvljhuwtlovuckpzpveth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937106.093505-939-29576391343894/AnsiballZ_command.py
Feb 01 09:11:46 np0005604215.localdomain sudo[137762]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:46 np0005604215.localdomain python3.9[137764]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:11:46 np0005604215.localdomain sudo[137762]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:47 np0005604215.localdomain sudo[137855]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvxmtxkrxlxeqdrncagecwgxkofpesxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937106.9687405-963-216107588647889/AnsiballZ_command.py
Feb 01 09:11:47 np0005604215.localdomain sudo[137855]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:47 np0005604215.localdomain python3.9[137857]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:11:47 np0005604215.localdomain sudo[137855]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:48 np0005604215.localdomain sudo[137948]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eyfeykjevfvskdsntggigfbrndchttgx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937107.6464784-987-65561903290305/AnsiballZ_edpm_nftables_from_files.py
Feb 01 09:11:48 np0005604215.localdomain sudo[137948]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:48 np0005604215.localdomain python3[137950]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 01 09:11:48 np0005604215.localdomain sudo[137948]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:48 np0005604215.localdomain sudo[138040]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zksjeuaspzscfcfvdpihmuxqhbrcszrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937108.4793015-1011-134968220902261/AnsiballZ_stat.py
Feb 01 09:11:48 np0005604215.localdomain sudo[138040]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44203 DF PROTO=TCP SPT=38496 DPT=9102 SEQ=86592264 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652430D0000000001030307) 
Feb 01 09:11:48 np0005604215.localdomain python3.9[138042]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:48 np0005604215.localdomain sudo[138040]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:49 np0005604215.localdomain sudo[138113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fldkafjzackgcjqgnophgqzvlhtxdyfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937108.4793015-1011-134968220902261/AnsiballZ_copy.py
Feb 01 09:11:49 np0005604215.localdomain sudo[138113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:49 np0005604215.localdomain python3.9[138115]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937108.4793015-1011-134968220902261/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:49 np0005604215.localdomain sudo[138113]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:50 np0005604215.localdomain sudo[138205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmoktpxgofjruesfswloifhhernjkfrh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937109.689764-1055-269122162516297/AnsiballZ_stat.py
Feb 01 09:11:50 np0005604215.localdomain sudo[138205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:50 np0005604215.localdomain python3.9[138207]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:50 np0005604215.localdomain sudo[138205]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:50 np0005604215.localdomain sudo[138278]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-etngxvzwcvfcyiqcuwmssckyfoerfsdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937109.689764-1055-269122162516297/AnsiballZ_copy.py
Feb 01 09:11:50 np0005604215.localdomain sudo[138278]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:50 np0005604215.localdomain python3.9[138280]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937109.689764-1055-269122162516297/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:50 np0005604215.localdomain sudo[138278]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:51 np0005604215.localdomain sudo[138370]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjvvpeimhrweedjcpyjqmxyjvgrqpryc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937111.0106928-1101-171140909163157/AnsiballZ_stat.py
Feb 01 09:11:51 np0005604215.localdomain sudo[138370]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:51 np0005604215.localdomain python3.9[138372]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:51 np0005604215.localdomain sudo[138370]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:51 np0005604215.localdomain sudo[138400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:11:51 np0005604215.localdomain sudo[138400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:11:51 np0005604215.localdomain sudo[138400]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:51 np0005604215.localdomain sudo[138428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:11:51 np0005604215.localdomain sudo[138428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:11:51 np0005604215.localdomain sudo[138473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fspwmpyojhgotlxxhitzvzhdbosznpyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937111.0106928-1101-171140909163157/AnsiballZ_copy.py
Feb 01 09:11:51 np0005604215.localdomain sudo[138473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:51 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28146 DF PROTO=TCP SPT=55938 DPT=9100 SEQ=2373809073 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6524F0E0000000001030307) 
Feb 01 09:11:52 np0005604215.localdomain python3.9[138475]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937111.0106928-1101-171140909163157/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:52 np0005604215.localdomain sudo[138473]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:52 np0005604215.localdomain sudo[138428]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:52 np0005604215.localdomain sudo[138598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hacdxmbjesoxwrtydewxvsfepigykaxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937112.2959337-1146-203979935508217/AnsiballZ_stat.py
Feb 01 09:11:52 np0005604215.localdomain sudo[138598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:52 np0005604215.localdomain python3.9[138600]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:52 np0005604215.localdomain sudo[138598]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:53 np0005604215.localdomain sudo[138641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:11:53 np0005604215.localdomain sudo[138641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:11:53 np0005604215.localdomain sudo[138641]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:53 np0005604215.localdomain sudo[138664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e -- inventory --format=json-pretty --filter-for-batch
Feb 01 09:11:53 np0005604215.localdomain sudo[138664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:11:53 np0005604215.localdomain sudo[138701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtgtgrszqhisnyelkrrgflmoxhcmlsjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937112.2959337-1146-203979935508217/AnsiballZ_copy.py
Feb 01 09:11:53 np0005604215.localdomain sudo[138701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:53 np0005604215.localdomain python3.9[138703]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937112.2959337-1146-203979935508217/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:53 np0005604215.localdomain sudo[138701]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:53 np0005604215.localdomain podman[138759]: 
Feb 01 09:11:53 np0005604215.localdomain podman[138759]: 2026-02-01 09:11:53.635539988 +0000 UTC m=+0.075998008 container create baeb742de6e520c8b143cad4dadc1e6a9bb0cb0cc7401ac683cd4c78a5ae4fd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_babbage, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, distribution-scope=public, ceph=True, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, version=7, RELEASE=main)
Feb 01 09:11:53 np0005604215.localdomain systemd[1]: Started libpod-conmon-baeb742de6e520c8b143cad4dadc1e6a9bb0cb0cc7401ac683cd4c78a5ae4fd5.scope.
Feb 01 09:11:53 np0005604215.localdomain podman[138759]: 2026-02-01 09:11:53.604587204 +0000 UTC m=+0.045045244 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:11:53 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:11:53 np0005604215.localdomain podman[138759]: 2026-02-01 09:11:53.72422636 +0000 UTC m=+0.164684380 container init baeb742de6e520c8b143cad4dadc1e6a9bb0cb0cc7401ac683cd4c78a5ae4fd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_babbage, vcs-type=git, distribution-scope=public, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, version=7, description=Red Hat Ceph Storage 7, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vendor=Red Hat, Inc., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True)
Feb 01 09:11:53 np0005604215.localdomain systemd[1]: tmp-crun.YrZCSm.mount: Deactivated successfully.
Feb 01 09:11:53 np0005604215.localdomain podman[138759]: 2026-02-01 09:11:53.742948353 +0000 UTC m=+0.183406373 container start baeb742de6e520c8b143cad4dadc1e6a9bb0cb0cc7401ac683cd4c78a5ae4fd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_babbage, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z)
Feb 01 09:11:53 np0005604215.localdomain podman[138759]: 2026-02-01 09:11:53.743249693 +0000 UTC m=+0.183707703 container attach baeb742de6e520c8b143cad4dadc1e6a9bb0cb0cc7401ac683cd4c78a5ae4fd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_babbage, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, release=1764794109, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git)
Feb 01 09:11:53 np0005604215.localdomain unruffled_babbage[138774]: 167 167
Feb 01 09:11:53 np0005604215.localdomain systemd[1]: libpod-baeb742de6e520c8b143cad4dadc1e6a9bb0cb0cc7401ac683cd4c78a5ae4fd5.scope: Deactivated successfully.
Feb 01 09:11:53 np0005604215.localdomain podman[138759]: 2026-02-01 09:11:53.747147244 +0000 UTC m=+0.187605324 container died baeb742de6e520c8b143cad4dadc1e6a9bb0cb0cc7401ac683cd4c78a5ae4fd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_babbage, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-type=git, release=1764794109, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.expose-services=, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, GIT_CLEAN=True, ceph=True, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, name=rhceph, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z)
Feb 01 09:11:53 np0005604215.localdomain podman[138787]: 2026-02-01 09:11:53.838453778 +0000 UTC m=+0.082388958 container remove baeb742de6e520c8b143cad4dadc1e6a9bb0cb0cc7401ac683cd4c78a5ae4fd5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_babbage, ceph=True, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109)
Feb 01 09:11:53 np0005604215.localdomain systemd[1]: libpod-conmon-baeb742de6e520c8b143cad4dadc1e6a9bb0cb0cc7401ac683cd4c78a5ae4fd5.scope: Deactivated successfully.
Feb 01 09:11:54 np0005604215.localdomain podman[138839]: 
Feb 01 09:11:54 np0005604215.localdomain podman[138839]: 2026-02-01 09:11:54.016538264 +0000 UTC m=+0.064813520 container create 6548419d4fd22b49339a05eccf46b4b6a8e325728635f39642afba162af72c7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_bohr, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, release=1764794109, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, ceph=True, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container)
Feb 01 09:11:54 np0005604215.localdomain systemd[1]: Started libpod-conmon-6548419d4fd22b49339a05eccf46b4b6a8e325728635f39642afba162af72c7e.scope.
Feb 01 09:11:54 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:11:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aa27e443812a62a9bed344742514e4e00f14d0322874282f0c00e4424e2740/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 09:11:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aa27e443812a62a9bed344742514e4e00f14d0322874282f0c00e4424e2740/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 09:11:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aa27e443812a62a9bed344742514e4e00f14d0322874282f0c00e4424e2740/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 09:11:54 np0005604215.localdomain podman[138839]: 2026-02-01 09:11:54.076628746 +0000 UTC m=+0.124904022 container init 6548419d4fd22b49339a05eccf46b4b6a8e325728635f39642afba162af72c7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_bohr, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, vendor=Red Hat, Inc., release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, version=7, architecture=x86_64, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:11:54 np0005604215.localdomain podman[138839]: 2026-02-01 09:11:54.084298615 +0000 UTC m=+0.132573861 container start 6548419d4fd22b49339a05eccf46b4b6a8e325728635f39642afba162af72c7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_bohr, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., GIT_BRANCH=main, io.buildah.version=1.41.4, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, version=7, release=1764794109, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git)
Feb 01 09:11:54 np0005604215.localdomain podman[138839]: 2026-02-01 09:11:54.084434449 +0000 UTC m=+0.132709695 container attach 6548419d4fd22b49339a05eccf46b4b6a8e325728635f39642afba162af72c7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_bohr, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, version=7, io.openshift.tags=rhceph ceph, vcs-type=git)
Feb 01 09:11:54 np0005604215.localdomain podman[138839]: 2026-02-01 09:11:53.988246083 +0000 UTC m=+0.036521399 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:11:54 np0005604215.localdomain sudo[138898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-owdpsqibhhuwwjphvzvwrfxulxygherw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937113.7721438-1190-170711815420544/AnsiballZ_stat.py
Feb 01 09:11:54 np0005604215.localdomain sudo[138898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:54 np0005604215.localdomain python3.9[138900]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:11:54 np0005604215.localdomain sudo[138898]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-5b76bd61099ee1a55f4022a929660daf3f366b904d62f5d9c0252ee609c694f2-merged.mount: Deactivated successfully.
Feb 01 09:11:54 np0005604215.localdomain sudo[139219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lhnnzmalicchrnrlwuntmnktsmcjlcsi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937113.7721438-1190-170711815420544/AnsiballZ_copy.py
Feb 01 09:11:54 np0005604215.localdomain sudo[139219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]: [
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:     {
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:         "available": false,
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:         "ceph_device": false,
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:         "lsm_data": {},
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:         "lvs": [],
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:         "path": "/dev/sr0",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:         "rejected_reasons": [
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "Insufficient space (<5GB)",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "Has a FileSystem"
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:         ],
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:         "sys_api": {
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "actuators": null,
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "device_nodes": "sr0",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "human_readable_size": "482.00 KB",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "id_bus": "ata",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "model": "QEMU DVD-ROM",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "nr_requests": "2",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "partitions": {},
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "path": "/dev/sr0",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "removable": "1",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "rev": "2.5+",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "ro": "0",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "rotational": "1",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "sas_address": "",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "sas_device_handle": "",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "scheduler_mode": "mq-deadline",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "sectors": 0,
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "sectorsize": "2048",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "size": 493568.0,
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "support_discard": "0",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "type": "disk",
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:             "vendor": "QEMU"
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:         }
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]:     }
Feb 01 09:11:54 np0005604215.localdomain friendly_bohr[138877]: ]
Feb 01 09:11:54 np0005604215.localdomain python3.9[139391]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937113.7721438-1190-170711815420544/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:54 np0005604215.localdomain systemd[1]: libpod-6548419d4fd22b49339a05eccf46b4b6a8e325728635f39642afba162af72c7e.scope: Deactivated successfully.
Feb 01 09:11:54 np0005604215.localdomain sudo[139219]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:54 np0005604215.localdomain podman[140475]: 2026-02-01 09:11:54.936849748 +0000 UTC m=+0.037392605 container died 6548419d4fd22b49339a05eccf46b4b6a8e325728635f39642afba162af72c7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_bohr, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, RELEASE=main, com.redhat.component=rhceph-container, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, build-date=2025-12-08T17:28:53Z, version=7, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:11:54 np0005604215.localdomain systemd[1]: tmp-crun.sWxAkX.mount: Deactivated successfully.
Feb 01 09:11:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-81aa27e443812a62a9bed344742514e4e00f14d0322874282f0c00e4424e2740-merged.mount: Deactivated successfully.
Feb 01 09:11:54 np0005604215.localdomain podman[140475]: 2026-02-01 09:11:54.970143035 +0000 UTC m=+0.070685872 container remove 6548419d4fd22b49339a05eccf46b4b6a8e325728635f39642afba162af72c7e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=friendly_bohr, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.openshift.tags=rhceph ceph, architecture=x86_64, GIT_CLEAN=True, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.buildah.version=1.41.4)
Feb 01 09:11:54 np0005604215.localdomain systemd[1]: libpod-conmon-6548419d4fd22b49339a05eccf46b4b6a8e325728635f39642afba162af72c7e.scope: Deactivated successfully.
Feb 01 09:11:55 np0005604215.localdomain sudo[138664]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:55 np0005604215.localdomain sudo[140564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:11:55 np0005604215.localdomain sudo[140564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:11:55 np0005604215.localdomain sudo[140564]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:55 np0005604215.localdomain sudo[140593]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfoojlzmgwyrpioenqnifrfzmohoyedb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937115.1079786-1235-7778416130674/AnsiballZ_file.py
Feb 01 09:11:55 np0005604215.localdomain sudo[140593]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:55 np0005604215.localdomain python3.9[140597]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:55 np0005604215.localdomain sudo[140593]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:56 np0005604215.localdomain sudo[140687]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpogpxjqwghjyxpejwugwiveeidgufka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937116.1660492-1260-164101065482952/AnsiballZ_command.py
Feb 01 09:11:56 np0005604215.localdomain sudo[140687]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:56 np0005604215.localdomain python3.9[140689]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:11:56 np0005604215.localdomain sudo[140687]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:57 np0005604215.localdomain sudo[140782]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmfwecutnlfipmhacytqtziclshsfrgm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937116.8682287-1284-28252694578479/AnsiballZ_blockinfile.py
Feb 01 09:11:57 np0005604215.localdomain sudo[140782]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:57 np0005604215.localdomain python3.9[140784]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:57 np0005604215.localdomain sudo[140782]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:58 np0005604215.localdomain sudo[140875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cknfcmhrandbquetgisdvnmdcgqycuja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937117.855055-1310-261307844900816/AnsiballZ_file.py
Feb 01 09:11:58 np0005604215.localdomain sudo[140875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:58 np0005604215.localdomain sshd[140878]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:11:58 np0005604215.localdomain python3.9[140877]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:58 np0005604215.localdomain sudo[140875]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:58 np0005604215.localdomain sshd[140878]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 09:11:58 np0005604215.localdomain sudo[140969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bspvabjpczzrvrqlekjgtfrloqlbaaqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937118.5025494-1310-218083538807235/AnsiballZ_file.py
Feb 01 09:11:58 np0005604215.localdomain sudo[140969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:58 np0005604215.localdomain python3.9[140971]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:11:58 np0005604215.localdomain sudo[140969]: pam_unix(sudo:session): session closed for user root
Feb 01 09:11:59 np0005604215.localdomain sudo[141061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scfptmynkulnnjjuxnkdkgbscjbptrjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937119.225891-1356-35961653085023/AnsiballZ_mount.py
Feb 01 09:11:59 np0005604215.localdomain sudo[141061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:11:59 np0005604215.localdomain python3.9[141063]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 01 09:11:59 np0005604215.localdomain sudo[141061]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60998 DF PROTO=TCP SPT=41384 DPT=9882 SEQ=3312263370 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6526E6C0000000001030307) 
Feb 01 09:12:00 np0005604215.localdomain sudo[141154]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pdoniookvlrbenrjyafimtxxevbtweke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937119.9984658-1356-125317600883155/AnsiballZ_mount.py
Feb 01 09:12:00 np0005604215.localdomain sudo[141154]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:00 np0005604215.localdomain python3.9[141156]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Feb 01 09:12:00 np0005604215.localdomain sudo[141154]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:00 np0005604215.localdomain sshd[134258]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:12:00 np0005604215.localdomain systemd[1]: session-43.scope: Deactivated successfully.
Feb 01 09:12:00 np0005604215.localdomain systemd[1]: session-43.scope: Consumed 28.102s CPU time.
Feb 01 09:12:00 np0005604215.localdomain systemd-logind[761]: Session 43 logged out. Waiting for processes to exit.
Feb 01 09:12:00 np0005604215.localdomain systemd-logind[761]: Removed session 43.
Feb 01 09:12:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25635 DF PROTO=TCP SPT=38470 DPT=9105 SEQ=944910504 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65274760000000001030307) 
Feb 01 09:12:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36908 DF PROTO=TCP SPT=51088 DPT=9882 SEQ=53325184 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652750E0000000001030307) 
Feb 01 09:12:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6068 DF PROTO=TCP SPT=41644 DPT=9102 SEQ=2525634233 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6527C010000000001030307) 
Feb 01 09:12:06 np0005604215.localdomain sshd[141172]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:12:06 np0005604215.localdomain sshd[141172]: Accepted publickey for zuul from 192.168.122.31 port 59748 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:12:06 np0005604215.localdomain systemd-logind[761]: New session 44 of user zuul.
Feb 01 09:12:06 np0005604215.localdomain systemd[1]: Started Session 44 of User zuul.
Feb 01 09:12:06 np0005604215.localdomain sshd[141172]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:12:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59541 DF PROTO=TCP SPT=43214 DPT=9100 SEQ=2497500456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652887C0000000001030307) 
Feb 01 09:12:06 np0005604215.localdomain sudo[141265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-remxewllmlwmgiprekopwxmrjpdvxyuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937126.5354645-23-189003579322189/AnsiballZ_tempfile.py
Feb 01 09:12:06 np0005604215.localdomain sudo[141265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:07 np0005604215.localdomain python3.9[141267]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Feb 01 09:12:07 np0005604215.localdomain sudo[141265]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:08 np0005604215.localdomain sudo[141357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qegbzijkujqahowmwfynwfhrxspvaltl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937128.2351484-95-156216176832800/AnsiballZ_stat.py
Feb 01 09:12:08 np0005604215.localdomain sudo[141357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:08 np0005604215.localdomain python3.9[141359]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:12:08 np0005604215.localdomain sudo[141357]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:09 np0005604215.localdomain sshd[141409]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:12:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=26511 DF PROTO=TCP SPT=42952 DPT=9101 SEQ=4126678113 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652959F0000000001030307) 
Feb 01 09:12:10 np0005604215.localdomain sudo[141453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qyfkrpijvbkyxorkzrqhdenqvbzebaeu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937129.6940122-143-48923344637673/AnsiballZ_slurp.py
Feb 01 09:12:10 np0005604215.localdomain sudo[141453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:10 np0005604215.localdomain python3.9[141455]: ansible-ansible.builtin.slurp Invoked with src=/etc/ssh/ssh_known_hosts
Feb 01 09:12:10 np0005604215.localdomain sudo[141453]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:10 np0005604215.localdomain sshd[141409]: Invalid user miusuario from 85.206.171.113 port 33442
Feb 01 09:12:10 np0005604215.localdomain sshd[141409]: Received disconnect from 85.206.171.113 port 33442:11: Bye Bye [preauth]
Feb 01 09:12:10 np0005604215.localdomain sshd[141409]: Disconnected from invalid user miusuario 85.206.171.113 port 33442 [preauth]
Feb 01 09:12:11 np0005604215.localdomain systemd[1]: systemd-timedated.service: Deactivated successfully.
Feb 01 09:12:11 np0005604215.localdomain sudo[141547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-inwfshaaxlabbvhiysxihrqaqhpnnrfd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937131.2500932-192-19814776055426/AnsiballZ_stat.py
Feb 01 09:12:11 np0005604215.localdomain sudo[141547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:11 np0005604215.localdomain python3.9[141549]: ansible-ansible.legacy.stat Invoked with path=/tmp/ansible._lgttopg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:12:11 np0005604215.localdomain sudo[141547]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:12 np0005604215.localdomain sudo[141622]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ilpjcrcaeehqpionnfetjlfqufqgnkgk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937131.2500932-192-19814776055426/AnsiballZ_copy.py
Feb 01 09:12:12 np0005604215.localdomain sudo[141622]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:12 np0005604215.localdomain python3.9[141624]: ansible-ansible.legacy.copy Invoked with dest=/tmp/ansible._lgttopg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937131.2500932-192-19814776055426/.source._lgttopg _original_basename=.fxs1vthu follow=False checksum=b6259656501c187ae53f530254d9fd01725b4ecf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:12:12 np0005604215.localdomain sudo[141622]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:14 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=30271 DF PROTO=TCP SPT=33402 DPT=9101 SEQ=2116717942 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652A50D0000000001030307) 
Feb 01 09:12:14 np0005604215.localdomain sudo[141714]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jhlnzgfxfcxycuamxgthdsarlsfzvbsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937133.6891072-281-206151418780154/AnsiballZ_setup.py
Feb 01 09:12:14 np0005604215.localdomain sudo[141714]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:14 np0005604215.localdomain python3.9[141716]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:12:14 np0005604215.localdomain sudo[141714]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:15 np0005604215.localdomain sudo[141806]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vqrjndgsomwoenkinuzlyxgbxbiamsoc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937135.5193896-331-116876341816036/AnsiballZ_blockinfile.py
Feb 01 09:12:15 np0005604215.localdomain sudo[141806]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:16 np0005604215.localdomain python3.9[141808]: ansible-ansible.builtin.blockinfile Invoked with block=np0005604212.localdomain,192.168.122.106,np0005604212* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCx/MKX//74FswFkw1c1lfM5mahSRoD4B8bhCZSm2/IQ//syuq+Qpi1sEoMv/N1mOrU8atXNtYkVNozl/ypDe2YJkUS8OTt37bT9A7XnBlfFSc5OwXS7VGHpVWbiMbImJibSV7HjoQP0yA8SvCJCcrI3Eh14+cna8tT1rJ9lOFRHvxLfG52XnzFiNUVDU+TG3uRtWEjY5epI8j/U73tEqdP4OAk7ZQ9riN1nllCCIs9FOErOEw14VW+151TbOCzcm9kvzeQMit9jPXTGqmTPKoidZFLhJwEAXq4M9+DFfKQWkVSqfcU3cvPz6S03lUcpPWiJxgGZiIPXxCdRjvI3bKCm898lFYwZq8EfdAwUFMyhmz4GHSyhMwqZWE46cikXf/skoSrEF8ji3NjmyQL7T304iKenZca6rHDI56veO0+PTzZj/pBiaWBWXlqF0WQLAn804z3yapsLNuR8R4EaREmk1Tc2ESg1//73pCUypwEMQWESHsAJ/LCHhyqNHY6Bjc=
                                                            np0005604212.localdomain,192.168.122.106,np0005604212* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIE8ydwus/1P6AnrixkRz4PJNoZXio9ATjx1wpGE9aUxy
                                                            np0005604212.localdomain,192.168.122.106,np0005604212* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBHuZj1kjh43u8MkLoV7TID8opUYqcB9nbV+TEcV1Khgm9NhSBcQeUlB5GJecVMFtUp1FQn9l3Oxy0aNJL0spiWE=
                                                            np0005604210.localdomain,192.168.122.104,np0005604210* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDeVlqpmEgZX6yoZkE7SzVbEM6MqJe/9qDZPPgFZPb/N85k+uB3cINsoq0pMJYeKjcKY8H56WyuNkVVwVHaouZnJCN4p1rCJmATIDieU8QMDwGucQpbrNRrQWheWQDkmHNIPOxnUDCRgEzDfYiaE4prLHMPKtf8XJAKUKVd6lpZrVSCovGz0UC3U1Le/0N1PJOi4kYEuipVrcfoYHC63A32I+w+7tybU8Rpknhc/UHhdn39PBGuAhbkSf2JEJbLLzLaPkZXT6HOPiBUT9jWKnymCGEcfPjIWOkeelx3fkPoXZCtnYHlSoQSkCVsUmXgHNj7X3+6sJi9+iV/+8jRWQyk6aCC+HjXDhSwxbBUaM9AOimJ9EK7vo8/IK9pQ3gNsEct6rHuvGytACNMWpaT5sRRaVEnS8uz/PL8urB6+59GYGunjAaw8lCQcxw+VNVJaLtj+BpVJZA2EA6XE4fwq7v0s9u0ApIMSyV3DcYzIcDFlT11I5g3RM8vZNipXfnub3U=
                                                            np0005604210.localdomain,192.168.122.104,np0005604210* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOlN+6Wna5zexGzaC7+fSuZYqptFJJzfc4fNurRaPmwC
                                                            np0005604210.localdomain,192.168.122.104,np0005604210* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMf1D0EfcBESlFDd0NV4yvsDLeyI7zSTGShGHjV17TDeMwOZQ9X97P3K+p+QICvUvg8AXGXxFhArHCUmm+iJ0Q8=
                                                            np0005604209.localdomain,192.168.122.103,np0005604209* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDAdXF2/8XBq3bWgr/9swIkzjlkm7PzpC1vdYXglaExGeIUwK5n05/HLobUMrYjOh6yE81+tctBT51wuPLw9qOGf4X3lRx3x0AHUqWSs00OL5nZsMRAd6PknZVyeCWf9jv13mVWIExCYbP8e4VK4M3w1m2xSLFd1aHtGkEUYJKCmacxrxFu2opq+kNCclpMC0BlFeSeX/NZeGwcfVCEyP46JVB9pNDo6D4s98FzzQNtG4DTv8NqE0S8Fj44dajq/80IKXeVEbhVmBikwFGMMEHhsRass2m0Q0rBw1Cv2jqW9hrTO1AWHY2aNDDqr6cKttP27XKfc/unDFFDb0mcc/HRa8JAUYEvuO0FIV6n28+Q5hWoYHAZfMU15U/bQPN1UxbF/MmSIZWvwY+vzCJ+icSJ9qfhDfbd1DttRuV0F3Jdi0jq01TyyPdOz8qT7kKSftD3Awn6BNLlseR8MaOTS+YF4fOnSP/xzj0B+nx/nr5Mrq8+QzKb2YyqdMfWWMGdCw8=
                                                            np0005604209.localdomain,192.168.122.103,np0005604209* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEyfGu/WqJIvC6oouYQjgcrJPk9Bg07JDIkt1JPKTeA0
                                                            np0005604209.localdomain,192.168.122.103,np0005604209* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBH4jrW0M0jOqWvBkwMTs5aJ7MoUwB68xLOHVc4M2y1jfTW9cs2+E3JaFwH6xJLpPXRNwbxblwTFdTeLzxwq3Nwk=
                                                            np0005604211.localdomain,192.168.122.105,np0005604211* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCQ5JUOdiESLpaYomijw3u9LxHN4VxpmenW9EczyVvVdofuEESAIR1Q8BIVkW7gxgVyrzHxOpbaoAS+aZaKazruu7/chC8MkDw1lvfeyQwMZax6UziUan2wIFVTaCc7kITOHrdWkJm+OIvCs/ImtkSgsTmvTiQedvs86ME3gHNyA+7taoDXnH6UCB6d5ex6PzwXsKI03iUVWFfsGP3ZU7r52IBwgrLG+VplbaPBRNNP/RvKULVsokG3UCMd3pjHv3VYBdXPYTFOPf666ZEuxEz+Frz43oXzEhr4W61RN70cAFJDDFoOmBDxXzZqrmF7r1vSV3ojl+aHaVLCGL4Wnjrp9wl5Zq8XCGN/7ttzaZKrjj/flccfBEiYL9odgqp92EjmxsRqG4bFq/nEzS/DTJ88QQVpGQNC2T6bElJVdBIrpZAyv7n5HlwNQwfsltQtzbqe1E32azZb1wq13ajV9Ii7QrVd81nGYFM79NqiVVbXs5NypsJOMQ6ZoqyHK5+yyHk=
                                                            np0005604211.localdomain,192.168.122.105,np0005604211* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILEIcduNL0DMEDOErXXJ0uk29DlGSUk7f/QOEFebs4e8
                                                            np0005604211.localdomain,192.168.122.105,np0005604211* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBCGGOVXInAZnlCFh3rgVH6nrUWtitrkOeovDtC1WeeR/gHrJ+susCZPN3v3pAe5flAEf/hpjySdS/u1PmS0N8Ho=
                                                            np0005604215.localdomain,192.168.122.108,np0005604215* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC/jKlZ/vxfazmNjpekfENGpQi8TTD6ErYy0BH9P8CRIiiKVdA/53XGSAQlY17b4tT5hzyHsUuXDmbv5R98FSy/Fi8F4KrjgogVPhd/zYoMrffr9ydwv+ih2mIyCPjZC+N92i92gM2OBHBXj5vqyh5yl1t4H1LhFab7P/m42K75mcTytGvGTLKXZbcs/1Ot/APGrs5wqg/c9XFQtgBEn6ttSKQ9caqbgUw88VGRkzaHvzheQvtIjZL0AwigTS24tqFx+bF+liSnSaYk1R8TKe1yMNODv5OCUmFYvPqls4Y3AQkpuroQQXHcQCe0QPuz9nGgPebNOxyTHsK66oDWIUskoYIbrZZhjDxlpdzJ+POEU/jXtGox0/0wlpRK7jNN6r4Fzx6uIzxB5SWn/UJ4BYS853pUsC32TeD0pZXfUAzOGUOzQfvYkUCElyRi8zDN4ubwEWnxvCEPaAFihafbviqQwLNFFmth36owDHV2zU/Q/BtW8vrwfx0cPr2A4WvQvp8=
                                                            np0005604215.localdomain,192.168.122.108,np0005604215* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAqxjQs+R8e7wYqi9vXJigqVZC1H7cyvu0Lob0wgHHpY
                                                            np0005604215.localdomain,192.168.122.108,np0005604215* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBpZtz+gA3A28TfIAE9+rHy4sghRBF4nh1U9zBwiez8FWMv0OjVQriiYnYh6sbsEW0tK+yZBRm7xEpd3W14ioec=
                                                            np0005604213.localdomain,192.168.122.107,np0005604213* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDhh44DuXnO4hBZJvT1vLnO8ZhT8GKLkBI0M+Q/lXSbHymnCyNerLMqVRhTb5ZUw07lkP6FtBJS95SUtdJuAbUi4jphShtJfBdicoa+uGqI1icHUQCbtCAACtas0lGeGi5q/q1LfzeuKh+LTRj60W+r2OZoChKxeSWYBQ8gIScKe1HgVCJVEESXwNv4CBs6ffOWVYHE+3JDUA3AN3nX931xw4oLMBkwi0q4sNh9Sb0oS79OX+dKdlGfnPLLWKF9QrLrHYdHVkKtPre9d1BdNkl38gRE45uwrAAxXBfeZjbzzfbUlWb54SZwL8P2ej29L5VAbE/97j1HD6+kUZ5wFb6v9oJyFwq8udFDqO1SUMkW4t1VmwD5G4rIU2+u0yHd4H7//fgbf8WAhPv1Qx5tXEqB6LIHqYCz7RekNQO5Xv8ge/gVMzzlxB0DJP6a4DJ8E0/Djnyzw81L2fmyeriPLqt/n/wHscNr1RRI4T1X2iINRwk5QfrxwTEHhJ00FY1kB90=
                                                            np0005604213.localdomain,192.168.122.107,np0005604213* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHpQ8q5SipY+Tg88mzREiMhmtuvQNv/rHiJfQhVqjy49
                                                            np0005604213.localdomain,192.168.122.107,np0005604213* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBM6lbWtwCks630IMm3N6slgTXAS2/BDd/gLT/86gsZQSUwulBMm6OKfJ9eje+B7RGiNR4je3u2+SDaZwwywpAos=
                                                             create=True mode=0644 path=/tmp/ansible._lgttopg state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:12:16 np0005604215.localdomain sudo[141806]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:17 np0005604215.localdomain sudo[141898]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qefnzewcmwngydoojxkdhkrhbkvfydrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937136.8964984-378-258122472700980/AnsiballZ_command.py
Feb 01 09:12:17 np0005604215.localdomain sudo[141898]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:17 np0005604215.localdomain python3.9[141900]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible._lgttopg' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:12:17 np0005604215.localdomain sudo[141898]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:18 np0005604215.localdomain sudo[141992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uhdhrwtsyppbltjktswhwfvsgcbfnnuq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937138.256078-426-209510528475841/AnsiballZ_file.py
Feb 01 09:12:18 np0005604215.localdomain sudo[141992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:18 np0005604215.localdomain python3.9[141994]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible._lgttopg state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:12:18 np0005604215.localdomain sudo[141992]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:19 np0005604215.localdomain sshd[141172]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:12:19 np0005604215.localdomain systemd[1]: session-44.scope: Deactivated successfully.
Feb 01 09:12:19 np0005604215.localdomain systemd[1]: session-44.scope: Consumed 4.276s CPU time.
Feb 01 09:12:19 np0005604215.localdomain systemd-logind[761]: Session 44 logged out. Waiting for processes to exit.
Feb 01 09:12:19 np0005604215.localdomain systemd-logind[761]: Removed session 44.
Feb 01 09:12:25 np0005604215.localdomain sshd[142009]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:12:25 np0005604215.localdomain sshd[142009]: Accepted publickey for zuul from 192.168.122.31 port 60430 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:12:25 np0005604215.localdomain systemd-logind[761]: New session 45 of user zuul.
Feb 01 09:12:25 np0005604215.localdomain systemd[1]: Started Session 45 of User zuul.
Feb 01 09:12:25 np0005604215.localdomain sshd[142009]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:12:26 np0005604215.localdomain python3.9[142102]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:12:27 np0005604215.localdomain sudo[142196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syvkfrvvuudnyxnqgcbfiqrlhfqwqzye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937147.0716765-53-239885831246560/AnsiballZ_systemd.py
Feb 01 09:12:27 np0005604215.localdomain sudo[142196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:27 np0005604215.localdomain python3.9[142198]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 01 09:12:28 np0005604215.localdomain sudo[142196]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:28 np0005604215.localdomain sudo[142290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpglynyoapgqmwbucdduoemuhxdsnusv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937148.1835144-77-28550621657920/AnsiballZ_systemd.py
Feb 01 09:12:28 np0005604215.localdomain sudo[142290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:28 np0005604215.localdomain python3.9[142292]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:12:28 np0005604215.localdomain sudo[142290]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:29 np0005604215.localdomain sudo[142383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yrbtijqszkqpvzkkjyuswcfpmofzceaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937149.128812-104-239647242320901/AnsiballZ_command.py
Feb 01 09:12:29 np0005604215.localdomain sudo[142383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:29 np0005604215.localdomain python3.9[142385]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:12:29 np0005604215.localdomain sudo[142383]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16520 DF PROTO=TCP SPT=60794 DPT=9882 SEQ=2947710787 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652E39D0000000001030307) 
Feb 01 09:12:30 np0005604215.localdomain sudo[142476]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-urrqroxilyihpbeqxezzqjgpudswmshp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937149.8820288-128-4872653173394/AnsiballZ_stat.py
Feb 01 09:12:30 np0005604215.localdomain sudo[142476]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:30 np0005604215.localdomain python3.9[142478]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:12:30 np0005604215.localdomain sudo[142476]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:30 np0005604215.localdomain sudo[142570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-venzjsjjconvplculmwfjxwncvhgnwye ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937150.7093072-153-201878466379159/AnsiballZ_command.py
Feb 01 09:12:30 np0005604215.localdomain sudo[142570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:31 np0005604215.localdomain python3.9[142572]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:12:31 np0005604215.localdomain sudo[142570]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2590 DF PROTO=TCP SPT=58390 DPT=9105 SEQ=166841420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652E9A80000000001030307) 
Feb 01 09:12:31 np0005604215.localdomain sudo[142665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewxytxctcxsemzdhmwjfqrdshlrvcajm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937151.3812344-176-162217085889647/AnsiballZ_file.py
Feb 01 09:12:31 np0005604215.localdomain sudo[142665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:31 np0005604215.localdomain python3.9[142667]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:12:31 np0005604215.localdomain sudo[142665]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:32 np0005604215.localdomain sshd[142009]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:12:32 np0005604215.localdomain systemd[1]: session-45.scope: Deactivated successfully.
Feb 01 09:12:32 np0005604215.localdomain systemd[1]: session-45.scope: Consumed 3.870s CPU time.
Feb 01 09:12:32 np0005604215.localdomain systemd-logind[761]: Session 45 logged out. Waiting for processes to exit.
Feb 01 09:12:32 np0005604215.localdomain systemd-logind[761]: Removed session 45.
Feb 01 09:12:32 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2591 DF PROTO=TCP SPT=58390 DPT=9105 SEQ=166841420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652EDCD0000000001030307) 
Feb 01 09:12:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22570 DF PROTO=TCP SPT=45074 DPT=9102 SEQ=1475244689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652F1310000000001030307) 
Feb 01 09:12:34 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22571 DF PROTO=TCP SPT=45074 DPT=9102 SEQ=1475244689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652F54E0000000001030307) 
Feb 01 09:12:34 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2592 DF PROTO=TCP SPT=58390 DPT=9105 SEQ=166841420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652F5CD0000000001030307) 
Feb 01 09:12:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22572 DF PROTO=TCP SPT=45074 DPT=9102 SEQ=1475244689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652FD4D0000000001030307) 
Feb 01 09:12:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20465 DF PROTO=TCP SPT=59290 DPT=9100 SEQ=1529206010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA652FDAC0000000001030307) 
Feb 01 09:12:37 np0005604215.localdomain sshd[142682]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:12:37 np0005604215.localdomain sshd[142682]: Accepted publickey for zuul from 192.168.122.30 port 49470 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:12:37 np0005604215.localdomain systemd-logind[761]: New session 46 of user zuul.
Feb 01 09:12:37 np0005604215.localdomain systemd[1]: Started Session 46 of User zuul.
Feb 01 09:12:37 np0005604215.localdomain sshd[142682]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:12:38 np0005604215.localdomain python3.9[142775]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:12:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20467 DF PROTO=TCP SPT=59290 DPT=9100 SEQ=1529206010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65309CD0000000001030307) 
Feb 01 09:12:39 np0005604215.localdomain sudo[142869]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xysxkdrmpqbufutojzwqiwnekgepltbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937159.5677264-59-274628400363305/AnsiballZ_setup.py
Feb 01 09:12:39 np0005604215.localdomain sudo[142869]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:40 np0005604215.localdomain python3.9[142871]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:12:40 np0005604215.localdomain sudo[142869]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:40 np0005604215.localdomain sudo[142923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsgeiuysfslsrwcmvczckexiooandksa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937159.5677264-59-274628400363305/AnsiballZ_dnf.py
Feb 01 09:12:40 np0005604215.localdomain sudo[142923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:41 np0005604215.localdomain python3.9[142925]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Feb 01 09:12:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8392 DF PROTO=TCP SPT=53800 DPT=9101 SEQ=2319487304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65316CD0000000001030307) 
Feb 01 09:12:44 np0005604215.localdomain sudo[142923]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:45 np0005604215.localdomain python3.9[143017]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:12:46 np0005604215.localdomain sudo[143108]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqhcnboexbxpolnyjjmbpzpgjrjxqvpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937166.2197182-122-168925880343867/AnsiballZ_file.py
Feb 01 09:12:46 np0005604215.localdomain sudo[143108]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:46 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2594 DF PROTO=TCP SPT=58390 DPT=9105 SEQ=166841420 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653250D0000000001030307) 
Feb 01 09:12:46 np0005604215.localdomain python3.9[143110]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/reboot_required/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:12:46 np0005604215.localdomain sudo[143108]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:47 np0005604215.localdomain sudo[143200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrbrljuropvfprmupuhptkidfpmqtfya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937167.0640092-146-169559945912464/AnsiballZ_file.py
Feb 01 09:12:47 np0005604215.localdomain sudo[143200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:47 np0005604215.localdomain sshd[143203]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:12:47 np0005604215.localdomain sshd[143203]: fatal: mm_answer_sign: sign: error in libcrypto
Feb 01 09:12:47 np0005604215.localdomain python3.9[143202]: ansible-ansible.builtin.file Invoked with mode=0600 path=/var/lib/openstack/reboot_required/needs_restarting state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:12:47 np0005604215.localdomain sudo[143200]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:48 np0005604215.localdomain sudo[143294]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hxyjuoekasswrtzdyfgcjoydxzxdajxx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937167.7369423-170-71149893675769/AnsiballZ_lineinfile.py
Feb 01 09:12:48 np0005604215.localdomain sudo[143294]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:48 np0005604215.localdomain python3.9[143296]: ansible-ansible.builtin.lineinfile Invoked with dest=/var/lib/openstack/reboot_required/needs_restarting line=Not root, Subscription Management repositories not updated
                                                            Core libraries or services have been updated since boot-up:
                                                              * systemd
                                                            
                                                            Reboot is required to fully utilize these updates.
                                                            More information: https://access.redhat.com/solutions/27943 path=/var/lib/openstack/reboot_required/needs_restarting state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:12:48 np0005604215.localdomain sudo[143294]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22574 DF PROTO=TCP SPT=45074 DPT=9102 SEQ=1475244689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6532D0E0000000001030307) 
Feb 01 09:12:49 np0005604215.localdomain python3.9[143386]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 01 09:12:49 np0005604215.localdomain python3.9[143476]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:12:50 np0005604215.localdomain python3.9[143568]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:12:51 np0005604215.localdomain sshd[142682]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:12:51 np0005604215.localdomain systemd-logind[761]: Session 46 logged out. Waiting for processes to exit.
Feb 01 09:12:51 np0005604215.localdomain systemd[1]: session-46.scope: Deactivated successfully.
Feb 01 09:12:51 np0005604215.localdomain systemd[1]: session-46.scope: Consumed 8.836s CPU time.
Feb 01 09:12:51 np0005604215.localdomain systemd-logind[761]: Removed session 46.
Feb 01 09:12:51 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20469 DF PROTO=TCP SPT=59290 DPT=9100 SEQ=1529206010 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653390D0000000001030307) 
Feb 01 09:12:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8394 DF PROTO=TCP SPT=53800 DPT=9101 SEQ=2319487304 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653470D0000000001030307) 
Feb 01 09:12:55 np0005604215.localdomain sshd[143585]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:12:55 np0005604215.localdomain sudo[143587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:12:55 np0005604215.localdomain sudo[143587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:12:55 np0005604215.localdomain sudo[143587]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:55 np0005604215.localdomain sshd[143585]: Accepted publickey for zuul from 192.168.122.30 port 46984 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:12:55 np0005604215.localdomain systemd-logind[761]: New session 47 of user zuul.
Feb 01 09:12:55 np0005604215.localdomain systemd[1]: Started Session 47 of User zuul.
Feb 01 09:12:55 np0005604215.localdomain sudo[143602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:12:55 np0005604215.localdomain sshd[143585]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:12:55 np0005604215.localdomain sudo[143602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:12:56 np0005604215.localdomain sudo[143602]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:56 np0005604215.localdomain python3.9[143740]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:12:57 np0005604215.localdomain sudo[143759]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:12:57 np0005604215.localdomain sudo[143759]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:12:57 np0005604215.localdomain sudo[143759]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:59 np0005604215.localdomain sudo[143849]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxaoeegvulykwfimkjtifnrtnwairtbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937178.6415417-156-101403588278259/AnsiballZ_file.py
Feb 01 09:12:59 np0005604215.localdomain sudo[143849]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:12:59 np0005604215.localdomain python3.9[143851]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:12:59 np0005604215.localdomain sudo[143849]: pam_unix(sudo:session): session closed for user root
Feb 01 09:12:59 np0005604215.localdomain sudo[143941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wdkeohgnrpmeucfiqsopaqdxxmptsutq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937179.3799179-177-225984705522519/AnsiballZ_stat.py
Feb 01 09:12:59 np0005604215.localdomain sudo[143941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46837 DF PROTO=TCP SPT=44422 DPT=9882 SEQ=2102494178 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65358CC0000000001030307) 
Feb 01 09:13:00 np0005604215.localdomain python3.9[143943]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:00 np0005604215.localdomain sudo[143941]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:00 np0005604215.localdomain sudo[144014]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xusbvqtvcsjwerofmaenmkuxachljsnz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937179.3799179-177-225984705522519/AnsiballZ_copy.py
Feb 01 09:13:00 np0005604215.localdomain sudo[144014]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:00 np0005604215.localdomain python3.9[144016]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937179.3799179-177-225984705522519/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:00 np0005604215.localdomain sudo[144014]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46838 DF PROTO=TCP SPT=44422 DPT=9882 SEQ=2102494178 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6535CCD0000000001030307) 
Feb 01 09:13:01 np0005604215.localdomain sudo[144106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-meuasedxnxdtoncpnutsquxgnqbbckps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937181.0260043-228-9118468654949/AnsiballZ_file.py
Feb 01 09:13:01 np0005604215.localdomain sudo[144106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:01 np0005604215.localdomain python3.9[144108]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-sriov setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:13:01 np0005604215.localdomain sudo[144106]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:01 np0005604215.localdomain sudo[144198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ofrchvsattzymloeuruhqyivjehnpgrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937181.6586397-252-203254520046019/AnsiballZ_stat.py
Feb 01 09:13:01 np0005604215.localdomain sudo[144198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:02 np0005604215.localdomain python3.9[144200]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:02 np0005604215.localdomain sudo[144198]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:02 np0005604215.localdomain sudo[144271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roympuzfazrrpzskfculnbzkmiijpxdj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937181.6586397-252-203254520046019/AnsiballZ_copy.py
Feb 01 09:13:02 np0005604215.localdomain sudo[144271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:02 np0005604215.localdomain python3.9[144273]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937181.6586397-252-203254520046019/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:02 np0005604215.localdomain sudo[144271]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46839 DF PROTO=TCP SPT=44422 DPT=9882 SEQ=2102494178 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65364CD0000000001030307) 
Feb 01 09:13:03 np0005604215.localdomain sudo[144363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksslqhwacisfkvexsiufyyqzhuqgnrnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937182.9341748-303-67430475261693/AnsiballZ_file.py
Feb 01 09:13:03 np0005604215.localdomain sudo[144363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:03 np0005604215.localdomain python3.9[144365]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-dhcp setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:13:03 np0005604215.localdomain sudo[144363]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:03 np0005604215.localdomain sudo[144455]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lgvfzorzvrgvbgmlbnuvosrlfxqrswzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937183.529224-328-163416117337822/AnsiballZ_stat.py
Feb 01 09:13:03 np0005604215.localdomain sudo[144455]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:03 np0005604215.localdomain python3.9[144457]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:03 np0005604215.localdomain sudo[144455]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:04 np0005604215.localdomain sudo[144528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjebehaimeoxdmvxzdwvjrwhrnanxmob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937183.529224-328-163416117337822/AnsiballZ_copy.py
Feb 01 09:13:04 np0005604215.localdomain sudo[144528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:04 np0005604215.localdomain python3.9[144530]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937183.529224-328-163416117337822/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:04 np0005604215.localdomain sudo[144528]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:05 np0005604215.localdomain sudo[144620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvnsvbqmylhdejvripmuorkmqutzrfqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937184.7790031-370-232381261628307/AnsiballZ_file.py
Feb 01 09:13:05 np0005604215.localdomain sudo[144620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:05 np0005604215.localdomain python3.9[144622]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:13:05 np0005604215.localdomain sudo[144620]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:05 np0005604215.localdomain sudo[144712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-feafahxvkaizivapmhwhcukvjhxmrbch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937185.384328-394-190542600162643/AnsiballZ_stat.py
Feb 01 09:13:05 np0005604215.localdomain sudo[144712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:05 np0005604215.localdomain python3.9[144714]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:05 np0005604215.localdomain sudo[144712]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:06 np0005604215.localdomain sudo[144785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nkprnkpgrvcrizuhhgcfcdzfrybxphnj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937185.384328-394-190542600162643/AnsiballZ_copy.py
Feb 01 09:13:06 np0005604215.localdomain sudo[144785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:06 np0005604215.localdomain python3.9[144787]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937185.384328-394-190542600162643/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:06 np0005604215.localdomain sudo[144785]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54095 DF PROTO=TCP SPT=44992 DPT=9102 SEQ=122241056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653724D0000000001030307) 
Feb 01 09:13:06 np0005604215.localdomain sudo[144877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogxqvhvxahjeujpyniverytozswrzprr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937186.599221-439-193256210393583/AnsiballZ_file.py
Feb 01 09:13:06 np0005604215.localdomain sudo[144877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:07 np0005604215.localdomain python3.9[144879]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:13:07 np0005604215.localdomain sudo[144877]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:07 np0005604215.localdomain sudo[144969]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-prcmjcyyfjyiojjrnplkifajskjxuvqd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937187.3565273-463-222454447581263/AnsiballZ_stat.py
Feb 01 09:13:07 np0005604215.localdomain sudo[144969]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:07 np0005604215.localdomain python3.9[144971]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:07 np0005604215.localdomain sudo[144969]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:08 np0005604215.localdomain sudo[145042]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-allslvqduehrcdnvppqrvoojxbazjctq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937187.3565273-463-222454447581263/AnsiballZ_copy.py
Feb 01 09:13:08 np0005604215.localdomain sudo[145042]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:08 np0005604215.localdomain python3.9[145044]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937187.3565273-463-222454447581263/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:08 np0005604215.localdomain sudo[145042]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:08 np0005604215.localdomain sudo[145134]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhexnbykzmcznipbnjovklfbvklxgkmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937188.6392577-512-144599907326168/AnsiballZ_file.py
Feb 01 09:13:08 np0005604215.localdomain sudo[145134]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:09 np0005604215.localdomain python3.9[145136]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:13:09 np0005604215.localdomain sudo[145134]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:09 np0005604215.localdomain sudo[145226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wqrxvorypgalyqrtcjtyevpvvjrrrbsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937189.2451031-538-211459311670180/AnsiballZ_stat.py
Feb 01 09:13:09 np0005604215.localdomain sudo[145226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:09 np0005604215.localdomain python3.9[145228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60149 DF PROTO=TCP SPT=36410 DPT=9100 SEQ=3909379442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6537ECD0000000001030307) 
Feb 01 09:13:09 np0005604215.localdomain sudo[145226]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:10 np0005604215.localdomain sudo[145299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ewyuslifnrrfjkwnpmczkmtiuvinnfyx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937189.2451031-538-211459311670180/AnsiballZ_copy.py
Feb 01 09:13:10 np0005604215.localdomain sudo[145299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:10 np0005604215.localdomain python3.9[145301]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937189.2451031-538-211459311670180/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:10 np0005604215.localdomain sudo[145299]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:10 np0005604215.localdomain sudo[145391]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-meppvovcpkrfjzxupbgclzsniuxxpgcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937190.5319247-588-207215800164933/AnsiballZ_file.py
Feb 01 09:13:10 np0005604215.localdomain sudo[145391]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:10 np0005604215.localdomain python3.9[145393]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:13:11 np0005604215.localdomain sudo[145391]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:11 np0005604215.localdomain sudo[145483]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sravjlxbdkmpuahycnbebumfzstgaejq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937191.12612-613-184721366672374/AnsiballZ_stat.py
Feb 01 09:13:11 np0005604215.localdomain sudo[145483]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:11 np0005604215.localdomain python3.9[145485]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:11 np0005604215.localdomain sudo[145483]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:12 np0005604215.localdomain sudo[145556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yztxhtmebqrouytdbhroygklyxynfwqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937191.12612-613-184721366672374/AnsiballZ_copy.py
Feb 01 09:13:12 np0005604215.localdomain sudo[145556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:12 np0005604215.localdomain python3.9[145558]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937191.12612-613-184721366672374/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:12 np0005604215.localdomain sudo[145556]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:12 np0005604215.localdomain sudo[145648]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlxgffgfjbupcpkhtlegwdhosgsdvhjv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937192.631002-662-205612417898183/AnsiballZ_file.py
Feb 01 09:13:12 np0005604215.localdomain sudo[145648]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:13 np0005604215.localdomain python3.9[145650]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:13:13 np0005604215.localdomain sudo[145648]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31986 DF PROTO=TCP SPT=35728 DPT=9101 SEQ=1042277212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6538C0E0000000001030307) 
Feb 01 09:13:13 np0005604215.localdomain chronyd[134242]: Selected source 216.232.132.95 (pool.ntp.org)
Feb 01 09:13:13 np0005604215.localdomain sudo[145740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihvmbpzvtskqjitbzsnnrtmomnhvvvzc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937193.2733457-687-95803049889184/AnsiballZ_stat.py
Feb 01 09:13:13 np0005604215.localdomain sudo[145740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:13 np0005604215.localdomain python3.9[145742]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:13 np0005604215.localdomain sudo[145740]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:14 np0005604215.localdomain sudo[145813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anajsmepyhatryikcvsjzpkgidcnrbun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937193.2733457-687-95803049889184/AnsiballZ_copy.py
Feb 01 09:13:14 np0005604215.localdomain sudo[145813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:14 np0005604215.localdomain python3.9[145815]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937193.2733457-687-95803049889184/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=3730c7272422aae3617cbe6ef3938e59e92fe8bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:14 np0005604215.localdomain sudo[145813]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:14 np0005604215.localdomain sshd[143585]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:13:14 np0005604215.localdomain systemd-logind[761]: Session 47 logged out. Waiting for processes to exit.
Feb 01 09:13:14 np0005604215.localdomain systemd[1]: session-47.scope: Deactivated successfully.
Feb 01 09:13:14 np0005604215.localdomain systemd[1]: session-47.scope: Consumed 11.520s CPU time.
Feb 01 09:13:14 np0005604215.localdomain systemd-logind[761]: Removed session 47.
Feb 01 09:13:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46841 DF PROTO=TCP SPT=44422 DPT=9882 SEQ=2102494178 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653950D0000000001030307) 
Feb 01 09:13:19 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=54097 DF PROTO=TCP SPT=44992 DPT=9102 SEQ=122241056 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653A30E0000000001030307) 
Feb 01 09:13:20 np0005604215.localdomain sshd[145830]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:13:20 np0005604215.localdomain sshd[145830]: Accepted publickey for zuul from 192.168.122.30 port 36586 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:13:20 np0005604215.localdomain systemd-logind[761]: New session 48 of user zuul.
Feb 01 09:13:20 np0005604215.localdomain systemd[1]: Started Session 48 of User zuul.
Feb 01 09:13:20 np0005604215.localdomain sshd[145830]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:13:21 np0005604215.localdomain sudo[145923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sltqezadaerxbcvcspcryhwfablzdeqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937200.7819943-23-118698266762754/AnsiballZ_file.py
Feb 01 09:13:21 np0005604215.localdomain sudo[145923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:21 np0005604215.localdomain python3.9[145925]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:21 np0005604215.localdomain sudo[145923]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60151 DF PROTO=TCP SPT=36410 DPT=9100 SEQ=3909379442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653AF0E0000000001030307) 
Feb 01 09:13:22 np0005604215.localdomain sudo[146015]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipzulyuxlwzyfrenyiaugcwaleprkjfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937201.867174-60-158141258812831/AnsiballZ_stat.py
Feb 01 09:13:22 np0005604215.localdomain sudo[146015]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:22 np0005604215.localdomain python3.9[146017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:22 np0005604215.localdomain sudo[146015]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:22 np0005604215.localdomain sudo[146088]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsdjmfkodqkkqxrxgkyrogeyokjknkdi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937201.867174-60-158141258812831/AnsiballZ_copy.py
Feb 01 09:13:22 np0005604215.localdomain sudo[146088]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:23 np0005604215.localdomain python3.9[146090]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.client.openstack.keyring mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937201.867174-60-158141258812831/.source.keyring _original_basename=ceph.client.openstack.keyring follow=False checksum=814f759dcc97f4b50c85badaa6f3819c2533c70a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:23 np0005604215.localdomain sudo[146088]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:23 np0005604215.localdomain sudo[146180]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abcfelbikmdhipyvdmrgiwlknbodruhd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937203.310061-60-124996377668773/AnsiballZ_stat.py
Feb 01 09:13:23 np0005604215.localdomain sudo[146180]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:23 np0005604215.localdomain python3.9[146182]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/ceph/ceph.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:23 np0005604215.localdomain sudo[146180]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:24 np0005604215.localdomain sudo[146253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zeinkplgtxcestidlzufjjhpksahagpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937203.310061-60-124996377668773/AnsiballZ_copy.py
Feb 01 09:13:24 np0005604215.localdomain sudo[146253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:24 np0005604215.localdomain python3.9[146255]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/ceph/ceph.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937203.310061-60-124996377668773/.source.conf _original_basename=ceph.conf follow=False checksum=6c8f40813464a566eca7252d9e693fc8375e148c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:24 np0005604215.localdomain sudo[146253]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:24 np0005604215.localdomain sshd[145830]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:13:24 np0005604215.localdomain systemd[1]: session-48.scope: Deactivated successfully.
Feb 01 09:13:24 np0005604215.localdomain systemd[1]: session-48.scope: Consumed 2.274s CPU time.
Feb 01 09:13:24 np0005604215.localdomain systemd-logind[761]: Session 48 logged out. Waiting for processes to exit.
Feb 01 09:13:24 np0005604215.localdomain systemd-logind[761]: Removed session 48.
Feb 01 09:13:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31988 DF PROTO=TCP SPT=35728 DPT=9101 SEQ=1042277212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653BD0D0000000001030307) 
Feb 01 09:13:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59474 DF PROTO=TCP SPT=54120 DPT=9882 SEQ=2905514156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653CDFD0000000001030307) 
Feb 01 09:13:30 np0005604215.localdomain sshd[146270]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:13:30 np0005604215.localdomain sshd[146270]: Accepted publickey for zuul from 192.168.122.30 port 46390 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:13:30 np0005604215.localdomain systemd-logind[761]: New session 49 of user zuul.
Feb 01 09:13:30 np0005604215.localdomain systemd[1]: Started Session 49 of User zuul.
Feb 01 09:13:30 np0005604215.localdomain sshd[146270]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:13:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59475 DF PROTO=TCP SPT=54120 DPT=9882 SEQ=2905514156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653D20D0000000001030307) 
Feb 01 09:13:31 np0005604215.localdomain python3.9[146363]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:13:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59476 DF PROTO=TCP SPT=54120 DPT=9882 SEQ=2905514156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653DA0E0000000001030307) 
Feb 01 09:13:33 np0005604215.localdomain sudo[146457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vwkcuynteohdtzrrbrtbmmlmkmzzuapa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937212.1421285-59-219844831511629/AnsiballZ_file.py
Feb 01 09:13:33 np0005604215.localdomain sudo[146457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:33 np0005604215.localdomain sshd[146460]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:13:33 np0005604215.localdomain python3.9[146459]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:13:33 np0005604215.localdomain sudo[146457]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:34 np0005604215.localdomain sudo[146551]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yiqfjuqoheculbodpbvzgtxrmrjykyop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937213.9402092-59-31091269092304/AnsiballZ_file.py
Feb 01 09:13:34 np0005604215.localdomain sudo[146551]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:34 np0005604215.localdomain python3.9[146553]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:13:34 np0005604215.localdomain sudo[146551]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:34 np0005604215.localdomain sshd[146460]: Received disconnect from 85.206.171.113 port 34882:11: Bye Bye [preauth]
Feb 01 09:13:34 np0005604215.localdomain sshd[146460]: Disconnected from authenticating user root 85.206.171.113 port 34882 [preauth]
Feb 01 09:13:35 np0005604215.localdomain python3.9[146643]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:13:35 np0005604215.localdomain sudo[146733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fbsojlhvlhhcuvipgoiyidltbmocvzdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937215.3430908-128-193597562396024/AnsiballZ_seboolean.py
Feb 01 09:13:35 np0005604215.localdomain sudo[146733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:36 np0005604215.localdomain python3.9[146735]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 01 09:13:36 np0005604215.localdomain sudo[146733]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25548 DF PROTO=TCP SPT=37740 DPT=9102 SEQ=3613580980 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653E78E0000000001030307) 
Feb 01 09:13:36 np0005604215.localdomain sudo[146825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dywvyauvrquqfczviuyclyesbectswyu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937216.4854383-158-182179418362935/AnsiballZ_setup.py
Feb 01 09:13:36 np0005604215.localdomain sudo[146825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:37 np0005604215.localdomain python3.9[146827]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:13:37 np0005604215.localdomain sudo[146825]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:37 np0005604215.localdomain sudo[146879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bwzzvwyqkxhuqhkgvkrnccjheegvvwhb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937216.4854383-158-182179418362935/AnsiballZ_dnf.py
Feb 01 09:13:37 np0005604215.localdomain sudo[146879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:38 np0005604215.localdomain python3.9[146881]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:13:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43228 DF PROTO=TCP SPT=36978 DPT=9100 SEQ=3856211447 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653F40D0000000001030307) 
Feb 01 09:13:41 np0005604215.localdomain sudo[146879]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:42 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31989 DF PROTO=TCP SPT=35728 DPT=9101 SEQ=1042277212 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA653FD0D0000000001030307) 
Feb 01 09:13:42 np0005604215.localdomain sudo[146973]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qdoamxoulbzahdgmcadxwjssuudqepif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937221.4323456-194-248989952082359/AnsiballZ_systemd.py
Feb 01 09:13:42 np0005604215.localdomain sudo[146973]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:42 np0005604215.localdomain python3.9[146975]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 01 09:13:43 np0005604215.localdomain sudo[146973]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:44 np0005604215.localdomain sudo[147068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtafojaghktciarsyerhosdiijrcxnfv ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937223.8229263-218-168780683042423/AnsiballZ_edpm_nftables_snippet.py
Feb 01 09:13:44 np0005604215.localdomain sudo[147068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:44 np0005604215.localdomain python3[147070]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks
                                                            rule:
                                                              proto: udp
                                                              dport: 4789
                                                          - rule_name: 119 neutron geneve networks
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              state: ["UNTRACKED"]
                                                          - rule_name: 120 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: OUTPUT
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                          - rule_name: 121 neutron geneve networks no conntrack
                                                            rule:
                                                              proto: udp
                                                              dport: 6081
                                                              table: raw
                                                              chain: PREROUTING
                                                              jump: NOTRACK
                                                              action: append
                                                              state: []
                                                           dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Feb 01 09:13:44 np0005604215.localdomain sudo[147068]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:45 np0005604215.localdomain sudo[147160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-myiyksjbecsquoxstlxacescjdsfcwia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937224.7974017-245-126003684027285/AnsiballZ_file.py
Feb 01 09:13:45 np0005604215.localdomain sudo[147160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:45 np0005604215.localdomain python3.9[147162]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:45 np0005604215.localdomain sudo[147160]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=59478 DF PROTO=TCP SPT=54120 DPT=9882 SEQ=2905514156 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6540B0D0000000001030307) 
Feb 01 09:13:45 np0005604215.localdomain sudo[147252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkurtqnzkkmzicrhpayiirlosdbqyqnb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937225.4756143-270-237929068107026/AnsiballZ_stat.py
Feb 01 09:13:45 np0005604215.localdomain sudo[147252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:46 np0005604215.localdomain python3.9[147254]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:46 np0005604215.localdomain sudo[147252]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:46 np0005604215.localdomain sudo[147300]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xoylotfpcbtokrlvnicpsuizxzssrhzy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937225.4756143-270-237929068107026/AnsiballZ_file.py
Feb 01 09:13:46 np0005604215.localdomain sudo[147300]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:46 np0005604215.localdomain python3.9[147302]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:46 np0005604215.localdomain sudo[147300]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:47 np0005604215.localdomain sudo[147392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fkinjkenpcozrlltjlqlkgwztkislvut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937226.8461692-306-24268383734001/AnsiballZ_stat.py
Feb 01 09:13:47 np0005604215.localdomain sudo[147392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:47 np0005604215.localdomain python3.9[147394]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:47 np0005604215.localdomain sudo[147392]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:47 np0005604215.localdomain sudo[147440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqphjejdfauamlzrwvzpwyyvjbhhjclx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937226.8461692-306-24268383734001/AnsiballZ_file.py
Feb 01 09:13:47 np0005604215.localdomain sudo[147440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:48 np0005604215.localdomain python3.9[147442]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=._7bmvx44 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:48 np0005604215.localdomain sudo[147440]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:48 np0005604215.localdomain sudo[147532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zwdzhrkbfaqcppcxoqjrhcsqwvtssmsq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937228.2168977-341-230668872335367/AnsiballZ_stat.py
Feb 01 09:13:48 np0005604215.localdomain sudo[147532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:48 np0005604215.localdomain python3.9[147534]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:48 np0005604215.localdomain sudo[147532]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25550 DF PROTO=TCP SPT=37740 DPT=9102 SEQ=3613580980 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654170D0000000001030307) 
Feb 01 09:13:48 np0005604215.localdomain sudo[147580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abkjftxcwuoboxqdsvsygiiqvzkrvlaj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937228.2168977-341-230668872335367/AnsiballZ_file.py
Feb 01 09:13:48 np0005604215.localdomain sudo[147580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:49 np0005604215.localdomain python3.9[147582]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:49 np0005604215.localdomain sudo[147580]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:49 np0005604215.localdomain sudo[147672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuwnxtwsnhrikqpgagdfisfgugnwhmzm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937229.3881843-380-128073684985595/AnsiballZ_command.py
Feb 01 09:13:49 np0005604215.localdomain sudo[147672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:50 np0005604215.localdomain python3.9[147674]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:13:50 np0005604215.localdomain sudo[147672]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:50 np0005604215.localdomain sudo[147765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmjdghnelonlkayiigdqivnnxliukqae ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937230.2980068-404-33840039555272/AnsiballZ_edpm_nftables_from_files.py
Feb 01 09:13:50 np0005604215.localdomain sudo[147765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:50 np0005604215.localdomain python3[147767]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 01 09:13:50 np0005604215.localdomain sudo[147765]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:51 np0005604215.localdomain sudo[147857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjggdludvioucjgvxhgflkurwcuimtog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937231.133279-428-82978230770284/AnsiballZ_stat.py
Feb 01 09:13:51 np0005604215.localdomain sudo[147857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:51 np0005604215.localdomain python3.9[147859]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:51 np0005604215.localdomain sudo[147857]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:52 np0005604215.localdomain sudo[147932]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnnzqfuzrymzfeueulyjntmgsouwjqne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937231.133279-428-82978230770284/AnsiballZ_copy.py
Feb 01 09:13:52 np0005604215.localdomain sudo[147932]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43230 DF PROTO=TCP SPT=36978 DPT=9100 SEQ=3856211447 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654250E0000000001030307) 
Feb 01 09:13:52 np0005604215.localdomain python3.9[147934]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937231.133279-428-82978230770284/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:52 np0005604215.localdomain sudo[147932]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:52 np0005604215.localdomain sudo[148024]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvpbjwotkotxdvxaxmomkvtrffyphelq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937232.540732-473-13596121964821/AnsiballZ_stat.py
Feb 01 09:13:52 np0005604215.localdomain sudo[148024]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:53 np0005604215.localdomain python3.9[148026]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:53 np0005604215.localdomain sudo[148024]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:53 np0005604215.localdomain sudo[148099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpkqeoqocyygdczrnsklvwxtwdklokgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937232.540732-473-13596121964821/AnsiballZ_copy.py
Feb 01 09:13:53 np0005604215.localdomain sudo[148099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:53 np0005604215.localdomain python3.9[148101]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937232.540732-473-13596121964821/.source.nft follow=False _original_basename=jump-chain.j2 checksum=ac8dea350c18f51f54d48dacc09613cda4c5540c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:53 np0005604215.localdomain sudo[148099]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:54 np0005604215.localdomain sudo[148191]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsivctsacqsyszaaxowygfsxghhsqtjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937233.7307591-519-224998040110946/AnsiballZ_stat.py
Feb 01 09:13:54 np0005604215.localdomain sudo[148191]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:54 np0005604215.localdomain python3.9[148193]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:54 np0005604215.localdomain sudo[148191]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:54 np0005604215.localdomain sudo[148266]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxtekxotsnarggkociudfpdzamyhuric ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937233.7307591-519-224998040110946/AnsiballZ_copy.py
Feb 01 09:13:54 np0005604215.localdomain sudo[148266]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:54 np0005604215.localdomain python3.9[148268]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937233.7307591-519-224998040110946/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:54 np0005604215.localdomain sudo[148266]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:55 np0005604215.localdomain sudo[148358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywprevrigzpuuuaksgajijswuuwdhtrs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937235.021637-563-167786611964579/AnsiballZ_stat.py
Feb 01 09:13:55 np0005604215.localdomain sudo[148358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21739 DF PROTO=TCP SPT=55952 DPT=9101 SEQ=3486246171 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654310D0000000001030307) 
Feb 01 09:13:55 np0005604215.localdomain python3.9[148360]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:55 np0005604215.localdomain sudo[148358]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:55 np0005604215.localdomain sudo[148433]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ytumberyngvgnieugxspapplyykhruoi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937235.021637-563-167786611964579/AnsiballZ_copy.py
Feb 01 09:13:55 np0005604215.localdomain sudo[148433]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:56 np0005604215.localdomain python3.9[148435]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937235.021637-563-167786611964579/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:56 np0005604215.localdomain sudo[148433]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:56 np0005604215.localdomain sudo[148525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vstxayetgwamwmmpfqvqwyajzauugcas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937236.3777828-608-72438423011113/AnsiballZ_stat.py
Feb 01 09:13:56 np0005604215.localdomain sudo[148525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:56 np0005604215.localdomain python3.9[148527]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:13:56 np0005604215.localdomain sudo[148525]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:57 np0005604215.localdomain sudo[148600]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tirjrsyyzikpbmghsfxildmitegtpmbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937236.3777828-608-72438423011113/AnsiballZ_copy.py
Feb 01 09:13:57 np0005604215.localdomain sudo[148600]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:57 np0005604215.localdomain python3.9[148602]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937236.3777828-608-72438423011113/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:57 np0005604215.localdomain sudo[148600]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:57 np0005604215.localdomain sudo[148649]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:13:57 np0005604215.localdomain sudo[148649]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:13:57 np0005604215.localdomain sudo[148649]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:57 np0005604215.localdomain sudo[148664]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 09:13:57 np0005604215.localdomain sudo[148664]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:13:58 np0005604215.localdomain sudo[148722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayycdysznzfvzinpfrtumbormspupfqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937237.69815-653-258738066948901/AnsiballZ_file.py
Feb 01 09:13:58 np0005604215.localdomain sudo[148722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:58 np0005604215.localdomain python3.9[148724]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:58 np0005604215.localdomain sudo[148722]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:58 np0005604215.localdomain sudo[148664]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:58 np0005604215.localdomain sudo[148758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:13:58 np0005604215.localdomain sudo[148758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:13:58 np0005604215.localdomain sudo[148758]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:58 np0005604215.localdomain sudo[148776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:13:58 np0005604215.localdomain sudo[148776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:13:58 np0005604215.localdomain sudo[148866]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpyntclwhbvudbofndxpexkuhaqylwux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937238.5311542-677-173388122663251/AnsiballZ_command.py
Feb 01 09:13:58 np0005604215.localdomain sudo[148866]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:58 np0005604215.localdomain python3.9[148870]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:13:58 np0005604215.localdomain sudo[148866]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:59 np0005604215.localdomain sudo[148776]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:59 np0005604215.localdomain sudo[148992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnfhpioojbvihqakvzqlqhunitxpklse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937239.207036-702-225811710592069/AnsiballZ_blockinfile.py
Feb 01 09:13:59 np0005604215.localdomain sudo[148992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:13:59 np0005604215.localdomain python3.9[148994]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:13:59 np0005604215.localdomain sudo[148995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:13:59 np0005604215.localdomain sudo[148992]: pam_unix(sudo:session): session closed for user root
Feb 01 09:13:59 np0005604215.localdomain sudo[148995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:13:59 np0005604215.localdomain sudo[148995]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56978 DF PROTO=TCP SPT=41636 DPT=9882 SEQ=2464527128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654432D0000000001030307) 
Feb 01 09:14:00 np0005604215.localdomain sudo[149099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzkbsfmkodxzpxrljzmfpisefwyasqvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937240.2308962-728-62111500301298/AnsiballZ_command.py
Feb 01 09:14:00 np0005604215.localdomain sudo[149099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:00 np0005604215.localdomain python3.9[149101]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:14:00 np0005604215.localdomain sudo[149099]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56979 DF PROTO=TCP SPT=41636 DPT=9882 SEQ=2464527128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654474D0000000001030307) 
Feb 01 09:14:01 np0005604215.localdomain sudo[149192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhfngulyvdiftqxfqrrwevwdlvphnktx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937240.8923268-753-29152878307361/AnsiballZ_stat.py
Feb 01 09:14:01 np0005604215.localdomain sudo[149192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:01 np0005604215.localdomain python3.9[149194]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:14:01 np0005604215.localdomain sudo[149192]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:01 np0005604215.localdomain sudo[149286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwycybletexsqyxyfkycqyvmgqvqwbmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937241.5570402-776-122171303224257/AnsiballZ_command.py
Feb 01 09:14:01 np0005604215.localdomain sudo[149286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:02 np0005604215.localdomain python3.9[149288]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:14:02 np0005604215.localdomain sudo[149286]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:02 np0005604215.localdomain sudo[149381]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbnqcapexudccvvilsptclggnryarbgw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937242.2854786-800-73369757721128/AnsiballZ_file.py
Feb 01 09:14:02 np0005604215.localdomain sudo[149381]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:02 np0005604215.localdomain python3.9[149383]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:02 np0005604215.localdomain sudo[149381]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56980 DF PROTO=TCP SPT=41636 DPT=9882 SEQ=2464527128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6544F4D0000000001030307) 
Feb 01 09:14:03 np0005604215.localdomain python3.9[149473]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:14:05 np0005604215.localdomain sudo[149564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vngyxdynkqtpcztfgkevwmobmolpwzme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937244.8165364-920-23603791053294/AnsiballZ_command.py
Feb 01 09:14:05 np0005604215.localdomain sudo[149564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:05 np0005604215.localdomain python3.9[149566]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=np0005604215.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:0e:0a:99:13:90:9c" external_ids:ovn-encap-ip=172.19.0.108 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch 
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:14:05 np0005604215.localdomain ovs-vsctl[149567]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=np0005604215.localdomain external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:0e:0a:99:13:90:9c external_ids:ovn-encap-ip=172.19.0.108 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=tcp:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Feb 01 09:14:05 np0005604215.localdomain sudo[149564]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:05 np0005604215.localdomain sudo[149657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tdibnhpwcesgapbimotudupfwdxzhwoj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937245.5215037-948-174940325641424/AnsiballZ_command.py
Feb 01 09:14:05 np0005604215.localdomain sudo[149657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:05 np0005604215.localdomain python3.9[149659]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ovs-vsctl show | grep -q "Manager"
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:14:06 np0005604215.localdomain sudo[149657]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32301 DF PROTO=TCP SPT=58952 DPT=9102 SEQ=3476738004 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6545CCD0000000001030307) 
Feb 01 09:14:06 np0005604215.localdomain python3.9[149752]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:14:07 np0005604215.localdomain sudo[149844]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wahkyszqjjqzgwcukauksrawcdtkkzfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937247.0751276-1001-250454173371219/AnsiballZ_file.py
Feb 01 09:14:07 np0005604215.localdomain sudo[149844]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:07 np0005604215.localdomain python3.9[149846]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:07 np0005604215.localdomain sudo[149844]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:08 np0005604215.localdomain sudo[149936]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eonsnljbftizhejcratuhpfavyzlumpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937247.7464535-1025-188630367867972/AnsiballZ_stat.py
Feb 01 09:14:08 np0005604215.localdomain sudo[149936]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:08 np0005604215.localdomain python3.9[149938]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:14:08 np0005604215.localdomain sudo[149936]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:08 np0005604215.localdomain sudo[149984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oraczgckcdisjwavgwwqkbbdmblzpshx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937247.7464535-1025-188630367867972/AnsiballZ_file.py
Feb 01 09:14:08 np0005604215.localdomain sudo[149984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:08 np0005604215.localdomain python3.9[149986]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:08 np0005604215.localdomain sudo[149984]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:09 np0005604215.localdomain sudo[150076]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oezrceihwrurwsbxudlskntwvdexmphu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937248.8177958-1025-124497805540321/AnsiballZ_stat.py
Feb 01 09:14:09 np0005604215.localdomain sudo[150076]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:09 np0005604215.localdomain python3.9[150078]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:14:09 np0005604215.localdomain sudo[150076]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:09 np0005604215.localdomain sudo[150124]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abuvzkjdnpfqzghnvmjkpywkhcdwngyz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937248.8177958-1025-124497805540321/AnsiballZ_file.py
Feb 01 09:14:09 np0005604215.localdomain sudo[150124]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13061 DF PROTO=TCP SPT=39794 DPT=9100 SEQ=145092015 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654694E0000000001030307) 
Feb 01 09:14:09 np0005604215.localdomain python3.9[150126]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:09 np0005604215.localdomain sudo[150124]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:10 np0005604215.localdomain sudo[150216]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pjrertqzdsntohqpotzakyfpzteuisdl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937250.1782598-1094-33114190090174/AnsiballZ_file.py
Feb 01 09:14:10 np0005604215.localdomain sudo[150216]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:10 np0005604215.localdomain python3.9[150218]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:10 np0005604215.localdomain sudo[150216]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:11 np0005604215.localdomain sudo[150308]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-payulhhnkmrybrvikfthfwjiepqjywsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937250.83296-1118-125095006460011/AnsiballZ_stat.py
Feb 01 09:14:11 np0005604215.localdomain sudo[150308]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:11 np0005604215.localdomain python3.9[150310]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:14:11 np0005604215.localdomain sudo[150308]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:11 np0005604215.localdomain sudo[150356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ksqznywvimofdycziqhffxwqucjmgfsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937250.83296-1118-125095006460011/AnsiballZ_file.py
Feb 01 09:14:11 np0005604215.localdomain sudo[150356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:11 np0005604215.localdomain python3.9[150358]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:11 np0005604215.localdomain sudo[150356]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:12 np0005604215.localdomain sudo[150448]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hupdjnouglakfegjmqbemetehvrmcjmi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937252.288551-1155-224557870517950/AnsiballZ_stat.py
Feb 01 09:14:12 np0005604215.localdomain sudo[150448]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:12 np0005604215.localdomain python3.9[150450]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:14:12 np0005604215.localdomain sudo[150448]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:13 np0005604215.localdomain sudo[150496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llumaoyomrxqpuedodgqsvjgfroxbyes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937252.288551-1155-224557870517950/AnsiballZ_file.py
Feb 01 09:14:13 np0005604215.localdomain sudo[150496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47171 DF PROTO=TCP SPT=34220 DPT=9101 SEQ=2025353292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654764E0000000001030307) 
Feb 01 09:14:13 np0005604215.localdomain python3.9[150498]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:13 np0005604215.localdomain sudo[150496]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:13 np0005604215.localdomain sudo[150588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tznvpqzzxndxzlunaobflxqlynevkzul ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937253.436232-1190-146110359726105/AnsiballZ_systemd.py
Feb 01 09:14:13 np0005604215.localdomain sudo[150588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:14 np0005604215.localdomain python3.9[150590]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:14:14 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:14:14 np0005604215.localdomain systemd-sysv-generator[150619]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:14:14 np0005604215.localdomain systemd-rc-local-generator[150616]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:14:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:14:14 np0005604215.localdomain sudo[150588]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56982 DF PROTO=TCP SPT=41636 DPT=9882 SEQ=2464527128 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6547F0D0000000001030307) 
Feb 01 09:14:15 np0005604215.localdomain sudo[150717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ndagiwmdyiokjiqiclpitbzhyuncqcmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937255.541711-1215-246653995522481/AnsiballZ_stat.py
Feb 01 09:14:15 np0005604215.localdomain sudo[150717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:16 np0005604215.localdomain python3.9[150719]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:14:16 np0005604215.localdomain sudo[150717]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:16 np0005604215.localdomain sudo[150765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjpzwoqbntlcadgxjjtmfychezrioyhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937255.541711-1215-246653995522481/AnsiballZ_file.py
Feb 01 09:14:16 np0005604215.localdomain sudo[150765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:16 np0005604215.localdomain python3.9[150767]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:16 np0005604215.localdomain sudo[150765]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:16 np0005604215.localdomain sudo[150857]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ssbihxramikkjbqmlhpcusqhlityvrqi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937256.6964214-1251-204225063548022/AnsiballZ_stat.py
Feb 01 09:14:16 np0005604215.localdomain sudo[150857]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:17 np0005604215.localdomain python3.9[150859]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:14:17 np0005604215.localdomain sudo[150857]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:17 np0005604215.localdomain sudo[150905]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-charwjdvxbttumdyvrmqkctypwazabem ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937256.6964214-1251-204225063548022/AnsiballZ_file.py
Feb 01 09:14:17 np0005604215.localdomain sudo[150905]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:17 np0005604215.localdomain python3.9[150907]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:17 np0005604215.localdomain sudo[150905]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:18 np0005604215.localdomain sudo[150997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aojmvzhmiicjrcwsjjqlwyerghwyoduj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937257.933072-1287-336457640555/AnsiballZ_systemd.py
Feb 01 09:14:18 np0005604215.localdomain sudo[150997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:18 np0005604215.localdomain python3.9[150999]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:14:18 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:14:18 np0005604215.localdomain systemd-sysv-generator[151027]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:14:18 np0005604215.localdomain systemd-rc-local-generator[151024]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:14:18 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:14:18 np0005604215.localdomain systemd[1]: Starting Create netns directory...
Feb 01 09:14:18 np0005604215.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 01 09:14:18 np0005604215.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 01 09:14:18 np0005604215.localdomain systemd[1]: Finished Create netns directory.
Feb 01 09:14:18 np0005604215.localdomain sudo[150997]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32303 DF PROTO=TCP SPT=58952 DPT=9102 SEQ=3476738004 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6548D0D0000000001030307) 
Feb 01 09:14:20 np0005604215.localdomain sudo[151133]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yvxczaodqqyhppofnirkogurmlvjsdwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937260.3198779-1317-179281535532551/AnsiballZ_file.py
Feb 01 09:14:20 np0005604215.localdomain sudo[151133]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:20 np0005604215.localdomain python3.9[151135]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:20 np0005604215.localdomain sudo[151133]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:21 np0005604215.localdomain sudo[151225]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnyvtpavvxyehfqrqmagybyzfidsjxzf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937260.989658-1340-256392541365729/AnsiballZ_stat.py
Feb 01 09:14:21 np0005604215.localdomain sudo[151225]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:21 np0005604215.localdomain python3.9[151227]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:14:21 np0005604215.localdomain sudo[151225]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:21 np0005604215.localdomain sudo[151298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltymrbzznzygpeflzctqgcbnvmefwcqs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937260.989658-1340-256392541365729/AnsiballZ_copy.py
Feb 01 09:14:21 np0005604215.localdomain sudo[151298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=13063 DF PROTO=TCP SPT=39794 DPT=9100 SEQ=145092015 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654990D0000000001030307) 
Feb 01 09:14:22 np0005604215.localdomain python3.9[151300]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937260.989658-1340-256392541365729/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:22 np0005604215.localdomain sudo[151298]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:22 np0005604215.localdomain sudo[151390]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvtqhlqkrifgvdqrtwhqfpwryivusgms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937262.5353355-1391-248623700011980/AnsiballZ_file.py
Feb 01 09:14:22 np0005604215.localdomain sudo[151390]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:23 np0005604215.localdomain python3.9[151392]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:23 np0005604215.localdomain sudo[151390]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:23 np0005604215.localdomain sudo[151482]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qrftmkwmzvvnxlzzthydkoviukiznwuy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937263.222164-1416-63852628653867/AnsiballZ_file.py
Feb 01 09:14:23 np0005604215.localdomain sudo[151482]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:23 np0005604215.localdomain python3.9[151484]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:23 np0005604215.localdomain sudo[151482]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:24 np0005604215.localdomain sudo[151574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdrobodmmvfdvycwhhjwsvpwzkkjlnlo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937263.908645-1439-32607004191211/AnsiballZ_stat.py
Feb 01 09:14:24 np0005604215.localdomain sudo[151574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:24 np0005604215.localdomain python3.9[151576]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:14:24 np0005604215.localdomain sudo[151574]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:24 np0005604215.localdomain sudo[151649]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jptcgfccpxtbqhofxyoqpxftbfiohqod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937263.908645-1439-32607004191211/AnsiballZ_copy.py
Feb 01 09:14:24 np0005604215.localdomain sudo[151649]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:24 np0005604215.localdomain python3.9[151651]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937263.908645-1439-32607004191211/.source.json _original_basename=.bmxxkd6m follow=False checksum=38f75f59f5c2ef6b5da12297bfd31cd1e97012ac backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:24 np0005604215.localdomain sudo[151649]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47173 DF PROTO=TCP SPT=34220 DPT=9101 SEQ=2025353292 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654A70D0000000001030307) 
Feb 01 09:14:25 np0005604215.localdomain python3.9[151741]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:27 np0005604215.localdomain sudo[151992]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrvkipgnvhlwcekjeqrcrmfqdjbzisyc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937267.3907666-1559-47478983496996/AnsiballZ_container_config_data.py
Feb 01 09:14:27 np0005604215.localdomain sudo[151992]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:28 np0005604215.localdomain python3.9[151994]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Feb 01 09:14:28 np0005604215.localdomain sudo[151992]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:28 np0005604215.localdomain sudo[152084]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xihheieoispddxetrnywijlgljirpryu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937268.4784613-1592-119926829427106/AnsiballZ_container_config_hash.py
Feb 01 09:14:28 np0005604215.localdomain sudo[152084]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:29 np0005604215.localdomain python3.9[152086]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:14:29 np0005604215.localdomain sudo[152084]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16169 DF PROTO=TCP SPT=48340 DPT=9882 SEQ=4134950418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654B85D0000000001030307) 
Feb 01 09:14:30 np0005604215.localdomain sudo[152176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-juhowdgadqldlkobprqqoxrmfsubtasx ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937269.5390756-1622-153237527317481/AnsiballZ_edpm_container_manage.py
Feb 01 09:14:30 np0005604215.localdomain sudo[152176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:30 np0005604215.localdomain python3[152178]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json containers=['ovn_controller'] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:14:30 np0005604215.localdomain python3[152178]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "9f8c6308802db66f6c1100257e3fa9593740e85d82f038b4185cf756493dc94e",
                                                                    "Digest": "sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller@sha256:099d88ae13fa2b3409da5310cdcba7fa01d2c87a8bc98296299a57054b9a075e"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:38:56.623500445Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 346422728,
                                                                    "VirtualSize": 346422728,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:033e0289d512b27a678c3feb7195acb9c5f2fbb27c9b2d8c8b5b5f6156f0d11f",
                                                                              "sha256:f848a534c5dfe59c31c3da34c3d2466bdea7e8da7def4225acdd3ffef1544d2f"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:00.623406883Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:55.918991169Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch openvswitch-ovn-common python3-netifaces python3-openvswitch tcpdump && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:57.814850041Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:38:21.443386852Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ovn-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:38:56.622512308Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openvswitch-ovn-host && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:38:57.466949121Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 01 09:14:30 np0005604215.localdomain podman[152228]: 2026-02-01 09:14:30.814552964 +0000 UTC m=+0.092954164 container remove e9aad77783b40c342d984b2f22c7e9e198801b7dddada155e338bf18c344e257 (image=registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1, name=ovn_controller, io.k8s.description=Red Hat OpenStack Platform 17.1 ovn-controller, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'test': '/openstack/healthcheck 6642'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ovn-controller:17.1', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'user': 'root', 'volumes': ['/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/log/containers/openvswitch:/var/log/openvswitch:z', '/var/log/containers/openvswitch:/var/log/ovn:z']}, summary=Red Hat OpenStack Platform 17.1 ovn-controller, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, build-date=2026-01-12T22:36:40Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ovn-controller, release=1766032510, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ovn-controller, container_name=ovn_controller, url=https://www.redhat.com, org.opencontainers.image.revision=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, architecture=x86_64, io.openshift.expose-services=, version=17.1.13, name=rhosp-rhel9/openstack-ovn-controller, io.buildah.version=1.41.5, tcib_managed=true, config_id=tripleo_step4, org.opencontainers.image.created=2026-01-12T22:36:40Z, vcs-ref=ac1441bb241fd7ea83ec557f8c760d89937b0b0c, description=Red Hat OpenStack Platform 17.1 ovn-controller, maintainer=OpenStack TripleO Team, cpe=cpe:/a:redhat:openstack:17.1::el9, vendor=Red Hat, Inc., distribution-scope=public, vcs-type=git, com.redhat.component=openstack-ovn-controller-container)
Feb 01 09:14:30 np0005604215.localdomain python3[152178]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_controller
Feb 01 09:14:30 np0005604215.localdomain podman[152241]: 
Feb 01 09:14:30 np0005604215.localdomain podman[152241]: 2026-02-01 09:14:30.924189184 +0000 UTC m=+0.088020651 container create c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Feb 01 09:14:30 np0005604215.localdomain podman[152241]: 2026-02-01 09:14:30.88110272 +0000 UTC m=+0.044934227 image pull  quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 01 09:14:30 np0005604215.localdomain python3[152178]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519 --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Feb 01 09:14:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16170 DF PROTO=TCP SPT=48340 DPT=9882 SEQ=4134950418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654BC4E0000000001030307) 
Feb 01 09:14:31 np0005604215.localdomain sudo[152176]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:31 np0005604215.localdomain sudo[152368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nqsvetornjqvmlxhogeqmkpwzxcylxmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937271.366709-1646-188943123697121/AnsiballZ_stat.py
Feb 01 09:14:31 np0005604215.localdomain sudo[152368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:32 np0005604215.localdomain python3.9[152370]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:14:32 np0005604215.localdomain sudo[152368]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:32 np0005604215.localdomain sudo[152462]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtdruzkfoheypxacjmchvnyifgemnocp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937272.399568-1673-18575340160414/AnsiballZ_file.py
Feb 01 09:14:32 np0005604215.localdomain sudo[152462]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:32 np0005604215.localdomain python3.9[152464]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:32 np0005604215.localdomain sudo[152462]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16171 DF PROTO=TCP SPT=48340 DPT=9882 SEQ=4134950418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654C44D0000000001030307) 
Feb 01 09:14:33 np0005604215.localdomain sudo[152508]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipzxzzrhjxyctsqbpgqslcqaomzkrxju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937272.399568-1673-18575340160414/AnsiballZ_stat.py
Feb 01 09:14:33 np0005604215.localdomain sudo[152508]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:33 np0005604215.localdomain python3.9[152510]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:14:33 np0005604215.localdomain sudo[152508]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:33 np0005604215.localdomain sudo[152599]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcyihansatenaghwzjpzezqyikfxocly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937273.348835-1673-17743743854873/AnsiballZ_copy.py
Feb 01 09:14:33 np0005604215.localdomain sudo[152599]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:33 np0005604215.localdomain python3.9[152601]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769937273.348835-1673-17743743854873/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:33 np0005604215.localdomain sudo[152599]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:34 np0005604215.localdomain sudo[152645]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qwtbgcnzdnobrwhkumqjlfolxlomrysn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937273.348835-1673-17743743854873/AnsiballZ_systemd.py
Feb 01 09:14:34 np0005604215.localdomain sudo[152645]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:34 np0005604215.localdomain python3.9[152647]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:14:34 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:14:34 np0005604215.localdomain systemd-sysv-generator[152675]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:14:34 np0005604215.localdomain systemd-rc-local-generator[152671]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:14:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:14:34 np0005604215.localdomain sudo[152645]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:35 np0005604215.localdomain sudo[152727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epavgwtwoqvhxjykzqnwfoekfrtwkoqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937273.348835-1673-17743743854873/AnsiballZ_systemd.py
Feb 01 09:14:35 np0005604215.localdomain sudo[152727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:35 np0005604215.localdomain python3.9[152729]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:14:35 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:14:35 np0005604215.localdomain systemd-rc-local-generator[152757]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:14:35 np0005604215.localdomain systemd-sysv-generator[152760]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:14:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:14:35 np0005604215.localdomain systemd[1]: Starting dnf makecache...
Feb 01 09:14:35 np0005604215.localdomain systemd[1]: Starting ovn_controller container...
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: tmp-crun.4xu3db.mount: Deactivated successfully.
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:14:36 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b883fdd7a94716d25b4111e0521450c77982b7d94557f6b979a4ec8b45324f27/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:14:36 np0005604215.localdomain podman[152772]: 2026-02-01 09:14:36.063196252 +0000 UTC m=+0.180847039 container init c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: + sudo -E kolla_set_configs
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:14:36 np0005604215.localdomain podman[152772]: 2026-02-01 09:14:36.095839235 +0000 UTC m=+0.213490042 container start c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:14:36 np0005604215.localdomain edpm-start-podman-container[152772]: ovn_controller
Feb 01 09:14:36 np0005604215.localdomain dnf[152769]: Updating Subscription Management repositories.
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Created slice User Slice of UID 0.
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Starting User Runtime Directory /run/user/0...
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Finished User Runtime Directory /run/user/0.
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Starting User Manager for UID 0...
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: pam_unix(systemd-user:session): session opened for user root(uid=0) by (uid=0)
Feb 01 09:14:36 np0005604215.localdomain podman[152794]: 2026-02-01 09:14:36.20946927 +0000 UTC m=+0.103803291 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller)
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Queued start job for default target Main User Target.
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Created slice User Application Slice.
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Started Daily Cleanup of User's Temporary Directories.
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Reached target Paths.
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Reached target Timers.
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Starting D-Bus User Message Bus Socket...
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Starting Create User's Volatile Files and Directories...
Feb 01 09:14:36 np0005604215.localdomain podman[152794]: 2026-02-01 09:14:36.301529798 +0000 UTC m=+0.195863749 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Listening on D-Bus User Message Bus Socket.
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Reached target Sockets.
Feb 01 09:14:36 np0005604215.localdomain podman[152794]: unhealthy
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Finished Create User's Volatile Files and Directories.
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Reached target Basic System.
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Reached target Main User Target.
Feb 01 09:14:36 np0005604215.localdomain systemd[152818]: Startup finished in 111ms.
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Started User Manager for UID 0.
Feb 01 09:14:36 np0005604215.localdomain systemd-journald[47940]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.1 (250 of 333 items), suggesting rotation.
Feb 01 09:14:36 np0005604215.localdomain systemd-journald[47940]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 01 09:14:36 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Started Session c11 of User root.
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Failed with result 'exit-code'.
Feb 01 09:14:36 np0005604215.localdomain edpm-start-podman-container[152771]: Creating additional drop-in dependency for "ovn_controller" (c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835)
Feb 01 09:14:36 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: INFO:__main__:Validating config file
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: INFO:__main__:Writing out command to execute
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: ++ cat /run_command
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: + ARGS=
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: + sudo kolla_copy_cacerts
Feb 01 09:14:36 np0005604215.localdomain systemd-rc-local-generator[152882]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:14:36 np0005604215.localdomain systemd-sysv-generator[152885]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:14:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16437 DF PROTO=TCP SPT=39674 DPT=9102 SEQ=4255682689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654D20E0000000001030307) 
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: session-c11.scope: Deactivated successfully.
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Started ovn_controller container.
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: Started Session c12 of User root.
Feb 01 09:14:36 np0005604215.localdomain sudo[152727]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:36 np0005604215.localdomain systemd[1]: session-c12.scope: Deactivated successfully.
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: + [[ ! -n '' ]]
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: + . kolla_extend_start
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '\'''
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: + umask 0022
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock '
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00005|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00007|reconnect|INFO|tcp:ovsdbserver-sb.openstack.svc:6642: connected
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00012|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00013|main|INFO|OVS feature set changed, force recompute.
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00014|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00015|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00017|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00018|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00019|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00020|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00021|main|INFO|OVS feature set changed, force recompute.
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00022|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 01 09:14:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:36Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Feb 01 09:14:37 np0005604215.localdomain dnf[152769]: Metadata cache refreshed recently.
Feb 01 09:14:38 np0005604215.localdomain systemd[1]: dnf-makecache.service: Deactivated successfully.
Feb 01 09:14:38 np0005604215.localdomain systemd[1]: Finished dnf makecache.
Feb 01 09:14:38 np0005604215.localdomain systemd[1]: dnf-makecache.service: Consumed 2.143s CPU time.
Feb 01 09:14:38 np0005604215.localdomain python3.9[152985]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 01 09:14:39 np0005604215.localdomain sudo[153075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lvvatafamxxntnuoruaxdtlrinvcqxpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937279.1309137-1808-29059686757745/AnsiballZ_stat.py
Feb 01 09:14:39 np0005604215.localdomain sudo[153075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:39 np0005604215.localdomain python3.9[153077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:14:39 np0005604215.localdomain sudo[153075]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45408 DF PROTO=TCP SPT=48746 DPT=9100 SEQ=754574183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654DE8D0000000001030307) 
Feb 01 09:14:39 np0005604215.localdomain sudo[153148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ohyirgjcilrzytyusphmarnnroqrsyuw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937279.1309137-1808-29059686757745/AnsiballZ_copy.py
Feb 01 09:14:39 np0005604215.localdomain sudo[153148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:40 np0005604215.localdomain python3.9[153150]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937279.1309137-1808-29059686757745/.source.yaml _original_basename=.3jiymto_ follow=False checksum=4ef88525fff00a5112f620461f949f82fa85c4cb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:14:40 np0005604215.localdomain sudo[153148]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:40 np0005604215.localdomain sudo[153240]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cyqdcrztpkoryccbttzvourgawkedbjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937280.3829687-1854-76817607135287/AnsiballZ_command.py
Feb 01 09:14:40 np0005604215.localdomain sudo[153240]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:40 np0005604215.localdomain python3.9[153242]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:14:40 np0005604215.localdomain ovs-vsctl[153243]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Feb 01 09:14:40 np0005604215.localdomain sudo[153240]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:41 np0005604215.localdomain sudo[153333]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pcetavpvboalzypopsavrrkrlztbawmb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937281.0592427-1878-180496218217291/AnsiballZ_command.py
Feb 01 09:14:41 np0005604215.localdomain sudo[153333]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:41 np0005604215.localdomain python3.9[153335]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:14:41 np0005604215.localdomain ovs-vsctl[153337]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Feb 01 09:14:41 np0005604215.localdomain sudo[153333]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:42 np0005604215.localdomain sudo[153428]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwmgefffqeaqyrloglhtggketqpxdeil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937282.1809893-1920-83375374988549/AnsiballZ_command.py
Feb 01 09:14:42 np0005604215.localdomain sudo[153428]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:42 np0005604215.localdomain python3.9[153430]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:14:42 np0005604215.localdomain ovs-vsctl[153431]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Feb 01 09:14:42 np0005604215.localdomain sudo[153428]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40900 DF PROTO=TCP SPT=44330 DPT=9101 SEQ=2650133204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654EB8E0000000001030307) 
Feb 01 09:14:43 np0005604215.localdomain sshd[146270]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:14:43 np0005604215.localdomain systemd[1]: session-49.scope: Deactivated successfully.
Feb 01 09:14:43 np0005604215.localdomain systemd[1]: session-49.scope: Consumed 41.213s CPU time.
Feb 01 09:14:43 np0005604215.localdomain systemd-logind[761]: Session 49 logged out. Waiting for processes to exit.
Feb 01 09:14:43 np0005604215.localdomain systemd-logind[761]: Removed session 49.
Feb 01 09:14:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16173 DF PROTO=TCP SPT=48340 DPT=9882 SEQ=4134950418 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA654F50D0000000001030307) 
Feb 01 09:14:46 np0005604215.localdomain systemd[1]: Stopping User Manager for UID 0...
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Activating special unit Exit the Session...
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Stopped target Main User Target.
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Stopped target Basic System.
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Stopped target Paths.
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Stopped target Sockets.
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Stopped target Timers.
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Closed D-Bus User Message Bus Socket.
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Stopped Create User's Volatile Files and Directories.
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Removed slice User Application Slice.
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Reached target Shutdown.
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Finished Exit the Session.
Feb 01 09:14:46 np0005604215.localdomain systemd[152818]: Reached target Exit the Session.
Feb 01 09:14:46 np0005604215.localdomain systemd[1]: user@0.service: Deactivated successfully.
Feb 01 09:14:46 np0005604215.localdomain systemd[1]: Stopped User Manager for UID 0.
Feb 01 09:14:46 np0005604215.localdomain systemd[1]: Stopping User Runtime Directory /run/user/0...
Feb 01 09:14:46 np0005604215.localdomain systemd[1]: run-user-0.mount: Deactivated successfully.
Feb 01 09:14:46 np0005604215.localdomain systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Feb 01 09:14:46 np0005604215.localdomain systemd[1]: Stopped User Runtime Directory /run/user/0.
Feb 01 09:14:46 np0005604215.localdomain systemd[1]: Removed slice User Slice of UID 0.
Feb 01 09:14:49 np0005604215.localdomain sshd[153448]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:14:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16439 DF PROTO=TCP SPT=39674 DPT=9102 SEQ=4255682689 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655030D0000000001030307) 
Feb 01 09:14:49 np0005604215.localdomain sshd[153448]: Accepted publickey for zuul from 192.168.122.30 port 33944 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:14:49 np0005604215.localdomain systemd-logind[761]: New session 51 of user zuul.
Feb 01 09:14:49 np0005604215.localdomain systemd[1]: Started Session 51 of User zuul.
Feb 01 09:14:49 np0005604215.localdomain sshd[153448]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:14:50 np0005604215.localdomain python3.9[153541]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:14:51 np0005604215.localdomain sudo[153635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-culcbhvivwgatyiioulnmolvnarngtfg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937290.8654926-59-36424152774426/AnsiballZ_file.py
Feb 01 09:14:51 np0005604215.localdomain sudo[153635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:51 np0005604215.localdomain python3.9[153637]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/openstack/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:51 np0005604215.localdomain sudo[153635]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:51 np0005604215.localdomain sudo[153727]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wopoupwlebsgblzrzibcpvvcdsmvtikc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937291.6251204-59-153207157406857/AnsiballZ_file.py
Feb 01 09:14:51 np0005604215.localdomain sudo[153727]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:52 np0005604215.localdomain python3.9[153729]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:52 np0005604215.localdomain sudo[153727]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45410 DF PROTO=TCP SPT=48746 DPT=9100 SEQ=754574183 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6550F0D0000000001030307) 
Feb 01 09:14:52 np0005604215.localdomain sudo[153819]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjaxcbiijkksxpvzdslkxfkqbkgpfpel ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937292.2486985-59-137043429874445/AnsiballZ_file.py
Feb 01 09:14:52 np0005604215.localdomain sudo[153819]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:52 np0005604215.localdomain python3.9[153821]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:52 np0005604215.localdomain sudo[153819]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:52 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:52Z|00023|memory|INFO|14972 kB peak resident set size after 16.3 seconds
Feb 01 09:14:52 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:14:52Z|00024|memory|INFO|idl-cells-OVN_Southbound:4033 idl-cells-Open_vSwitch:813 ofctrl_desired_flow_usage-KB:9 ofctrl_installed_flow_usage-KB:7 ofctrl_sb_flow_ref_usage-KB:3
Feb 01 09:14:53 np0005604215.localdomain sudo[153911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nybkksbjuymhnqkopwrjfsssbhrzfnkd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937292.9350514-59-100634313567295/AnsiballZ_file.py
Feb 01 09:14:53 np0005604215.localdomain sudo[153911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:53 np0005604215.localdomain python3.9[153913]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:53 np0005604215.localdomain sudo[153911]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:53 np0005604215.localdomain sudo[154003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axlmhdyawtyegznmqnsdufvuszqcxezl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937293.6606312-59-121830860982510/AnsiballZ_file.py
Feb 01 09:14:53 np0005604215.localdomain sudo[154003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:54 np0005604215.localdomain python3.9[154005]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:54 np0005604215.localdomain sudo[154003]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:54 np0005604215.localdomain sshd[154052]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:14:54 np0005604215.localdomain python3.9[154097]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:14:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=40902 DF PROTO=TCP SPT=44330 DPT=9101 SEQ=2650133204 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6551B0D0000000001030307) 
Feb 01 09:14:55 np0005604215.localdomain sshd[154052]: Invalid user tomcat9 from 85.206.171.113 port 52114
Feb 01 09:14:55 np0005604215.localdomain sshd[154052]: Received disconnect from 85.206.171.113 port 52114:11: Bye Bye [preauth]
Feb 01 09:14:55 np0005604215.localdomain sshd[154052]: Disconnected from invalid user tomcat9 85.206.171.113 port 52114 [preauth]
Feb 01 09:14:56 np0005604215.localdomain sudo[154188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cysbbintwobvvrmhgvhyffhxvgfqdxxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937295.6270328-191-123269376763596/AnsiballZ_seboolean.py
Feb 01 09:14:56 np0005604215.localdomain sudo[154188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:14:56 np0005604215.localdomain python3.9[154190]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Feb 01 09:14:56 np0005604215.localdomain sudo[154188]: pam_unix(sudo:session): session closed for user root
Feb 01 09:14:57 np0005604215.localdomain python3.9[154280]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:14:58 np0005604215.localdomain python3.9[154353]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937296.7601776-215-265932697695622/.source follow=False _original_basename=haproxy.j2 checksum=a5072e7b19ca96a1f495d94f97f31903737cfd27 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:14:58 np0005604215.localdomain python3.9[154443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:14:59 np0005604215.localdomain python3.9[154516]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937298.2856524-260-276888307442129/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:15:00 np0005604215.localdomain sudo[154590]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:15:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19082 DF PROTO=TCP SPT=35370 DPT=9882 SEQ=883227336 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6552D8D0000000001030307) 
Feb 01 09:15:00 np0005604215.localdomain sudo[154590]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:15:00 np0005604215.localdomain sudo[154620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kepoedxdanqtnqclofvfvhxmubjqicbq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937299.7233083-311-157780872289401/AnsiballZ_setup.py
Feb 01 09:15:00 np0005604215.localdomain sudo[154620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:00 np0005604215.localdomain sudo[154590]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:00 np0005604215.localdomain sudo[154624]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:15:00 np0005604215.localdomain sudo[154624]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:15:00 np0005604215.localdomain python3.9[154623]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:15:00 np0005604215.localdomain sudo[154620]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:00 np0005604215.localdomain sudo[154624]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:01 np0005604215.localdomain sudo[154721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vougjhpsicfgwgdvvcxhvxoopiyvqkji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937299.7233083-311-157780872289401/AnsiballZ_dnf.py
Feb 01 09:15:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19083 DF PROTO=TCP SPT=35370 DPT=9882 SEQ=883227336 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655318E0000000001030307) 
Feb 01 09:15:01 np0005604215.localdomain sudo[154721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:01 np0005604215.localdomain python3.9[154723]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:15:01 np0005604215.localdomain sudo[154725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:15:01 np0005604215.localdomain sudo[154725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:15:01 np0005604215.localdomain sudo[154725]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19084 DF PROTO=TCP SPT=35370 DPT=9882 SEQ=883227336 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655398D0000000001030307) 
Feb 01 09:15:04 np0005604215.localdomain sudo[154721]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:05 np0005604215.localdomain sudo[154830]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqtkxaeetbnrjjfcqzetwpwwewplnbju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937304.661793-347-68939704900858/AnsiballZ_systemd.py
Feb 01 09:15:05 np0005604215.localdomain sudo[154830]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:05 np0005604215.localdomain python3.9[154832]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 01 09:15:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:15:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5367 DF PROTO=TCP SPT=52206 DPT=9102 SEQ=3731879935 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655472F0000000001030307) 
Feb 01 09:15:06 np0005604215.localdomain sudo[154830]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:06 np0005604215.localdomain podman[154835]: 2026-02-01 09:15:06.651946773 +0000 UTC m=+0.090044409 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 01 09:15:06 np0005604215.localdomain podman[154835]: 2026-02-01 09:15:06.690482598 +0000 UTC m=+0.128580234 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 01 09:15:06 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:15:07 np0005604215.localdomain python3.9[154950]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:07 np0005604215.localdomain python3.9[155021]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937306.777654-371-104713098715086/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:15:08 np0005604215.localdomain python3.9[155111]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:08 np0005604215.localdomain python3.9[155182]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937307.8167362-371-53900048470524/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:15:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28239 DF PROTO=TCP SPT=38894 DPT=9100 SEQ=4282933993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655538D0000000001030307) 
Feb 01 09:15:10 np0005604215.localdomain python3.9[155272]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:10 np0005604215.localdomain python3.9[155343]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937309.747433-504-103882764735297/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=aa9e89725fbcebf7a5c773d7b97083445b7b7759 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:15:11 np0005604215.localdomain python3.9[155433]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:11 np0005604215.localdomain python3.9[155504]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937310.8610072-504-164437226338806/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=979187b925479d81d0609f4188e5b95fe1f92c18 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:15:12 np0005604215.localdomain python3.9[155594]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:15:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34642 DF PROTO=TCP SPT=48596 DPT=9101 SEQ=3909676888 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65560CD0000000001030307) 
Feb 01 09:15:13 np0005604215.localdomain sudo[155686]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwmegemdvpcxgalpnrqcazivtnykvleo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937313.045873-618-280242498205651/AnsiballZ_file.py
Feb 01 09:15:13 np0005604215.localdomain sudo[155686]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:13 np0005604215.localdomain python3.9[155688]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:15:13 np0005604215.localdomain sudo[155686]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:13 np0005604215.localdomain sudo[155778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfmhkhfpjlqlwkpagaectevqxeukklnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937313.738836-641-77192033043771/AnsiballZ_stat.py
Feb 01 09:15:13 np0005604215.localdomain sudo[155778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:14 np0005604215.localdomain python3.9[155780]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:14 np0005604215.localdomain sudo[155778]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:14 np0005604215.localdomain sudo[155826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ycbgbenwcqtpgakiuxjwmofdxqtecnbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937313.738836-641-77192033043771/AnsiballZ_file.py
Feb 01 09:15:14 np0005604215.localdomain sudo[155826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:14 np0005604215.localdomain python3.9[155828]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:15:14 np0005604215.localdomain sudo[155826]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:15 np0005604215.localdomain sudo[155918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxaepulognbznhcnlqbnocjmuhokgcmk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937314.7982614-641-22207339795259/AnsiballZ_stat.py
Feb 01 09:15:15 np0005604215.localdomain sudo[155918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19086 DF PROTO=TCP SPT=35370 DPT=9882 SEQ=883227336 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655690D0000000001030307) 
Feb 01 09:15:15 np0005604215.localdomain python3.9[155920]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:15 np0005604215.localdomain sudo[155918]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:15 np0005604215.localdomain sudo[155966]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mltubwigjhikmsvlsqwguzzwavjsdbor ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937314.7982614-641-22207339795259/AnsiballZ_file.py
Feb 01 09:15:15 np0005604215.localdomain sudo[155966]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:15 np0005604215.localdomain python3.9[155968]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:15:15 np0005604215.localdomain sudo[155966]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:17 np0005604215.localdomain sudo[156058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pvrkovynvssrqdxrgeoxdakliimkrqtk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937317.081105-710-267249296049159/AnsiballZ_file.py
Feb 01 09:15:17 np0005604215.localdomain sudo[156058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:17 np0005604215.localdomain python3.9[156060]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:15:17 np0005604215.localdomain sudo[156058]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:18 np0005604215.localdomain sudo[156150]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjzdirxoiiwnicrhjyfshowcdhqttcve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937317.760896-734-226804464193738/AnsiballZ_stat.py
Feb 01 09:15:18 np0005604215.localdomain sudo[156150]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:18 np0005604215.localdomain python3.9[156152]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:18 np0005604215.localdomain sudo[156150]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:18 np0005604215.localdomain sudo[156198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oyqkwfkqsrlcpeocotehpdwwlpjdmjji ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937317.760896-734-226804464193738/AnsiballZ_file.py
Feb 01 09:15:18 np0005604215.localdomain sudo[156198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:18 np0005604215.localdomain python3.9[156200]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:15:18 np0005604215.localdomain sudo[156198]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5369 DF PROTO=TCP SPT=52206 DPT=9102 SEQ=3731879935 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655770D0000000001030307) 
Feb 01 09:15:19 np0005604215.localdomain sudo[156290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tknzlbavbnalbljpnlqhezgrllpmrcgy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937318.941379-771-18600965317011/AnsiballZ_stat.py
Feb 01 09:15:19 np0005604215.localdomain sudo[156290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:19 np0005604215.localdomain python3.9[156292]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:19 np0005604215.localdomain sudo[156290]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:19 np0005604215.localdomain sudo[156338]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcoygqfjkminrbccutojkgjfymzexyfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937318.941379-771-18600965317011/AnsiballZ_file.py
Feb 01 09:15:19 np0005604215.localdomain sudo[156338]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:19 np0005604215.localdomain python3.9[156340]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:15:19 np0005604215.localdomain sudo[156338]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:20 np0005604215.localdomain sudo[156430]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fodgirflqnolhjgowfdmjzljgazuxnqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937320.0573802-806-58185515784762/AnsiballZ_systemd.py
Feb 01 09:15:20 np0005604215.localdomain sudo[156430]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:20 np0005604215.localdomain python3.9[156432]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:15:20 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:15:20 np0005604215.localdomain systemd-sysv-generator[156463]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:15:20 np0005604215.localdomain systemd-rc-local-generator[156459]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:15:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:15:21 np0005604215.localdomain sudo[156430]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:21 np0005604215.localdomain sudo[156560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gxknrapjolemmyrwuanmbcqjrzuropad ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937321.1967623-830-131474249455286/AnsiballZ_stat.py
Feb 01 09:15:21 np0005604215.localdomain sudo[156560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:21 np0005604215.localdomain python3.9[156562]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:21 np0005604215.localdomain sudo[156560]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:21 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28241 DF PROTO=TCP SPT=38894 DPT=9100 SEQ=4282933993 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655830D0000000001030307) 
Feb 01 09:15:21 np0005604215.localdomain sudo[156608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngitfetfamblbclavpvbeyxqdnueigol ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937321.1967623-830-131474249455286/AnsiballZ_file.py
Feb 01 09:15:21 np0005604215.localdomain sudo[156608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:22 np0005604215.localdomain python3.9[156610]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:15:22 np0005604215.localdomain sudo[156608]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:22 np0005604215.localdomain sudo[156700]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxtrxrqowjqgcmfswfddfxbmvtpthuzo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937322.3507895-866-183240090186303/AnsiballZ_stat.py
Feb 01 09:15:22 np0005604215.localdomain sudo[156700]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:22 np0005604215.localdomain python3.9[156702]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:22 np0005604215.localdomain sudo[156700]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:23 np0005604215.localdomain sudo[156748]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzignwovdxdkbofwbasgwybpldlruurc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937322.3507895-866-183240090186303/AnsiballZ_file.py
Feb 01 09:15:23 np0005604215.localdomain sudo[156748]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:23 np0005604215.localdomain python3.9[156750]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:15:23 np0005604215.localdomain sudo[156748]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:23 np0005604215.localdomain sudo[156840]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uoipjsszzfxbfstlaknqxoorsywjybfo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937323.5211234-902-73997802942283/AnsiballZ_systemd.py
Feb 01 09:15:23 np0005604215.localdomain sudo[156840]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:24 np0005604215.localdomain python3.9[156842]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:15:24 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:15:24 np0005604215.localdomain systemd-rc-local-generator[156863]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:15:24 np0005604215.localdomain systemd-sysv-generator[156870]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:15:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:15:24 np0005604215.localdomain systemd[1]: Starting Create netns directory...
Feb 01 09:15:24 np0005604215.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 01 09:15:24 np0005604215.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 01 09:15:24 np0005604215.localdomain systemd[1]: Finished Create netns directory.
Feb 01 09:15:24 np0005604215.localdomain sudo[156840]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:25 np0005604215.localdomain sudo[156974]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krsnwjvzvhwkmkxuflvusuvfmozjwxqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937324.9134786-933-202221226753964/AnsiballZ_file.py
Feb 01 09:15:25 np0005604215.localdomain sudo[156974]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:25 np0005604215.localdomain python3.9[156976]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:15:25 np0005604215.localdomain sudo[156974]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=34644 DF PROTO=TCP SPT=48596 DPT=9101 SEQ=3909676888 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655910D0000000001030307) 
Feb 01 09:15:25 np0005604215.localdomain sudo[157066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szmxntlazryglistpefugqqctdyfmhob ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937325.620699-956-130211412349834/AnsiballZ_stat.py
Feb 01 09:15:25 np0005604215.localdomain sudo[157066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:26 np0005604215.localdomain python3.9[157068]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:26 np0005604215.localdomain sudo[157066]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:26 np0005604215.localdomain sudo[157139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ezkzqbfnxfpyfzncizkwmizjrgoptxew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937325.620699-956-130211412349834/AnsiballZ_copy.py
Feb 01 09:15:26 np0005604215.localdomain sudo[157139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:26 np0005604215.localdomain python3.9[157141]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937325.620699-956-130211412349834/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:15:26 np0005604215.localdomain sudo[157139]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:27 np0005604215.localdomain sudo[157231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkafbwafzihdooweljixawqkkhfjthxq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937327.142447-1007-39644698833904/AnsiballZ_file.py
Feb 01 09:15:27 np0005604215.localdomain sudo[157231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:27 np0005604215.localdomain python3.9[157233]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:15:27 np0005604215.localdomain sudo[157231]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:28 np0005604215.localdomain sudo[157323]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uuproubthyilzvymfsaagyajriungign ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937327.8676033-1031-49444101519175/AnsiballZ_file.py
Feb 01 09:15:28 np0005604215.localdomain sudo[157323]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:28 np0005604215.localdomain python3.9[157325]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:15:28 np0005604215.localdomain sudo[157323]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:28 np0005604215.localdomain sudo[157415]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rgiihmhmexnkkxvgzgcaxbwrdiabnfay ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937328.5966964-1055-108474674340956/AnsiballZ_stat.py
Feb 01 09:15:28 np0005604215.localdomain sudo[157415]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:29 np0005604215.localdomain python3.9[157417]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:29 np0005604215.localdomain sudo[157415]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:29 np0005604215.localdomain sudo[157490]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puigmaadpmvxsqpehmnrxzgkvvjwcwsz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937328.5966964-1055-108474674340956/AnsiballZ_copy.py
Feb 01 09:15:29 np0005604215.localdomain sudo[157490]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:29 np0005604215.localdomain python3.9[157492]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937328.5966964-1055-108474674340956/.source.json _original_basename=.dj85r6xt follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:15:29 np0005604215.localdomain sudo[157490]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28013 DF PROTO=TCP SPT=41122 DPT=9882 SEQ=1891771518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655A2BD0000000001030307) 
Feb 01 09:15:30 np0005604215.localdomain python3.9[157582]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:15:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28014 DF PROTO=TCP SPT=41122 DPT=9882 SEQ=1891771518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655A6CE0000000001030307) 
Feb 01 09:15:32 np0005604215.localdomain sudo[157833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apwpxkqhqanxvypjbkbhbwlnlaqbbmcr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937331.9393816-1175-193118850975633/AnsiballZ_container_config_data.py
Feb 01 09:15:32 np0005604215.localdomain sudo[157833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:32 np0005604215.localdomain python3.9[157835]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Feb 01 09:15:32 np0005604215.localdomain sudo[157833]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28015 DF PROTO=TCP SPT=41122 DPT=9882 SEQ=1891771518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655AECD0000000001030307) 
Feb 01 09:15:33 np0005604215.localdomain sudo[157925]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpvtjbqfezgwwtzrvlkmpgbhkzghcwqm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937332.9972408-1209-161912646882764/AnsiballZ_container_config_hash.py
Feb 01 09:15:33 np0005604215.localdomain sudo[157925]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:33 np0005604215.localdomain python3.9[157927]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:15:33 np0005604215.localdomain sudo[157925]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:34 np0005604215.localdomain sudo[158017]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lxbmtnexhoovhoszbdpmbzbjfnimgrjw ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937333.9797883-1238-189915424334747/AnsiballZ_edpm_container_manage.py
Feb 01 09:15:34 np0005604215.localdomain sudo[158017]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:34 np0005604215.localdomain python3[158019]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json containers=['ovn_metadata_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:15:34 np0005604215.localdomain python3[158019]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "19964fda6b912d3d57e21b0bcc221725d936e513025030cb508474fe04b06af8",
                                                                    "Digest": "sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn@sha256:7c305a77ab65247f0dc2ea1616c427b173cb95f37bb37e34c631d9615a73d2cc"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:29:34.446261637Z",
                                                                    "Config": {
                                                                         "User": "neutron",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 785500417,
                                                                    "VirtualSize": 785500417,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc/diff:/var/lib/containers/storage/overlay/33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",
                                                                              "sha256:d3cc9cdab7e3e7c1a0a6c80e61bbd8cc5eeeba7069bab1cc064ed2e6cc28ed58",
                                                                              "sha256:d5cbf3016eca6267717119e8ebab3c6c083cae6c589c6961ae23bfa93ef3afa4",
                                                                              "sha256:0096ee5d07436ac5b94d9d58b8b2407cc5e6854d70de5e7f89b9a7a1ad4912ad"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "neutron",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:12:56.089921987Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:34.524252589Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:37.262239859Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:16:21.310836362Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:16:46.153105676Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:23.560707988Z",
                                                                              "created_by": "/bin/sh -c dnf -y install iputils net-tools openstack-neutron openstack-neutron-rpc-server openstack-neutron-ml2 openvswitch python3-networking-baremetal python3-openvswitch python3-unbound && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:41.849131913Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/neutron-base/neutron_sudoers /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:42.744796961Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers.d/neutron_sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:54.044382348Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:27:49.126765909Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:28:47.079155224Z",
                                                                              "created_by": "/bin/sh -c dnf -y install libseccomp podman && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:28:49.983056567Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:28:56.370338178Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-neutron-agent-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:34.44483218Z",
                                                                              "created_by": "/bin/sh -c dnf -y install python3-networking-ovn-metadata-agent && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:34.444891241Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER neutron",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:36.920021505Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 01 09:15:35 np0005604215.localdomain podman[158071]: 2026-02-01 09:15:35.069941399 +0000 UTC m=+0.090172084 container remove e8f71eedc0903bc3ad8d322be8c78b9b0d301bf239e5520b80b84e9b01b99e06 (image=registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1, name=ovn_metadata_agent, description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.description=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, io.k8s.display-name=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, container_name=ovn_metadata_agent, maintainer=OpenStack TripleO Team, summary=Red Hat OpenStack Platform 17.1 neutron-metadata-agent-ovn, release=1766032510, com.redhat.component=openstack-neutron-metadata-agent-ovn-container, vendor=Red Hat, Inc., managed_by=tripleo_ansible, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-neutron-metadata-agent-ovn, version=17.1.13, batch=17.1_20260112.1, tcib_managed=true, org.opencontainers.image.created=2026-01-12T22:56:19Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.5, org.opencontainers.image.revision=b1428c69debc6a325367957538a456c4ca8d9a0f, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, url=https://www.redhat.com, architecture=x86_64, config_id=tripleo_step4, distribution-scope=public, build-date=2026-01-12T22:56:19Z, cpe=cpe:/a:redhat:openstack:17.1::el9, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '08ca8fb8877681656a098784127ead43'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-neutron-metadata-agent-ovn:17.1', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'start_order': 1, 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', 
'/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/neutron:/var/log/neutron:z', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/neutron:/var/lib/kolla/config_files/src:ro', '/lib/modules:/lib/modules:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/run/netns:/run/netns:shared', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro']}, vcs-type=git, name=rhosp-rhel9/openstack-neutron-metadata-agent-ovn, io.openshift.expose-services=, vcs-ref=b1428c69debc6a325367957538a456c4ca8d9a0f, konflux.additional-tags=17.1.13 17.1_20260112.1)
Feb 01 09:15:35 np0005604215.localdomain python3[158019]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ovn_metadata_agent
Feb 01 09:15:35 np0005604215.localdomain podman[158084]: 
Feb 01 09:15:35 np0005604215.localdomain podman[158084]: 2026-02-01 09:15:35.175193307 +0000 UTC m=+0.083183911 container create 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, managed_by=edpm_ansible, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:15:35 np0005604215.localdomain podman[158084]: 2026-02-01 09:15:35.135078335 +0000 UTC m=+0.043068989 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 01 09:15:35 np0005604215.localdomain python3[158019]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311 --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 01 09:15:35 np0005604215.localdomain sudo[158017]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:35 np0005604215.localdomain sudo[158208]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ryzffkoxmwdzxrescacybfnwhmagbiai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937335.5737824-1262-89993394164246/AnsiballZ_stat.py
Feb 01 09:15:35 np0005604215.localdomain sudo[158208]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:36 np0005604215.localdomain python3.9[158210]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:15:36 np0005604215.localdomain sudo[158208]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43372 DF PROTO=TCP SPT=55622 DPT=9102 SEQ=2315680981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655BC4E0000000001030307) 
Feb 01 09:15:36 np0005604215.localdomain sudo[158302]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaskynnsruedryfynlagefykkczpndcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937336.4007502-1289-244082713999423/AnsiballZ_file.py
Feb 01 09:15:36 np0005604215.localdomain sudo[158302]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:15:36 np0005604215.localdomain podman[158305]: 2026-02-01 09:15:36.865828431 +0000 UTC m=+0.077128301 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 01 09:15:36 np0005604215.localdomain python3.9[158304]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:15:36 np0005604215.localdomain sudo[158302]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:36 np0005604215.localdomain podman[158305]: 2026-02-01 09:15:36.967954477 +0000 UTC m=+0.179254307 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:15:36 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:15:37 np0005604215.localdomain sudo[158373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ocqoqmzixgxdmjgktmkkxsdxfyniqhen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937336.4007502-1289-244082713999423/AnsiballZ_stat.py
Feb 01 09:15:37 np0005604215.localdomain sudo[158373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:37 np0005604215.localdomain python3.9[158375]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:15:37 np0005604215.localdomain sudo[158373]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:37 np0005604215.localdomain sudo[158464]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czbfoqypsfsbcjhniwohnnsuxdfehsqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937337.465682-1289-60329146993601/AnsiballZ_copy.py
Feb 01 09:15:37 np0005604215.localdomain sudo[158464]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:38 np0005604215.localdomain python3.9[158466]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769937337.465682-1289-60329146993601/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:15:38 np0005604215.localdomain sudo[158464]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:38 np0005604215.localdomain sudo[158510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tizuvjrhnxeayqiuwsvtzepvpljguwjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937337.465682-1289-60329146993601/AnsiballZ_systemd.py
Feb 01 09:15:38 np0005604215.localdomain sudo[158510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:38 np0005604215.localdomain python3.9[158512]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:15:38 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:15:38 np0005604215.localdomain systemd-rc-local-generator[158535]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:15:38 np0005604215.localdomain systemd-sysv-generator[158539]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:15:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:15:38 np0005604215.localdomain sudo[158510]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:39 np0005604215.localdomain sudo[158592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ctqfizlhsysevpowyzcxsltzdwercwxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937337.465682-1289-60329146993601/AnsiballZ_systemd.py
Feb 01 09:15:39 np0005604215.localdomain sudo[158592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:39 np0005604215.localdomain python3.9[158594]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:15:39 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:15:39 np0005604215.localdomain systemd-rc-local-generator[158620]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:15:39 np0005604215.localdomain systemd-sysv-generator[158623]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:15:39 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:15:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1963 DF PROTO=TCP SPT=47986 DPT=9100 SEQ=1472699613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655C8CD0000000001030307) 
Feb 01 09:15:39 np0005604215.localdomain systemd[1]: Starting ovn_metadata_agent container...
Feb 01 09:15:39 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:15:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1847993324db0de5919ae17da5f618058e92eb21b99b51136b8b34c925eccdd/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 01 09:15:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f1847993324db0de5919ae17da5f618058e92eb21b99b51136b8b34c925eccdd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:15:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:15:40 np0005604215.localdomain podman[158636]: 2026-02-01 09:15:40.030895289 +0000 UTC m=+0.152396264 container init 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: + sudo -E kolla_set_configs
Feb 01 09:15:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:15:40 np0005604215.localdomain podman[158636]: 2026-02-01 09:15:40.074602411 +0000 UTC m=+0.196103416 container start 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 01 09:15:40 np0005604215.localdomain edpm-start-podman-container[158636]: ovn_metadata_agent
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Validating config file
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Copying service configuration files
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Writing out command to execute
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: ++ cat /run_command
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: + CMD=neutron-ovn-metadata-agent
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: + ARGS=
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: + sudo kolla_copy_cacerts
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: Running command: 'neutron-ovn-metadata-agent'
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: + [[ ! -n '' ]]
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: + . kolla_extend_start
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: + umask 0022
Feb 01 09:15:40 np0005604215.localdomain ovn_metadata_agent[158650]: + exec neutron-ovn-metadata-agent
Feb 01 09:15:40 np0005604215.localdomain podman[158658]: 2026-02-01 09:15:40.164069508 +0000 UTC m=+0.083862916 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=starting, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 01 09:15:40 np0005604215.localdomain podman[158658]: 2026-02-01 09:15:40.242689812 +0000 UTC m=+0.162483170 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:15:40 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:15:40 np0005604215.localdomain edpm-start-podman-container[158635]: Creating additional drop-in dependency for "ovn_metadata_agent" (412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5)
Feb 01 09:15:40 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:15:40 np0005604215.localdomain systemd-sysv-generator[158731]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:15:40 np0005604215.localdomain systemd-rc-local-generator[158726]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:15:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:15:40 np0005604215.localdomain systemd[1]: Started ovn_metadata_agent container.
Feb 01 09:15:40 np0005604215.localdomain sudo[158592]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.698 158655 INFO neutron.common.config [-] Logging enabled!
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.698 158655 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev44
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.698 158655 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.699 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.699 158655 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.699 158655 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.699 158655 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.699 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.699 158655 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.699 158655 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.699 158655 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.699 158655 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.700 158655 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.700 158655 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.700 158655 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.700 158655 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.700 158655 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.700 158655 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.700 158655 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.700 158655 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.700 158655 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.700 158655 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.701 158655 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.701 158655 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.701 158655 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.701 158655 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.701 158655 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.701 158655 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.701 158655 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.701 158655 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.701 158655 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.701 158655 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.701 158655 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.702 158655 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.702 158655 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.702 158655 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.702 158655 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.702 158655 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.702 158655 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = np0005604215.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.702 158655 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.702 158655 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.702 158655 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.703 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.703 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.703 158655 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.703 158655 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.703 158655 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.703 158655 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.703 158655 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.703 158655 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.703 158655 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.703 158655 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.703 158655 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.704 158655 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.704 158655 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.704 158655 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.704 158655 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.704 158655 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.704 158655 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.704 158655 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.704 158655 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.704 158655 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.704 158655 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.705 158655 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.705 158655 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.705 158655 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.705 158655 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.705 158655 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.705 158655 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.705 158655 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.705 158655 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.705 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.706 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.706 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.706 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.706 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.706 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.706 158655 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.706 158655 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.706 158655 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.706 158655 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.707 158655 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.707 158655 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.707 158655 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.707 158655 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.707 158655 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.707 158655 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.707 158655 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.707 158655 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.707 158655 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.707 158655 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.708 158655 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.708 158655 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.708 158655 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.708 158655 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.708 158655 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.708 158655 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.708 158655 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.708 158655 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.708 158655 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.708 158655 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.708 158655 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.709 158655 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.709 158655 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.709 158655 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.709 158655 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.709 158655 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.709 158655 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.709 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.709 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.709 158655 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.710 158655 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.710 158655 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.710 158655 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.710 158655 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.710 158655 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.710 158655 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.710 158655 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.710 158655 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.710 158655 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.710 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.711 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.711 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.711 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.711 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.711 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.711 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.711 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.711 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.711 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.712 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.712 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.712 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.712 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.712 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.712 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.712 158655 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.712 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.712 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.713 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.713 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.713 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.713 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.713 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.713 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.713 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.713 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.713 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.713 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.714 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.714 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.714 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.714 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.714 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.714 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.714 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.714 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.714 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.714 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.715 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.715 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.715 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.715 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.715 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.715 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.715 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.715 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.715 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.716 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.716 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.716 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.716 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.716 158655 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.716 158655 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.716 158655 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.716 158655 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.716 158655 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.716 158655 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.717 158655 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.717 158655 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.717 158655 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.717 158655 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.717 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.717 158655 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.717 158655 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.717 158655 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.717 158655 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.718 158655 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.718 158655 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.718 158655 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.718 158655 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.718 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.718 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.718 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.718 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.718 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.718 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.719 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.719 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.719 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.719 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.719 158655 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.719 158655 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.719 158655 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.719 158655 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.719 158655 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.719 158655 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.720 158655 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.720 158655 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.720 158655 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.720 158655 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.720 158655 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.720 158655 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.720 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.720 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.720 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.721 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.721 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.721 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.721 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.721 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.721 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.721 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.721 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.721 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.721 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.722 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.722 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.722 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.722 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.722 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.722 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.722 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.722 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.722 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.722 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.723 158655 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.723 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.723 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.723 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.723 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.723 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.723 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.723 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.723 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.723 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.724 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.724 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.724 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.724 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.724 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.724 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.724 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.724 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.724 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.725 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.725 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.725 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.725 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.725 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.725 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.725 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.725 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.725 158655 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.725 158655 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.726 158655 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.726 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.726 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.726 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.726 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.726 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.726 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.726 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.726 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.726 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.727 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.727 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.727 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.727 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.727 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.727 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.727 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.727 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.727 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.728 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.728 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.728 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.728 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.728 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.728 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.728 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.728 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.728 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.728 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.729 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.729 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.729 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.729 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.729 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.729 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.729 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.729 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.729 158655 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.729 158655 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.738 158655 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.738 158655 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.738 158655 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.738 158655 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.739 158655 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.752 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name f18e6148-4a7e-452d-80cb-72c86b59e439 (UUID: f18e6148-4a7e-452d-80cb-72c86b59e439) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.770 158655 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.770 158655 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.770 158655 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.770 158655 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.772 158655 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.774 158655 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.783 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', 'f18e6148-4a7e-452d-80cb-72c86b59e439'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], external_ids={'neutron:ovn-metadata-id': '313b4605-18bf-5934-ac37-75f1eb3b119e', 'neutron:ovn-metadata-sb-cfg': '1'}, name=f18e6148-4a7e-452d-80cb-72c86b59e439, nb_cfg_timestamp=1769937286372, nb_cfg=4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.784 158655 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f7fd10c7b20>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.785 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.785 158655 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.785 158655 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.785 158655 INFO oslo_service.service [-] Starting 1 workers
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.787 158655 DEBUG oslo_service.service [-] Started child 158755 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.789 158655 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpez0k0e0b/privsep.sock']
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.791 158755 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-259738'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.817 158755 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.818 158755 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.818 158755 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.824 158755 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connecting...
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.825 158755 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:ovsdbserver-sb.openstack.svc:6642: connected
Feb 01 09:15:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:41.838 158755 INFO eventlet.wsgi.server [-] (158755) wsgi starting up on http:/var/lib/neutron/metadata_proxy
Feb 01 09:15:42 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:42.386 158655 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 01 09:15:42 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:42.387 158655 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpez0k0e0b/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 01 09:15:42 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:42.274 158836 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 01 09:15:42 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:42.279 158836 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 01 09:15:42 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:42.282 158836 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 01 09:15:42 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:42.283 158836 INFO oslo.privsep.daemon [-] privsep daemon running as pid 158836
Feb 01 09:15:42 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:42.390 158836 DEBUG oslo.privsep.daemon [-] privsep: reply[d32cc769-0367-47d5-9345-6f725f7094ca]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:15:42 np0005604215.localdomain python3.9[158835]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 01 09:15:42 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:42.788 158836 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:15:42 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:42.788 158836 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:15:42 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:42.788 158836 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:15:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20652 DF PROTO=TCP SPT=60234 DPT=9101 SEQ=1325276629 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655D60D0000000001030307) 
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.224 158836 DEBUG oslo.privsep.daemon [-] privsep: reply[eaaaf608-b59e-444a-a034-f44dd306af60]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.228 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, column=external_ids, values=({'neutron:ovn-metadata-id': '313b4605-18bf-5934-ac37-75f1eb3b119e'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.229 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.230 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.240 158655 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.240 158655 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.240 158655 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.240 158655 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.240 158655 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.240 158655 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.240 158655 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.241 158655 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.241 158655 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.241 158655 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.241 158655 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.241 158655 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.242 158655 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.242 158655 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.242 158655 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.242 158655 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.242 158655 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.242 158655 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.242 158655 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.243 158655 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.243 158655 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.243 158655 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.243 158655 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.243 158655 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.243 158655 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.244 158655 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.244 158655 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.244 158655 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.244 158655 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.244 158655 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.244 158655 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.244 158655 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.245 158655 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.245 158655 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.245 158655 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.245 158655 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.245 158655 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.245 158655 DEBUG oslo_service.service [-] host                           = np0005604215.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.246 158655 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.246 158655 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.246 158655 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.246 158655 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.246 158655 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.246 158655 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.246 158655 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.247 158655 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.247 158655 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.247 158655 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.247 158655 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.247 158655 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.247 158655 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.247 158655 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.248 158655 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.248 158655 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.248 158655 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.248 158655 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.248 158655 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.248 158655 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.248 158655 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.248 158655 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.249 158655 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.249 158655 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.249 158655 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.249 158655 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.249 158655 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.249 158655 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.249 158655 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.250 158655 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.250 158655 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.250 158655 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.250 158655 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.250 158655 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.250 158655 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.250 158655 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.251 158655 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.251 158655 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.251 158655 DEBUG oslo_service.service [-] nova_metadata_protocol         = http log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.251 158655 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.251 158655 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.251 158655 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.251 158655 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.251 158655 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.252 158655 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.252 158655 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.252 158655 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.252 158655 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.252 158655 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.252 158655 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.252 158655 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.253 158655 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.253 158655 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.253 158655 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.253 158655 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.253 158655 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.253 158655 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.253 158655 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.253 158655 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.254 158655 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.254 158655 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.254 158655 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.254 158655 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.254 158655 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.254 158655 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.254 158655 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.255 158655 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.255 158655 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.255 158655 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.255 158655 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.255 158655 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.255 158655 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.255 158655 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.256 158655 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.256 158655 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.256 158655 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.256 158655 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.256 158655 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.256 158655 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.256 158655 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.257 158655 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.257 158655 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.257 158655 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.257 158655 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.257 158655 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.257 158655 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.258 158655 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.258 158655 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.258 158655 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.258 158655 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.258 158655 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.258 158655 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.258 158655 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.259 158655 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.259 158655 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.259 158655 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.259 158655 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.259 158655 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.259 158655 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.259 158655 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.260 158655 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.260 158655 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.260 158655 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.260 158655 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.260 158655 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.260 158655 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.260 158655 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.261 158655 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.261 158655 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.261 158655 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.261 158655 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.261 158655 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.261 158655 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.261 158655 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.262 158655 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.262 158655 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.262 158655 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.262 158655 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.262 158655 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.262 158655 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.262 158655 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.262 158655 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.263 158655 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.263 158655 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.263 158655 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.263 158655 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.263 158655 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.263 158655 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.263 158655 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.264 158655 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.264 158655 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.264 158655 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.264 158655 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.264 158655 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.264 158655 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.264 158655 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.264 158655 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.265 158655 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.265 158655 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.265 158655 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.265 158655 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.265 158655 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.265 158655 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.265 158655 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.266 158655 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.266 158655 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.266 158655 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.266 158655 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.266 158655 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.266 158655 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.266 158655 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.266 158655 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.267 158655 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.267 158655 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.267 158655 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.267 158655 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.267 158655 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.267 158655 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.267 158655 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.268 158655 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.268 158655 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.268 158655 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.268 158655 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.268 158655 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.268 158655 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.269 158655 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.269 158655 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.269 158655 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.269 158655 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.269 158655 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.269 158655 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.269 158655 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.269 158655 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.270 158655 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.270 158655 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.270 158655 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.270 158655 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.270 158655 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.270 158655 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.270 158655 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.271 158655 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.271 158655 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.271 158655 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.271 158655 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.271 158655 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.271 158655 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.271 158655 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.272 158655 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.272 158655 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.272 158655 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.272 158655 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.272 158655 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.272 158655 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.272 158655 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.272 158655 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.273 158655 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.273 158655 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.273 158655 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.273 158655 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.273 158655 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.273 158655 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.274 158655 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.274 158655 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.274 158655 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.274 158655 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.274 158655 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.274 158655 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.275 158655 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.275 158655 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.275 158655 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.275 158655 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.275 158655 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.275 158655 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.276 158655 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.276 158655 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.276 158655 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.276 158655 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.276 158655 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = tcp:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.276 158655 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.277 158655 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.277 158655 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.277 158655 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.277 158655 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.277 158655 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.277 158655 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.278 158655 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.278 158655 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.278 158655 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.278 158655 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.278 158655 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.278 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.279 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.279 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.279 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.279 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.279 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.279 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.280 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.280 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.280 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.280 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.280 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.280 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.281 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.281 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.281 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.281 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.281 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.281 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.282 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.282 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.282 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.282 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.282 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.282 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.282 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.283 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.283 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.283 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.283 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.283 158655 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.283 158655 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.284 158655 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.284 158655 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.284 158655 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:15:43 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:15:43.284 158655 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 01 09:15:43 np0005604215.localdomain sudo[158930]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhudckrzphbmtgvezlzntuizqhnyapib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937343.164618-1424-268132857875924/AnsiballZ_stat.py
Feb 01 09:15:43 np0005604215.localdomain sudo[158930]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:43 np0005604215.localdomain python3.9[158932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:15:43 np0005604215.localdomain sudo[158930]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:43 np0005604215.localdomain sudo[159005]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgojwnsxmeyubtxnxvednkbislifjnez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937343.164618-1424-268132857875924/AnsiballZ_copy.py
Feb 01 09:15:43 np0005604215.localdomain sudo[159005]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:44 np0005604215.localdomain python3.9[159007]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937343.164618-1424-268132857875924/.source.yaml _original_basename=.xipr1w40 follow=False checksum=08b98aaf8b4739d4298bc1690447f4cee3a9ba74 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:15:44 np0005604215.localdomain sudo[159005]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:44 np0005604215.localdomain sshd[153448]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:15:44 np0005604215.localdomain systemd-logind[761]: Session 51 logged out. Waiting for processes to exit.
Feb 01 09:15:44 np0005604215.localdomain systemd[1]: session-51.scope: Deactivated successfully.
Feb 01 09:15:44 np0005604215.localdomain systemd[1]: session-51.scope: Consumed 32.383s CPU time.
Feb 01 09:15:44 np0005604215.localdomain systemd-logind[761]: Removed session 51.
Feb 01 09:15:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28017 DF PROTO=TCP SPT=41122 DPT=9882 SEQ=1891771518 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655DF0D0000000001030307) 
Feb 01 09:15:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43374 DF PROTO=TCP SPT=55622 DPT=9102 SEQ=2315680981 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655ED0E0000000001030307) 
Feb 01 09:15:50 np0005604215.localdomain sshd[159022]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:15:50 np0005604215.localdomain sshd[159022]: Accepted publickey for zuul from 192.168.122.30 port 58600 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:15:50 np0005604215.localdomain systemd-logind[761]: New session 52 of user zuul.
Feb 01 09:15:50 np0005604215.localdomain systemd[1]: Started Session 52 of User zuul.
Feb 01 09:15:50 np0005604215.localdomain sshd[159022]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:15:51 np0005604215.localdomain python3.9[159115]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:15:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1965 DF PROTO=TCP SPT=47986 DPT=9100 SEQ=1472699613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA655F90E0000000001030307) 
Feb 01 09:15:52 np0005604215.localdomain sudo[159209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nipobyntbrjssyedoyihgywivnjlcqqc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937351.864211-60-81758168953315/AnsiballZ_command.py
Feb 01 09:15:52 np0005604215.localdomain sudo[159209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:52 np0005604215.localdomain python3.9[159211]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:15:52 np0005604215.localdomain sudo[159209]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:53 np0005604215.localdomain sudo[159314]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqggeytjjixutmpgoldjjpehmlrrhfnu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937352.7543135-83-176647813928194/AnsiballZ_command.py
Feb 01 09:15:53 np0005604215.localdomain sudo[159314]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:53 np0005604215.localdomain python3.9[159316]: ansible-ansible.legacy.command Invoked with _raw_params=podman stop nova_virtlogd _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:15:53 np0005604215.localdomain systemd[1]: libpod-80e1d00e5bec4cd7f19cd10160562eb6b3744e0bb96cf5719096238b43ba4ee5.scope: Deactivated successfully.
Feb 01 09:15:53 np0005604215.localdomain podman[159317]: 2026-02-01 09:15:53.28154865 +0000 UTC m=+0.078452390 container died 80e1d00e5bec4cd7f19cd10160562eb6b3744e0bb96cf5719096238b43ba4ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, release=1766032510, vendor=Red Hat, Inc., io.buildah.version=1.41.5, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, tcib_managed=true, name=rhosp-rhel9/openstack-nova-libvirt, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, io.openshift.expose-services=, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, batch=17.1_20260112.1, org.opencontainers.image.created=2026-01-12T23:31:49Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, vcs-type=git, com.redhat.component=openstack-nova-libvirt-container, distribution-scope=public, konflux.additional-tags=17.1.13 17.1_20260112.1, cpe=cpe:/a:redhat:openstack:17.1::el9, url=https://www.redhat.com, build-date=2026-01-12T23:31:49Z, architecture=x86_64, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0)
Feb 01 09:15:53 np0005604215.localdomain systemd[1]: tmp-crun.Cf2hHx.mount: Deactivated successfully.
Feb 01 09:15:53 np0005604215.localdomain podman[159317]: 2026-02-01 09:15:53.317461169 +0000 UTC m=+0.114364909 container cleanup 80e1d00e5bec4cd7f19cd10160562eb6b3744e0bb96cf5719096238b43ba4ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, name=rhosp-rhel9/openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, batch=17.1_20260112.1, description=Red Hat OpenStack Platform 17.1 nova-libvirt, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, distribution-scope=public, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, version=17.1.13, release=1766032510, tcib_managed=true, maintainer=OpenStack TripleO Team, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, url=https://www.redhat.com, io.buildah.version=1.41.5, konflux.additional-tags=17.1.13 17.1_20260112.1, io.openshift.expose-services=, build-date=2026-01-12T23:31:49Z, com.redhat.component=openstack-nova-libvirt-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, cpe=cpe:/a:redhat:openstack:17.1::el9)
Feb 01 09:15:53 np0005604215.localdomain sudo[159314]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:53 np0005604215.localdomain podman[159332]: 2026-02-01 09:15:53.35841566 +0000 UTC m=+0.072406300 container remove 80e1d00e5bec4cd7f19cd10160562eb6b3744e0bb96cf5719096238b43ba4ee5 (image=registry.redhat.io/rhosp-rhel9/openstack-nova-libvirt:17.1, name=nova_virtlogd, cpe=cpe:/a:redhat:openstack:17.1::el9, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-libvirt, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, build-date=2026-01-12T23:31:49Z, org.opencontainers.image.created=2026-01-12T23:31:49Z, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe, konflux.additional-tags=17.1.13 17.1_20260112.1, batch=17.1_20260112.1, vendor=Red Hat, Inc., com.redhat.component=openstack-nova-libvirt-container, url=https://www.redhat.com, version=17.1.13, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, vcs-type=git, release=1766032510, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-libvirt, tcib_managed=true, summary=Red Hat OpenStack Platform 17.1 nova-libvirt, description=Red Hat OpenStack Platform 17.1 nova-libvirt, name=rhosp-rhel9/openstack-nova-libvirt, io.buildah.version=1.41.5, io.openshift.expose-services=, io.k8s.description=Red Hat OpenStack Platform 17.1 nova-libvirt, maintainer=OpenStack TripleO Team)
Feb 01 09:15:53 np0005604215.localdomain systemd[1]: libpod-conmon-80e1d00e5bec4cd7f19cd10160562eb6b3744e0bb96cf5719096238b43ba4ee5.scope: Deactivated successfully.
Feb 01 09:15:54 np0005604215.localdomain systemd[1]: tmp-crun.OBPeod.mount: Deactivated successfully.
Feb 01 09:15:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-79513147d587ecdcd7bb2edc01bb3b7dc549ee20844dd0dc1e7a6b286443d3ff-merged.mount: Deactivated successfully.
Feb 01 09:15:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-80e1d00e5bec4cd7f19cd10160562eb6b3744e0bb96cf5719096238b43ba4ee5-userdata-shm.mount: Deactivated successfully.
Feb 01 09:15:54 np0005604215.localdomain sudo[159434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chucgqxjaledmsvwiqckbsjrozqvxfij ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937354.0601122-113-14456416579707/AnsiballZ_systemd_service.py
Feb 01 09:15:54 np0005604215.localdomain sudo[159434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:15:54 np0005604215.localdomain python3.9[159436]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:15:54 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:15:54 np0005604215.localdomain systemd-rc-local-generator[159459]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:15:54 np0005604215.localdomain systemd-sysv-generator[159464]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:15:54 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:15:55 np0005604215.localdomain sudo[159434]: pam_unix(sudo:session): session closed for user root
Feb 01 09:15:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20654 DF PROTO=TCP SPT=60234 DPT=9101 SEQ=1325276629 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656070D0000000001030307) 
Feb 01 09:15:56 np0005604215.localdomain python3.9[159561]: ansible-ansible.builtin.service_facts Invoked
Feb 01 09:15:56 np0005604215.localdomain network[159578]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 09:15:56 np0005604215.localdomain network[159579]: 'network-scripts' will be removed from distribution in near future.
Feb 01 09:15:56 np0005604215.localdomain network[159580]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 09:15:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:16:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9967 DF PROTO=TCP SPT=55966 DPT=9882 SEQ=1571398132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65617ED0000000001030307) 
Feb 01 09:16:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9968 DF PROTO=TCP SPT=55966 DPT=9882 SEQ=1571398132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6561C0E0000000001030307) 
Feb 01 09:16:01 np0005604215.localdomain sudo[159737]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:16:01 np0005604215.localdomain sudo[159737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:16:01 np0005604215.localdomain sudo[159737]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:01 np0005604215.localdomain sudo[159765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:16:01 np0005604215.localdomain sudo[159765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:16:01 np0005604215.localdomain sudo[159810]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edybqpbevphnkmpzzrftcxtmcofqdobj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937361.4449036-170-28724820248858/AnsiballZ_systemd_service.py
Feb 01 09:16:01 np0005604215.localdomain sudo[159810]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:02 np0005604215.localdomain python3.9[159812]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:16:02 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:16:02 np0005604215.localdomain systemd-rc-local-generator[159875]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:16:02 np0005604215.localdomain systemd-sysv-generator[159878]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:16:02 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:16:02 np0005604215.localdomain systemd[1]: Stopped target tripleo_nova_libvirt.target.
Feb 01 09:16:02 np0005604215.localdomain sudo[159810]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:02 np0005604215.localdomain systemd[1]: tmp-crun.mRrbdc.mount: Deactivated successfully.
Feb 01 09:16:02 np0005604215.localdomain podman[159939]: 2026-02-01 09:16:02.577141101 +0000 UTC m=+0.092687934 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, vcs-type=git, release=1764794109, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64)
Feb 01 09:16:02 np0005604215.localdomain podman[159939]: 2026-02-01 09:16:02.692210064 +0000 UTC m=+0.207756927 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_BRANCH=main, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, RELEASE=main)
Feb 01 09:16:02 np0005604215.localdomain sudo[160061]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qghhquvvjlvlgjwujtypdoypjrrujvar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937362.5142536-170-133595949463774/AnsiballZ_systemd_service.py
Feb 01 09:16:02 np0005604215.localdomain sudo[160061]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:02 np0005604215.localdomain sudo[159765]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:03 np0005604215.localdomain sudo[160082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:16:03 np0005604215.localdomain sudo[160082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:16:03 np0005604215.localdomain sudo[160082]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:03 np0005604215.localdomain sudo[160097]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:16:03 np0005604215.localdomain sudo[160097]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:16:03 np0005604215.localdomain python3.9[160065]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:16:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9969 DF PROTO=TCP SPT=55966 DPT=9882 SEQ=1571398132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656240E0000000001030307) 
Feb 01 09:16:03 np0005604215.localdomain sudo[160061]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:03 np0005604215.localdomain sudo[160214]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zuzvsdziaprcjarcfzimekidxrvbrnxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937363.233144-170-234735951879865/AnsiballZ_systemd_service.py
Feb 01 09:16:03 np0005604215.localdomain sudo[160214]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:03 np0005604215.localdomain sudo[160097]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:03 np0005604215.localdomain python3.9[160218]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:16:03 np0005604215.localdomain sudo[160214]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:04 np0005604215.localdomain sudo[160310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:16:04 np0005604215.localdomain sudo[160310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:16:04 np0005604215.localdomain sudo[160310]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:04 np0005604215.localdomain sudo[160340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipeyvdxilidmaevsoewyyuajzkbngtcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937363.9936044-170-7119500424845/AnsiballZ_systemd_service.py
Feb 01 09:16:04 np0005604215.localdomain sudo[160340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:04 np0005604215.localdomain python3.9[160343]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:16:04 np0005604215.localdomain sudo[160340]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:05 np0005604215.localdomain sudo[160434]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epiwudieejlpzstweerxfbbxkbkjhqjo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937364.9603784-170-165593253143302/AnsiballZ_systemd_service.py
Feb 01 09:16:05 np0005604215.localdomain sudo[160434]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:05 np0005604215.localdomain python3.9[160436]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:16:05 np0005604215.localdomain sudo[160434]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:05 np0005604215.localdomain sudo[160527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nuuphuniwqlwgjlfqfmcibqbwzmqpurs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937365.6934435-170-181223112014729/AnsiballZ_systemd_service.py
Feb 01 09:16:05 np0005604215.localdomain sudo[160527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:06 np0005604215.localdomain python3.9[160529]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:16:06 np0005604215.localdomain sudo[160527]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3112 DF PROTO=TCP SPT=40488 DPT=9102 SEQ=1685993537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656318D0000000001030307) 
Feb 01 09:16:06 np0005604215.localdomain sudo[160620]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-plekzlmjxjzeqztgwvxkhhvxshhbcfsy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937366.5147278-170-80072153691674/AnsiballZ_systemd_service.py
Feb 01 09:16:06 np0005604215.localdomain sudo[160620]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:07 np0005604215.localdomain python3.9[160622]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:16:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:16:07 np0005604215.localdomain sudo[160620]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:07 np0005604215.localdomain podman[160624]: 2026-02-01 09:16:07.213670454 +0000 UTC m=+0.059092399 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, tcib_managed=true)
Feb 01 09:16:07 np0005604215.localdomain podman[160624]: 2026-02-01 09:16:07.294740487 +0000 UTC m=+0.140162452 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 01 09:16:07 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:16:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36679 DF PROTO=TCP SPT=33312 DPT=9100 SEQ=1524629492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6563E0D0000000001030307) 
Feb 01 09:16:10 np0005604215.localdomain sudo[160740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qqjmeihtswcwcvqzongcpvefzbleorxb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937370.037471-327-252031722875507/AnsiballZ_file.py
Feb 01 09:16:10 np0005604215.localdomain sudo[160740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:16:10 np0005604215.localdomain podman[160743]: 2026-02-01 09:16:10.578673895 +0000 UTC m=+0.081616133 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 01 09:16:10 np0005604215.localdomain podman[160743]: 2026-02-01 09:16:10.609323704 +0000 UTC m=+0.112265822 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:16:10 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:16:10 np0005604215.localdomain python3.9[160742]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:10 np0005604215.localdomain sudo[160740]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:11 np0005604215.localdomain sudo[160851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cphlutiyexcpqlvwpobjqatzaaklhevu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937370.82206-327-106592567803055/AnsiballZ_file.py
Feb 01 09:16:11 np0005604215.localdomain sudo[160851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:11 np0005604215.localdomain python3.9[160853]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:11 np0005604215.localdomain sudo[160851]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:11 np0005604215.localdomain sudo[160943]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mseurewphqtntoaqlosolchksqmbpotp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937371.4622092-327-81537711959943/AnsiballZ_file.py
Feb 01 09:16:11 np0005604215.localdomain sudo[160943]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:11 np0005604215.localdomain python3.9[160945]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:11 np0005604215.localdomain sudo[160943]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:12 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20655 DF PROTO=TCP SPT=60234 DPT=9101 SEQ=1325276629 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656470D0000000001030307) 
Feb 01 09:16:12 np0005604215.localdomain sudo[161035]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbemmgzfwqwtmlrgrvwalefqsguzkmab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937372.097729-327-30541893719508/AnsiballZ_file.py
Feb 01 09:16:12 np0005604215.localdomain sudo[161035]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:12 np0005604215.localdomain python3.9[161037]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:12 np0005604215.localdomain sudo[161035]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:13 np0005604215.localdomain sudo[161127]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jekycmmqrixexbtcvaofxjchjokbqfdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937372.7196372-327-206935438127868/AnsiballZ_file.py
Feb 01 09:16:13 np0005604215.localdomain sudo[161127]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:13 np0005604215.localdomain python3.9[161129]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:13 np0005604215.localdomain sudo[161127]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:13 np0005604215.localdomain sudo[161219]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmgucyrkqqzzzjjwrmbohuizcdcshcoe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937373.3809202-327-131044440230239/AnsiballZ_file.py
Feb 01 09:16:13 np0005604215.localdomain sudo[161219]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:13 np0005604215.localdomain sshd[161222]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:16:13 np0005604215.localdomain python3.9[161221]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:13 np0005604215.localdomain sudo[161219]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:14 np0005604215.localdomain sudo[161313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daspztewxcxucunqsnsrqkekzehnknrt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937374.0365396-327-126513273030482/AnsiballZ_file.py
Feb 01 09:16:14 np0005604215.localdomain sudo[161313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:14 np0005604215.localdomain sshd[161222]: Invalid user frappeuser from 85.206.171.113 port 60468
Feb 01 09:16:14 np0005604215.localdomain python3.9[161315]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:14 np0005604215.localdomain sudo[161313]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:14 np0005604215.localdomain sshd[161222]: Received disconnect from 85.206.171.113 port 60468:11: Bye Bye [preauth]
Feb 01 09:16:14 np0005604215.localdomain sshd[161222]: Disconnected from invalid user frappeuser 85.206.171.113 port 60468 [preauth]
Feb 01 09:16:15 np0005604215.localdomain sudo[161405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ortawwgpgbwbaxdrzazkwreeuapoloxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937374.8328333-476-277629174121230/AnsiballZ_file.py
Feb 01 09:16:15 np0005604215.localdomain sudo[161405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:15 np0005604215.localdomain python3.9[161407]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:15 np0005604215.localdomain sudo[161405]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=9971 DF PROTO=TCP SPT=55966 DPT=9882 SEQ=1571398132 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656550D0000000001030307) 
Feb 01 09:16:15 np0005604215.localdomain sudo[161497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wrapcsjctblibqgznulaqvargkpvkcnv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937375.4605587-476-34203271742199/AnsiballZ_file.py
Feb 01 09:16:15 np0005604215.localdomain sudo[161497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:15 np0005604215.localdomain python3.9[161499]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:15 np0005604215.localdomain sudo[161497]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:16 np0005604215.localdomain sudo[161589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmmcjbrdulzwdgorwamjzqucwmydehhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937376.095936-476-22840954377449/AnsiballZ_file.py
Feb 01 09:16:16 np0005604215.localdomain sudo[161589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:16 np0005604215.localdomain python3.9[161591]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:16 np0005604215.localdomain sudo[161589]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:16 np0005604215.localdomain sudo[161681]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wazsjsrxwgnmlnfrcztfhrgpzmrvgqtz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937376.6948512-476-80083061567148/AnsiballZ_file.py
Feb 01 09:16:16 np0005604215.localdomain sudo[161681]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:17 np0005604215.localdomain python3.9[161683]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:17 np0005604215.localdomain sudo[161681]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:17 np0005604215.localdomain sudo[161773]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utlohfuswuanyclpotdatsdvqpprgkpo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937377.312629-476-219317039082217/AnsiballZ_file.py
Feb 01 09:16:17 np0005604215.localdomain sudo[161773]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:17 np0005604215.localdomain python3.9[161775]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:17 np0005604215.localdomain sudo[161773]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:18 np0005604215.localdomain sudo[161865]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgmmuiaywmdassssevyhwqtbriggsivx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937377.9440005-476-87276731301512/AnsiballZ_file.py
Feb 01 09:16:18 np0005604215.localdomain sudo[161865]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:18 np0005604215.localdomain python3.9[161867]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:18 np0005604215.localdomain sudo[161865]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3114 DF PROTO=TCP SPT=40488 DPT=9102 SEQ=1685993537 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656610D0000000001030307) 
Feb 01 09:16:18 np0005604215.localdomain sudo[161957]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xeyngtxasjxwgbaqainmuwxmycqftvsa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937378.5758896-476-100956293389217/AnsiballZ_file.py
Feb 01 09:16:18 np0005604215.localdomain sudo[161957]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:19 np0005604215.localdomain python3.9[161959]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:16:19 np0005604215.localdomain sudo[161957]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:20 np0005604215.localdomain sudo[162049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmjqerfyihrvifrwyiwjdjtoflabbmdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937379.736813-629-226667034030295/AnsiballZ_command.py
Feb 01 09:16:20 np0005604215.localdomain sudo[162049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:20 np0005604215.localdomain python3.9[162051]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:16:20 np0005604215.localdomain sudo[162049]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:21 np0005604215.localdomain python3.9[162143]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 01 09:16:21 np0005604215.localdomain sudo[162233]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-isqiaqgbpqcvkwmeysoekhffgelkemql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937381.452801-683-110572860289136/AnsiballZ_systemd_service.py
Feb 01 09:16:21 np0005604215.localdomain sudo[162233]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:22 np0005604215.localdomain python3.9[162235]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:16:22 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:16:22 np0005604215.localdomain systemd-sysv-generator[162262]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:16:22 np0005604215.localdomain systemd-rc-local-generator[162256]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:16:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:16:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36681 DF PROTO=TCP SPT=33312 DPT=9100 SEQ=1524629492 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6566F0E0000000001030307) 
Feb 01 09:16:22 np0005604215.localdomain sudo[162233]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:22 np0005604215.localdomain sudo[162361]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rthkcmbqklgupeeablngfiyfozibvgyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937382.6434107-707-148895474075309/AnsiballZ_command.py
Feb 01 09:16:22 np0005604215.localdomain sudo[162361]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:23 np0005604215.localdomain python3.9[162363]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:16:23 np0005604215.localdomain sudo[162361]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:23 np0005604215.localdomain sudo[162454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xphlpacffgponytmzvdpsxhkfmtprzsg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937383.280618-707-16482904185436/AnsiballZ_command.py
Feb 01 09:16:23 np0005604215.localdomain sudo[162454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:23 np0005604215.localdomain python3.9[162456]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:16:23 np0005604215.localdomain sudo[162454]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:24 np0005604215.localdomain sudo[162547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxusitkpehsdccfuknsffwdmfwkmptct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937383.8794937-707-51008869515208/AnsiballZ_command.py
Feb 01 09:16:24 np0005604215.localdomain sudo[162547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:24 np0005604215.localdomain python3.9[162549]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:16:24 np0005604215.localdomain sudo[162547]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:24 np0005604215.localdomain sudo[162640]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tehvcukhiawswnmoysbqhxiywelwidez ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937384.5284514-707-224248315939331/AnsiballZ_command.py
Feb 01 09:16:24 np0005604215.localdomain sudo[162640]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:25 np0005604215.localdomain python3.9[162642]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:16:25 np0005604215.localdomain sudo[162640]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:25 np0005604215.localdomain sudo[162733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vloygikjdkzlezmfazhnzlsvmppboufo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937385.1613324-707-58765826880413/AnsiballZ_command.py
Feb 01 09:16:25 np0005604215.localdomain sudo[162733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45909 DF PROTO=TCP SPT=54704 DPT=9101 SEQ=2485291887 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6567B0D0000000001030307) 
Feb 01 09:16:25 np0005604215.localdomain python3.9[162735]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:16:25 np0005604215.localdomain sudo[162733]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:26 np0005604215.localdomain sudo[162826]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gijrflwjrntbjcwpmgokhcwptzzgsmxt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937385.730242-707-212043468734599/AnsiballZ_command.py
Feb 01 09:16:26 np0005604215.localdomain sudo[162826]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:26 np0005604215.localdomain python3.9[162828]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:16:26 np0005604215.localdomain sudo[162826]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:26 np0005604215.localdomain sudo[162919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vitjxszcwexrqguynafzhqlcvwlmvgaq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937386.4222484-707-270028340279240/AnsiballZ_command.py
Feb 01 09:16:26 np0005604215.localdomain sudo[162919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:26 np0005604215.localdomain python3.9[162921]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:16:26 np0005604215.localdomain sudo[162919]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:28 np0005604215.localdomain sudo[163012]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipoizdzefoffsjihgxwlnthrdtvvfwih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937387.658159-870-21123455285438/AnsiballZ_getent.py
Feb 01 09:16:28 np0005604215.localdomain sudo[163012]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:28 np0005604215.localdomain python3.9[163014]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Feb 01 09:16:28 np0005604215.localdomain sudo[163012]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:28 np0005604215.localdomain sudo[163105]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zjaddbgbvkqhtdwrxagxwpbxpqrstpjr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937388.447164-893-129139881785895/AnsiballZ_group.py
Feb 01 09:16:28 np0005604215.localdomain sudo[163105]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:29 np0005604215.localdomain python3.9[163107]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 01 09:16:29 np0005604215.localdomain groupadd[163108]: group added to /etc/group: name=libvirt, GID=42473
Feb 01 09:16:29 np0005604215.localdomain groupadd[163108]: group added to /etc/gshadow: name=libvirt
Feb 01 09:16:29 np0005604215.localdomain groupadd[163108]: new group: name=libvirt, GID=42473
Feb 01 09:16:29 np0005604215.localdomain sudo[163105]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:29 np0005604215.localdomain sudo[163203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xkvsngdhtrymwhijapgboxkcyczonpxf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937389.3505147-917-244321613179092/AnsiballZ_user.py
Feb 01 09:16:29 np0005604215.localdomain sudo[163203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12654 DF PROTO=TCP SPT=52276 DPT=9882 SEQ=1185292542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6568D1D0000000001030307) 
Feb 01 09:16:30 np0005604215.localdomain python3.9[163205]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604215.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 01 09:16:30 np0005604215.localdomain useradd[163207]: new user: name=libvirt, UID=42473, GID=42473, home=/home/libvirt, shell=/sbin/nologin, from=/dev/pts/0
Feb 01 09:16:30 np0005604215.localdomain sudo[163203]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:30 np0005604215.localdomain sudo[163303]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkofjrhcqqyeoqxtxxoooncnghiruswe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937390.6512728-950-184921144565359/AnsiballZ_setup.py
Feb 01 09:16:30 np0005604215.localdomain sudo[163303]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12655 DF PROTO=TCP SPT=52276 DPT=9882 SEQ=1185292542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656910D0000000001030307) 
Feb 01 09:16:31 np0005604215.localdomain python3.9[163305]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:16:31 np0005604215.localdomain sudo[163303]: pam_unix(sudo:session): session closed for user root
Feb 01 09:16:31 np0005604215.localdomain sudo[163357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-suicnzspxaqdkjclwtioidkxrpsjwrvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937390.6512728-950-184921144565359/AnsiballZ_dnf.py
Feb 01 09:16:31 np0005604215.localdomain sudo[163357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:16:32 np0005604215.localdomain python3.9[163359]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:16:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12656 DF PROTO=TCP SPT=52276 DPT=9882 SEQ=1185292542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656990D0000000001030307) 
Feb 01 09:16:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1971 DF PROTO=TCP SPT=37162 DPT=9102 SEQ=2952759018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656A6CD0000000001030307) 
Feb 01 09:16:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:16:37 np0005604215.localdomain podman[163426]: 2026-02-01 09:16:37.87053399 +0000 UTC m=+0.086588895 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 01 09:16:37 np0005604215.localdomain podman[163426]: 2026-02-01 09:16:37.975324299 +0000 UTC m=+0.191379154 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:16:37 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:16:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15171 DF PROTO=TCP SPT=59796 DPT=9100 SEQ=4276628298 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656B34E0000000001030307) 
Feb 01 09:16:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:16:40 np0005604215.localdomain podman[163455]: 2026-02-01 09:16:40.885019012 +0000 UTC m=+0.087027308 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 09:16:40 np0005604215.localdomain podman[163455]: 2026-02-01 09:16:40.894672351 +0000 UTC m=+0.096680637 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 01 09:16:40 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:16:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:16:41.732 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:16:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:16:41.733 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:16:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:16:41.733 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:16:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47358 DF PROTO=TCP SPT=44292 DPT=9101 SEQ=2416393332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656C04D0000000001030307) 
Feb 01 09:16:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12658 DF PROTO=TCP SPT=52276 DPT=9882 SEQ=1185292542 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656C90E0000000001030307) 
Feb 01 09:16:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=1973 DF PROTO=TCP SPT=37162 DPT=9102 SEQ=2952759018 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656D70D0000000001030307) 
Feb 01 09:16:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15173 DF PROTO=TCP SPT=59796 DPT=9100 SEQ=4276628298 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656E30D0000000001030307) 
Feb 01 09:16:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47360 DF PROTO=TCP SPT=44292 DPT=9101 SEQ=2416393332 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA656F10E0000000001030307) 
Feb 01 09:16:57 np0005604215.localdomain kernel: SELinux:  Converting 2744 SID table entries...
Feb 01 09:16:57 np0005604215.localdomain kernel: SELinux:  Context system_u:object_r:insights_client_cache_t:s0 became invalid (unmapped).
Feb 01 09:16:57 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 09:16:57 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 09:16:57 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 09:16:57 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 09:16:57 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 09:16:57 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 09:16:57 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 09:17:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18894 DF PROTO=TCP SPT=46190 DPT=9882 SEQ=1165195499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657024C0000000001030307) 
Feb 01 09:17:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18895 DF PROTO=TCP SPT=46190 DPT=9882 SEQ=1165195499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657064D0000000001030307) 
Feb 01 09:17:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18896 DF PROTO=TCP SPT=46190 DPT=9882 SEQ=1165195499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6570E4D0000000001030307) 
Feb 01 09:17:04 np0005604215.localdomain sudo[164513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:17:04 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=19 res=1
Feb 01 09:17:04 np0005604215.localdomain sudo[164513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:17:04 np0005604215.localdomain sudo[164513]: pam_unix(sudo:session): session closed for user root
Feb 01 09:17:04 np0005604215.localdomain sudo[164531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:17:04 np0005604215.localdomain sudo[164531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:17:05 np0005604215.localdomain sudo[164531]: pam_unix(sudo:session): session closed for user root
Feb 01 09:17:05 np0005604215.localdomain sudo[164581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:17:05 np0005604215.localdomain sudo[164581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:17:05 np0005604215.localdomain sudo[164581]: pam_unix(sudo:session): session closed for user root
Feb 01 09:17:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4776 DF PROTO=TCP SPT=58730 DPT=9102 SEQ=338499038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6571BCD0000000001030307) 
Feb 01 09:17:07 np0005604215.localdomain kernel: SELinux:  Converting 2747 SID table entries...
Feb 01 09:17:07 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 09:17:07 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 09:17:07 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 09:17:07 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 09:17:07 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 09:17:07 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 09:17:07 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 09:17:08 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=20 res=1
Feb 01 09:17:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:17:08 np0005604215.localdomain podman[164608]: 2026-02-01 09:17:08.878157646 +0000 UTC m=+0.083058308 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:17:08 np0005604215.localdomain podman[164608]: 2026-02-01 09:17:08.92234972 +0000 UTC m=+0.127250382 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Feb 01 09:17:08 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:17:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63894 DF PROTO=TCP SPT=54936 DPT=9100 SEQ=3599968820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657284D0000000001030307) 
Feb 01 09:17:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:17:11 np0005604215.localdomain podman[164637]: 2026-02-01 09:17:11.860841617 +0000 UTC m=+0.076723099 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 01 09:17:11 np0005604215.localdomain podman[164637]: 2026-02-01 09:17:11.869666651 +0000 UTC m=+0.085548133 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 01 09:17:11 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:17:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42629 DF PROTO=TCP SPT=53314 DPT=9101 SEQ=1484207914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657358D0000000001030307) 
Feb 01 09:17:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18898 DF PROTO=TCP SPT=46190 DPT=9882 SEQ=1165195499 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6573F0D0000000001030307) 
Feb 01 09:17:17 np0005604215.localdomain kernel: SELinux:  Converting 2750 SID table entries...
Feb 01 09:17:17 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 09:17:17 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 09:17:17 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 09:17:17 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 09:17:17 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 09:17:17 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 09:17:17 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 09:17:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4778 DF PROTO=TCP SPT=58730 DPT=9102 SEQ=338499038 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6574B0D0000000001030307) 
Feb 01 09:17:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63896 DF PROTO=TCP SPT=54936 DPT=9100 SEQ=3599968820 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657590D0000000001030307) 
Feb 01 09:17:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42631 DF PROTO=TCP SPT=53314 DPT=9101 SEQ=1484207914 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657650D0000000001030307) 
Feb 01 09:17:25 np0005604215.localdomain kernel: SELinux:  Converting 2750 SID table entries...
Feb 01 09:17:25 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 09:17:25 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 09:17:25 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 09:17:25 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 09:17:25 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 09:17:25 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 09:17:25 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 09:17:26 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:17:26 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=22 res=1
Feb 01 09:17:26 np0005604215.localdomain systemd-rc-local-generator[164696]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:17:26 np0005604215.localdomain systemd-sysv-generator[164701]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:17:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:17:26 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:17:26 np0005604215.localdomain systemd-sysv-generator[164737]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:17:26 np0005604215.localdomain systemd-rc-local-generator[164732]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:17:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:17:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36879 DF PROTO=TCP SPT=56628 DPT=9882 SEQ=47570602 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657777D0000000001030307) 
Feb 01 09:17:30 np0005604215.localdomain sshd[164751]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:17:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36880 DF PROTO=TCP SPT=56628 DPT=9882 SEQ=47570602 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6577B8D0000000001030307) 
Feb 01 09:17:31 np0005604215.localdomain sshd[164751]: Invalid user erpnext from 85.206.171.113 port 59820
Feb 01 09:17:31 np0005604215.localdomain sshd[164751]: Received disconnect from 85.206.171.113 port 59820:11: Bye Bye [preauth]
Feb 01 09:17:31 np0005604215.localdomain sshd[164751]: Disconnected from invalid user erpnext 85.206.171.113 port 59820 [preauth]
Feb 01 09:17:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36881 DF PROTO=TCP SPT=56628 DPT=9882 SEQ=47570602 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657838D0000000001030307) 
Feb 01 09:17:35 np0005604215.localdomain kernel: SELinux:  Converting 2751 SID table entries...
Feb 01 09:17:35 np0005604215.localdomain kernel: SELinux:  policy capability network_peer_controls=1
Feb 01 09:17:35 np0005604215.localdomain kernel: SELinux:  policy capability open_perms=1
Feb 01 09:17:35 np0005604215.localdomain kernel: SELinux:  policy capability extended_socket_class=1
Feb 01 09:17:35 np0005604215.localdomain kernel: SELinux:  policy capability always_check_network=0
Feb 01 09:17:35 np0005604215.localdomain kernel: SELinux:  policy capability cgroup_seclabel=1
Feb 01 09:17:35 np0005604215.localdomain kernel: SELinux:  policy capability nnp_nosuid_transition=1
Feb 01 09:17:35 np0005604215.localdomain kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Feb 01 09:17:36 np0005604215.localdomain groupadd[164763]: group added to /etc/group: name=clevis, GID=985
Feb 01 09:17:36 np0005604215.localdomain groupadd[164763]: group added to /etc/gshadow: name=clevis
Feb 01 09:17:36 np0005604215.localdomain groupadd[164763]: new group: name=clevis, GID=985
Feb 01 09:17:36 np0005604215.localdomain useradd[164770]: new user: name=clevis, UID=985, GID=985, home=/var/cache/clevis, shell=/usr/sbin/nologin, from=none
Feb 01 09:17:36 np0005604215.localdomain usermod[164780]: add 'clevis' to group 'tss'
Feb 01 09:17:36 np0005604215.localdomain usermod[164780]: add 'clevis' to shadow group 'tss'
Feb 01 09:17:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21995 DF PROTO=TCP SPT=33846 DPT=9102 SEQ=3172487913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657910D0000000001030307) 
Feb 01 09:17:39 np0005604215.localdomain dbus-broker-launch[756]: avc:  op=load_policy lsm=selinux seqno=23 res=1
Feb 01 09:17:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:17:39 np0005604215.localdomain groupadd[164806]: group added to /etc/group: name=dnsmasq, GID=984
Feb 01 09:17:39 np0005604215.localdomain groupadd[164806]: group added to /etc/gshadow: name=dnsmasq
Feb 01 09:17:39 np0005604215.localdomain groupadd[164806]: new group: name=dnsmasq, GID=984
Feb 01 09:17:39 np0005604215.localdomain useradd[164822]: new user: name=dnsmasq, UID=984, GID=984, home=/var/lib/dnsmasq, shell=/usr/sbin/nologin, from=none
Feb 01 09:17:39 np0005604215.localdomain podman[164805]: 2026-02-01 09:17:39.425011125 +0000 UTC m=+0.089280158 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_controller)
Feb 01 09:17:39 np0005604215.localdomain dbus-broker-launch[752]: Noticed file-system modification, trigger reload.
Feb 01 09:17:39 np0005604215.localdomain podman[164805]: 2026-02-01 09:17:39.490064453 +0000 UTC m=+0.154333466 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:17:39 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:17:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38161 DF PROTO=TCP SPT=37594 DPT=9100 SEQ=3871358318 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6579D8D0000000001030307) 
Feb 01 09:17:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:17:41.733 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:17:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:17:41.734 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:17:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:17:41.734 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:17:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:17:42 np0005604215.localdomain podman[164851]: 2026-02-01 09:17:42.858732186 +0000 UTC m=+0.073545300 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 09:17:42 np0005604215.localdomain podman[164851]: 2026-02-01 09:17:42.864334653 +0000 UTC m=+0.079147757 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:17:42 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:17:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36817 DF PROTO=TCP SPT=56806 DPT=9101 SEQ=3784984786 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657AACD0000000001030307) 
Feb 01 09:17:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36883 DF PROTO=TCP SPT=56628 DPT=9882 SEQ=47570602 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657B30D0000000001030307) 
Feb 01 09:17:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=21997 DF PROTO=TCP SPT=33846 DPT=9102 SEQ=3172487913 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657C10D0000000001030307) 
Feb 01 09:17:51 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=38163 DF PROTO=TCP SPT=37594 DPT=9100 SEQ=3871358318 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657CD0D0000000001030307) 
Feb 01 09:17:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36819 DF PROTO=TCP SPT=56806 DPT=9101 SEQ=3784984786 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657DB0D0000000001030307) 
Feb 01 09:18:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42648 DF PROTO=TCP SPT=50878 DPT=9882 SEQ=1958605745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657ECAC0000000001030307) 
Feb 01 09:18:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42649 DF PROTO=TCP SPT=50878 DPT=9882 SEQ=1958605745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657F0CD0000000001030307) 
Feb 01 09:18:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42650 DF PROTO=TCP SPT=50878 DPT=9882 SEQ=1958605745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA657F8CD0000000001030307) 
Feb 01 09:18:06 np0005604215.localdomain sudo[174842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:18:06 np0005604215.localdomain sudo[174842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:18:06 np0005604215.localdomain sudo[174842]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:06 np0005604215.localdomain sudo[174938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:18:06 np0005604215.localdomain sudo[174938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:18:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29942 DF PROTO=TCP SPT=38818 DPT=9102 SEQ=1030711405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658064D0000000001030307) 
Feb 01 09:18:06 np0005604215.localdomain sudo[174938]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:07 np0005604215.localdomain sudo[176072]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:18:07 np0005604215.localdomain sudo[176072]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:18:07 np0005604215.localdomain sudo[176072]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20393 DF PROTO=TCP SPT=44070 DPT=9100 SEQ=2168692773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65812CD0000000001030307) 
Feb 01 09:18:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:18:09 np0005604215.localdomain podman[178219]: 2026-02-01 09:18:09.876277706 +0000 UTC m=+0.083242880 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 01 09:18:09 np0005604215.localdomain podman[178219]: 2026-02-01 09:18:09.980704806 +0000 UTC m=+0.187669960 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:18:09 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:18:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62856 DF PROTO=TCP SPT=58168 DPT=9101 SEQ=3083926355 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6581FCE0000000001030307) 
Feb 01 09:18:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:18:13 np0005604215.localdomain podman[181599]: 2026-02-01 09:18:13.921527618 +0000 UTC m=+0.137141955 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 01 09:18:13 np0005604215.localdomain podman[181599]: 2026-02-01 09:18:13.95419796 +0000 UTC m=+0.169812317 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:18:13 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:18:14 np0005604215.localdomain polkitd[1029]: Reloading rules
Feb 01 09:18:14 np0005604215.localdomain polkitd[1029]: Collecting garbage unconditionally...
Feb 01 09:18:14 np0005604215.localdomain polkitd[1029]: Loading rules from directory /etc/polkit-1/rules.d
Feb 01 09:18:14 np0005604215.localdomain polkitd[1029]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 01 09:18:14 np0005604215.localdomain polkitd[1029]: Finished loading, compiling and executing 5 rules
Feb 01 09:18:14 np0005604215.localdomain polkitd[1029]: Reloading rules
Feb 01 09:18:14 np0005604215.localdomain polkitd[1029]: Collecting garbage unconditionally...
Feb 01 09:18:14 np0005604215.localdomain polkitd[1029]: Loading rules from directory /etc/polkit-1/rules.d
Feb 01 09:18:14 np0005604215.localdomain polkitd[1029]: Loading rules from directory /usr/share/polkit-1/rules.d
Feb 01 09:18:14 np0005604215.localdomain polkitd[1029]: Finished loading, compiling and executing 5 rules
Feb 01 09:18:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42652 DF PROTO=TCP SPT=50878 DPT=9882 SEQ=1958605745 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658290E0000000001030307) 
Feb 01 09:18:17 np0005604215.localdomain groupadd[182173]: group added to /etc/group: name=ceph, GID=167
Feb 01 09:18:17 np0005604215.localdomain groupadd[182173]: group added to /etc/gshadow: name=ceph
Feb 01 09:18:17 np0005604215.localdomain groupadd[182173]: new group: name=ceph, GID=167
Feb 01 09:18:17 np0005604215.localdomain useradd[182179]: new user: name=ceph, UID=167, GID=167, home=/var/lib/ceph, shell=/sbin/nologin, from=none
Feb 01 09:18:19 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29944 DF PROTO=TCP SPT=38818 DPT=9102 SEQ=1030711405 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658370D0000000001030307) 
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: Stopping OpenSSH server daemon...
Feb 01 09:18:20 np0005604215.localdomain sshd[118325]: Received signal 15; terminating.
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: sshd.service: Deactivated successfully.
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: Stopped OpenSSH server daemon.
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: sshd.service: Consumed 1.652s CPU time, read 32.0K from disk, written 0B to disk.
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: Stopped target sshd-keygen.target.
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: Stopping sshd-keygen.target...
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: Reached target sshd-keygen.target.
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: Starting OpenSSH server daemon...
Feb 01 09:18:20 np0005604215.localdomain sshd[182822]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:18:20 np0005604215.localdomain sshd[182822]: Server listening on 0.0.0.0 port 22.
Feb 01 09:18:20 np0005604215.localdomain sshd[182822]: Server listening on :: port 22.
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: Started OpenSSH server daemon.
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20395 DF PROTO=TCP SPT=44070 DPT=9100 SEQ=2168692773 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658430E0000000001030307) 
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:18:22 np0005604215.localdomain systemd-rc-local-generator[183078]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:18:22 np0005604215.localdomain systemd-sysv-generator[183082]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/libvirtd.service:29: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 09:18:22 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 09:18:24 np0005604215.localdomain sudo[163357]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62858 DF PROTO=TCP SPT=58168 DPT=9101 SEQ=3083926355 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6584F0D0000000001030307) 
Feb 01 09:18:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49544 DF PROTO=TCP SPT=45860 DPT=9882 SEQ=2523180970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65861DD0000000001030307) 
Feb 01 09:18:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49545 DF PROTO=TCP SPT=45860 DPT=9882 SEQ=2523180970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65865CD0000000001030307) 
Feb 01 09:18:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49546 DF PROTO=TCP SPT=45860 DPT=9882 SEQ=2523180970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6586DCD0000000001030307) 
Feb 01 09:18:33 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 09:18:33 np0005604215.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 01 09:18:33 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Consumed 13.681s CPU time.
Feb 01 09:18:33 np0005604215.localdomain systemd[1]: run-re1bb5755a20945de89da22f2d015ad2c.service: Deactivated successfully.
Feb 01 09:18:33 np0005604215.localdomain systemd[1]: run-rcaf97bb4a3794f97a11806f5f12b390e.service: Deactivated successfully.
Feb 01 09:18:34 np0005604215.localdomain sudo[191656]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eoykazsnrpwhlaaoyneocczisqcxdbod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937513.8690946-987-16649223292302/AnsiballZ_systemd.py
Feb 01 09:18:34 np0005604215.localdomain sudo[191656]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:34 np0005604215.localdomain python3.9[191658]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 01 09:18:35 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:18:35 np0005604215.localdomain systemd-rc-local-generator[191686]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:18:35 np0005604215.localdomain systemd-sysv-generator[191690]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:18:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:18:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:35 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:35 np0005604215.localdomain sudo[191656]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:35 np0005604215.localdomain sudo[191805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lknnprknlgpyxzycvbhamksxhekluska ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937515.5437467-987-141704657795145/AnsiballZ_systemd.py
Feb 01 09:18:35 np0005604215.localdomain sudo[191805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:36 np0005604215.localdomain python3.9[191807]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 01 09:18:36 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:18:36 np0005604215.localdomain systemd-rc-local-generator[191838]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:18:36 np0005604215.localdomain systemd-sysv-generator[191841]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:18:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:18:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:36 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:36 np0005604215.localdomain sudo[191805]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=898 DF PROTO=TCP SPT=49174 DPT=9102 SEQ=2722967410 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6587B8D0000000001030307) 
Feb 01 09:18:37 np0005604215.localdomain sudo[191955]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qiibujjvqvsascrkhgxeqdhvraroomew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937516.6987782-987-19628503436466/AnsiballZ_systemd.py
Feb 01 09:18:37 np0005604215.localdomain sudo[191955]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:37 np0005604215.localdomain python3.9[191957]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 01 09:18:37 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:18:37 np0005604215.localdomain systemd-sysv-generator[191991]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:18:37 np0005604215.localdomain systemd-rc-local-generator[191986]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:18:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:18:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:37 np0005604215.localdomain sudo[191955]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:38 np0005604215.localdomain sudo[192104]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehzpglxqgmunmiliqgovbwdcfoczuweq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937517.8703268-987-80210285615082/AnsiballZ_systemd.py
Feb 01 09:18:38 np0005604215.localdomain sudo[192104]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:38 np0005604215.localdomain python3.9[192106]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 01 09:18:38 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:18:38 np0005604215.localdomain systemd-rc-local-generator[192131]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:18:38 np0005604215.localdomain systemd-sysv-generator[192135]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:18:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:18:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:38 np0005604215.localdomain sudo[192104]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23653 DF PROTO=TCP SPT=39478 DPT=9100 SEQ=2595214617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658880D0000000001030307) 
Feb 01 09:18:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:18:40 np0005604215.localdomain podman[192162]: 2026-02-01 09:18:40.872901946 +0000 UTC m=+0.086427108 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 01 09:18:40 np0005604215.localdomain podman[192162]: 2026-02-01 09:18:40.918027965 +0000 UTC m=+0.131553147 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:18:40 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:18:41 np0005604215.localdomain sudo[192277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wazcetfayjynhnbapmdxkvwyxkqyaezh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937521.1051667-1073-70064705659359/AnsiballZ_systemd.py
Feb 01 09:18:41 np0005604215.localdomain sudo[192277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:41 np0005604215.localdomain python3.9[192279]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:18:41.734 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:18:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:18:41.736 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:18:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:18:41.736 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:18:41 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:18:41 np0005604215.localdomain systemd-rc-local-generator[192309]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:18:41 np0005604215.localdomain systemd-sysv-generator[192312]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:18:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:18:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:42 np0005604215.localdomain sudo[192277]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:42 np0005604215.localdomain sudo[192425]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrxrymssfbizxtcpyzmbapepmvwvujcw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937522.216477-1073-85305802206928/AnsiballZ_systemd.py
Feb 01 09:18:42 np0005604215.localdomain sudo[192425]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:42 np0005604215.localdomain python3.9[192427]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:42 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:18:42 np0005604215.localdomain systemd-rc-local-generator[192450]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:18:42 np0005604215.localdomain systemd-sysv-generator[192454]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:18:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:18:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60405 DF PROTO=TCP SPT=41704 DPT=9101 SEQ=4163484514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658950E0000000001030307) 
Feb 01 09:18:43 np0005604215.localdomain sudo[192425]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:43 np0005604215.localdomain sudo[192573]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jmplnmteeeazhmtdfudfxiwdqjbnpeuv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937523.3281877-1073-17870603331054/AnsiballZ_systemd.py
Feb 01 09:18:43 np0005604215.localdomain sudo[192573]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:43 np0005604215.localdomain python3.9[192575]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:18:44 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:18:44 np0005604215.localdomain podman[192578]: 2026-02-01 09:18:44.098832572 +0000 UTC m=+0.100079007 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 01 09:18:44 np0005604215.localdomain systemd-rc-local-generator[192623]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:18:44 np0005604215.localdomain podman[192578]: 2026-02-01 09:18:44.134118263 +0000 UTC m=+0.135364698 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:18:44 np0005604215.localdomain systemd-sysv-generator[192628]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:18:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:18:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:44 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:44 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:18:44 np0005604215.localdomain sudo[192573]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:44 np0005604215.localdomain sudo[192741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eqdqmrzttoeeuynqpmqbuiydahbhwwqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937524.5199645-1073-271957896509940/AnsiballZ_systemd.py
Feb 01 09:18:44 np0005604215.localdomain sudo[192741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:45 np0005604215.localdomain python3.9[192743]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=49548 DF PROTO=TCP SPT=45860 DPT=9882 SEQ=2523180970 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6589D0D0000000001030307) 
Feb 01 09:18:45 np0005604215.localdomain sudo[192741]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:45 np0005604215.localdomain sudo[192854]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmcjwmzfoctyroatesvadaezsfuhjcrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937525.3509076-1073-279181643141000/AnsiballZ_systemd.py
Feb 01 09:18:45 np0005604215.localdomain sudo[192854]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:45 np0005604215.localdomain python3.9[192856]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:46 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:18:46 np0005604215.localdomain systemd-sysv-generator[192889]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:18:46 np0005604215.localdomain systemd-rc-local-generator[192883]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:18:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:18:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:46 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:46 np0005604215.localdomain sudo[192854]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:48 np0005604215.localdomain sudo[193003]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zlmtmfwswkjqejqvzkmyauuvsivrugax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937528.4721582-1182-55699992510707/AnsiballZ_systemd.py
Feb 01 09:18:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=900 DF PROTO=TCP SPT=49174 DPT=9102 SEQ=2722967410 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658AB0D0000000001030307) 
Feb 01 09:18:48 np0005604215.localdomain sudo[193003]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:49 np0005604215.localdomain python3.9[193005]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 01 09:18:49 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:18:49 np0005604215.localdomain systemd-rc-local-generator[193029]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:18:49 np0005604215.localdomain systemd-sysv-generator[193033]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:18:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:18:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:18:49 np0005604215.localdomain sudo[193003]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:49 np0005604215.localdomain sudo[193152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmxrnyamyznqfcmnkvfkaceqadpxczlh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937529.6265142-1206-260674092373185/AnsiballZ_systemd.py
Feb 01 09:18:49 np0005604215.localdomain sudo[193152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:50 np0005604215.localdomain python3.9[193154]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:50 np0005604215.localdomain sudo[193152]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:50 np0005604215.localdomain sudo[193265]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azdtfpiassrdtxsvlgetrxfiaqrxpoal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937530.4603443-1206-104526252941358/AnsiballZ_systemd.py
Feb 01 09:18:50 np0005604215.localdomain sudo[193265]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:51 np0005604215.localdomain sshd[193268]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:18:51 np0005604215.localdomain python3.9[193267]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:51 np0005604215.localdomain sudo[193265]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:51 np0005604215.localdomain sudo[193380]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mzxvnpmpkxmjlocdwbjgtosxlyegplcg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937531.2502418-1206-90695982516110/AnsiballZ_systemd.py
Feb 01 09:18:51 np0005604215.localdomain sudo[193380]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:51 np0005604215.localdomain sshd[193268]: Invalid user web from 85.206.171.113 port 34546
Feb 01 09:18:51 np0005604215.localdomain sshd[193268]: Received disconnect from 85.206.171.113 port 34546:11: Bye Bye [preauth]
Feb 01 09:18:51 np0005604215.localdomain sshd[193268]: Disconnected from invalid user web 85.206.171.113 port 34546 [preauth]
Feb 01 09:18:52 np0005604215.localdomain python3.9[193382]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23655 DF PROTO=TCP SPT=39478 DPT=9100 SEQ=2595214617 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658B90D0000000001030307) 
Feb 01 09:18:53 np0005604215.localdomain sudo[193380]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:53 np0005604215.localdomain sudo[193493]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mtanwuidokfsorelyzihnxndycaauysh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937533.2602634-1206-133225499732816/AnsiballZ_systemd.py
Feb 01 09:18:53 np0005604215.localdomain sudo[193493]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:53 np0005604215.localdomain python3.9[193495]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:54 np0005604215.localdomain sudo[193493]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=60407 DF PROTO=TCP SPT=41704 DPT=9101 SEQ=4163484514 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658C50E0000000001030307) 
Feb 01 09:18:55 np0005604215.localdomain sudo[193606]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnrioxecqiiohpenerngxclacppkukiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937535.0950668-1206-116714718406403/AnsiballZ_systemd.py
Feb 01 09:18:55 np0005604215.localdomain sudo[193606]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:55 np0005604215.localdomain python3.9[193608]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:55 np0005604215.localdomain sudo[193606]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:56 np0005604215.localdomain sudo[193719]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjujoqiptsmwucpwcbzepnptnjctykuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937535.936027-1206-24115256508438/AnsiballZ_systemd.py
Feb 01 09:18:56 np0005604215.localdomain sudo[193719]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:56 np0005604215.localdomain python3.9[193721]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:56 np0005604215.localdomain sudo[193719]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:57 np0005604215.localdomain sudo[193832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wwrcvvxlxjtpdtmsnjdrpyvlwporwcya ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937536.7465055-1206-206910340951597/AnsiballZ_systemd.py
Feb 01 09:18:57 np0005604215.localdomain sudo[193832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:57 np0005604215.localdomain python3.9[193834]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:58 np0005604215.localdomain sudo[193832]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:58 np0005604215.localdomain sudo[193945]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wjcucvortkhxmkkcxapzblzlfdoruhut ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937538.6371493-1206-100772697952301/AnsiballZ_systemd.py
Feb 01 09:18:58 np0005604215.localdomain sudo[193945]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:18:59 np0005604215.localdomain python3.9[193947]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:18:59 np0005604215.localdomain sudo[193945]: pam_unix(sudo:session): session closed for user root
Feb 01 09:18:59 np0005604215.localdomain sudo[194058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hypespvvgnreylxpnirixyywijjyuhav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937539.4331346-1206-61542753876779/AnsiballZ_systemd.py
Feb 01 09:18:59 np0005604215.localdomain sudo[194058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12723 DF PROTO=TCP SPT=49046 DPT=9882 SEQ=55718926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658D70C0000000001030307) 
Feb 01 09:19:00 np0005604215.localdomain python3.9[194060]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:19:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12724 DF PROTO=TCP SPT=49046 DPT=9882 SEQ=55718926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658DB0D0000000001030307) 
Feb 01 09:19:01 np0005604215.localdomain sudo[194058]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:01 np0005604215.localdomain sudo[194171]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vbmemwfbwaqmopibueewtvbejmxsqlrl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937541.2730973-1206-167125912186786/AnsiballZ_systemd.py
Feb 01 09:19:01 np0005604215.localdomain sudo[194171]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:01 np0005604215.localdomain python3.9[194173]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:19:02 np0005604215.localdomain sudo[194171]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12725 DF PROTO=TCP SPT=49046 DPT=9882 SEQ=55718926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658E30D0000000001030307) 
Feb 01 09:19:03 np0005604215.localdomain sudo[194284]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jthvbfzpdimgbxuazkwxrxhxltyufack ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937543.0854115-1206-179103075835138/AnsiballZ_systemd.py
Feb 01 09:19:03 np0005604215.localdomain sudo[194284]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:03 np0005604215.localdomain python3.9[194286]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:19:03 np0005604215.localdomain sudo[194284]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:04 np0005604215.localdomain sudo[194397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-odgcgudreroaoxlaqqsxwfbotzdzxxkp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937543.8681555-1206-136437525976725/AnsiballZ_systemd.py
Feb 01 09:19:04 np0005604215.localdomain sudo[194397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:04 np0005604215.localdomain python3.9[194399]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:19:04 np0005604215.localdomain sudo[194397]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:04 np0005604215.localdomain sudo[194510]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjqrckdrafbvkhmxsibavrycnugamuju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937544.6165612-1206-168754147175521/AnsiballZ_systemd.py
Feb 01 09:19:04 np0005604215.localdomain sudo[194510]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:05 np0005604215.localdomain python3.9[194512]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:19:05 np0005604215.localdomain sudo[194510]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:05 np0005604215.localdomain sudo[194623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ygphhkxzdwvhkxhrqzofurteodnorcxg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937545.4300857-1206-118078871371192/AnsiballZ_systemd.py
Feb 01 09:19:05 np0005604215.localdomain sudo[194623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:06 np0005604215.localdomain python3.9[194625]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Feb 01 09:19:06 np0005604215.localdomain sudo[194623]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42923 DF PROTO=TCP SPT=47972 DPT=9102 SEQ=776110545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658F08D0000000001030307) 
Feb 01 09:19:07 np0005604215.localdomain sudo[194736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-drevhiuewzlxmmmwlyegaywwuhexzgoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937546.7174606-1511-131414240145770/AnsiballZ_file.py
Feb 01 09:19:07 np0005604215.localdomain sudo[194736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:07 np0005604215.localdomain python3.9[194738]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:19:07 np0005604215.localdomain sudo[194736]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:07 np0005604215.localdomain sudo[194810]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:19:07 np0005604215.localdomain sudo[194810]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:19:07 np0005604215.localdomain sudo[194810]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:07 np0005604215.localdomain sudo[194851]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:19:07 np0005604215.localdomain sudo[194851]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:19:07 np0005604215.localdomain sudo[194879]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpaotnuxllidpadpeewozulqcvlttdri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937547.3710651-1511-259283689347774/AnsiballZ_file.py
Feb 01 09:19:07 np0005604215.localdomain sudo[194879]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:07 np0005604215.localdomain python3.9[194884]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:19:07 np0005604215.localdomain sudo[194879]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:08 np0005604215.localdomain sudo[194851]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:08 np0005604215.localdomain sudo[195025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-quelmdevsttsydvneefpbqiejfuzxfyl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937547.9803748-1511-80931193410461/AnsiballZ_file.py
Feb 01 09:19:08 np0005604215.localdomain sudo[195025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:08 np0005604215.localdomain python3.9[195027]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:19:08 np0005604215.localdomain sudo[195025]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:08 np0005604215.localdomain sudo[195139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oplsxvhkfrrjigghftnwbzbkqakawuml ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937548.572241-1511-20451688666906/AnsiballZ_file.py
Feb 01 09:19:08 np0005604215.localdomain sudo[195139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:09 np0005604215.localdomain sudo[195134]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:19:09 np0005604215.localdomain sudo[195134]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:19:09 np0005604215.localdomain sudo[195134]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:09 np0005604215.localdomain python3.9[195154]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:19:09 np0005604215.localdomain sudo[195139]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:09 np0005604215.localdomain sudo[195263]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zucebgerpnmnllqcqxxwwtiozlklxqif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937549.3129184-1511-95248688546855/AnsiballZ_file.py
Feb 01 09:19:09 np0005604215.localdomain sudo[195263]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15513 DF PROTO=TCP SPT=48760 DPT=9100 SEQ=3846254848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA658FD0D0000000001030307) 
Feb 01 09:19:09 np0005604215.localdomain python3.9[195265]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:19:09 np0005604215.localdomain sudo[195263]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:10 np0005604215.localdomain sudo[195373]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihhxiltdhscbseqaayjxgcmtuutaitps ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937549.9328113-1511-142275257223644/AnsiballZ_file.py
Feb 01 09:19:10 np0005604215.localdomain sudo[195373]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:10 np0005604215.localdomain python3.9[195375]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:19:10 np0005604215.localdomain sudo[195373]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:11 np0005604215.localdomain python3.9[195483]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:19:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:19:11 np0005604215.localdomain podman[195501]: 2026-02-01 09:19:11.877841396 +0000 UTC m=+0.088745356 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:19:11 np0005604215.localdomain podman[195501]: 2026-02-01 09:19:11.963668255 +0000 UTC m=+0.174572245 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Feb 01 09:19:11 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:19:12 np0005604215.localdomain sudo[195614]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-othskqsyuykxejxuxjdnybcfmmpkferb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937552.003891-1664-193907424561782/AnsiballZ_stat.py
Feb 01 09:19:12 np0005604215.localdomain sudo[195614]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:12 np0005604215.localdomain python3.9[195616]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:12 np0005604215.localdomain sudo[195614]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58712 DF PROTO=TCP SPT=58780 DPT=9101 SEQ=2487244976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6590A4D0000000001030307) 
Feb 01 09:19:13 np0005604215.localdomain sudo[195704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bbruxggsqqwxcswfrtcdynugrbapkkmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937552.003891-1664-193907424561782/AnsiballZ_copy.py
Feb 01 09:19:13 np0005604215.localdomain sudo[195704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:13 np0005604215.localdomain python3.9[195706]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937552.003891-1664-193907424561782/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:13 np0005604215.localdomain sudo[195704]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:13 np0005604215.localdomain sudo[195814]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsvsnvujglqusiorcmbouirnldyfbwpk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937553.5368953-1664-89354729102577/AnsiballZ_stat.py
Feb 01 09:19:13 np0005604215.localdomain sudo[195814]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:14 np0005604215.localdomain python3.9[195816]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:14 np0005604215.localdomain sudo[195814]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:14 np0005604215.localdomain sudo[195904]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lrevlawimjkezmycmvesopkwehylzuqj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937553.5368953-1664-89354729102577/AnsiballZ_copy.py
Feb 01 09:19:14 np0005604215.localdomain sudo[195904]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:19:14 np0005604215.localdomain podman[195907]: 2026-02-01 09:19:14.554893556 +0000 UTC m=+0.083044909 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:19:14 np0005604215.localdomain podman[195907]: 2026-02-01 09:19:14.565164746 +0000 UTC m=+0.093316099 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:19:14 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:19:14 np0005604215.localdomain python3.9[195906]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937553.5368953-1664-89354729102577/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:14 np0005604215.localdomain sudo[195904]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:15 np0005604215.localdomain sudo[196032]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjsasaufyqqnwctqomvrshmgguzsiybq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937554.815992-1664-150770818330539/AnsiballZ_stat.py
Feb 01 09:19:15 np0005604215.localdomain sudo[196032]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:15 np0005604215.localdomain python3.9[196034]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:15 np0005604215.localdomain sudo[196032]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=12727 DF PROTO=TCP SPT=49046 DPT=9882 SEQ=55718926 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659130D0000000001030307) 
Feb 01 09:19:15 np0005604215.localdomain sudo[196122]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vgfxbtqimffhaazplvvrnvtftvfcvrod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937554.815992-1664-150770818330539/AnsiballZ_copy.py
Feb 01 09:19:15 np0005604215.localdomain sudo[196122]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:15 np0005604215.localdomain python3.9[196124]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937554.815992-1664-150770818330539/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:15 np0005604215.localdomain sudo[196122]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:16 np0005604215.localdomain sudo[196232]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-roqkglcdvdmuwyjaguqyrwifcadfgscs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937556.0009978-1664-17648239140229/AnsiballZ_stat.py
Feb 01 09:19:16 np0005604215.localdomain sudo[196232]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:16 np0005604215.localdomain python3.9[196234]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:16 np0005604215.localdomain sudo[196232]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:16 np0005604215.localdomain sudo[196322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yoduprxrvqwziimoaigxycozqnovmynk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937556.0009978-1664-17648239140229/AnsiballZ_copy.py
Feb 01 09:19:16 np0005604215.localdomain sudo[196322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:17 np0005604215.localdomain python3.9[196324]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937556.0009978-1664-17648239140229/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:17 np0005604215.localdomain sudo[196322]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:17 np0005604215.localdomain sudo[196432]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qovdqniiievvbgankzdpavdtfseyxrzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937557.253724-1664-137797485582053/AnsiballZ_stat.py
Feb 01 09:19:17 np0005604215.localdomain sudo[196432]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:17 np0005604215.localdomain python3.9[196434]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:17 np0005604215.localdomain sudo[196432]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:18 np0005604215.localdomain sudo[196522]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlwdtmyyaepqkxogzgjwzvcouplhkvwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937557.253724-1664-137797485582053/AnsiballZ_copy.py
Feb 01 09:19:18 np0005604215.localdomain sudo[196522]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:18 np0005604215.localdomain python3.9[196524]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937557.253724-1664-137797485582053/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=8d9b2057482987a531d808ceb2ac4bc7d43bf17c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:18 np0005604215.localdomain sudo[196522]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:18 np0005604215.localdomain sudo[196632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxgdwballmmjmxzcaqrwqiicoltbvqrw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937558.5369935-1664-139037321463834/AnsiballZ_stat.py
Feb 01 09:19:18 np0005604215.localdomain sudo[196632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42925 DF PROTO=TCP SPT=47972 DPT=9102 SEQ=776110545 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659210D0000000001030307) 
Feb 01 09:19:19 np0005604215.localdomain python3.9[196634]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:19 np0005604215.localdomain sudo[196632]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:19 np0005604215.localdomain sudo[196722]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tpixgvighvnhnqyjkvjecxnilejjfigi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937558.5369935-1664-139037321463834/AnsiballZ_copy.py
Feb 01 09:19:19 np0005604215.localdomain sudo[196722]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:19 np0005604215.localdomain python3.9[196724]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937558.5369935-1664-139037321463834/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:19 np0005604215.localdomain sudo[196722]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:20 np0005604215.localdomain sudo[196832]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdkkleezbddacjrvxekagonguhructwn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937559.7609825-1664-58800556915347/AnsiballZ_stat.py
Feb 01 09:19:20 np0005604215.localdomain sudo[196832]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:20 np0005604215.localdomain python3.9[196834]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:20 np0005604215.localdomain sudo[196832]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:20 np0005604215.localdomain sudo[196920]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abdkpvepvaxibhfpvpbjvmizipxttgxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937559.7609825-1664-58800556915347/AnsiballZ_copy.py
Feb 01 09:19:20 np0005604215.localdomain sudo[196920]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:20 np0005604215.localdomain python3.9[196922]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937559.7609825-1664-58800556915347/.source.conf follow=False _original_basename=auth.conf checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:20 np0005604215.localdomain sudo[196920]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:21 np0005604215.localdomain sudo[197030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcsjxgwebcxgkaatjxwtbkvxnpiwsybu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937560.942703-1664-79136557070601/AnsiballZ_stat.py
Feb 01 09:19:21 np0005604215.localdomain sudo[197030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:21 np0005604215.localdomain python3.9[197032]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:21 np0005604215.localdomain sudo[197030]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:21 np0005604215.localdomain sudo[197120]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmkorrbdoreiqzvupgsmjacanuorznyn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937560.942703-1664-79136557070601/AnsiballZ_copy.py
Feb 01 09:19:21 np0005604215.localdomain sudo[197120]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:22 np0005604215.localdomain python3.9[197122]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1769937560.942703-1664-79136557070601/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:22 np0005604215.localdomain sudo[197120]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=15515 DF PROTO=TCP SPT=48760 DPT=9100 SEQ=3846254848 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6592D0D0000000001030307) 
Feb 01 09:19:24 np0005604215.localdomain sudo[197230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmnlwbvkgmnxnbuslqubttnimzixdplt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937563.8799343-2008-165106759811984/AnsiballZ_file.py
Feb 01 09:19:24 np0005604215.localdomain sudo[197230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:24 np0005604215.localdomain python3.9[197232]: ansible-ansible.builtin.file Invoked with path=/etc/libvirt/passwd.db state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:24 np0005604215.localdomain sudo[197230]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:24 np0005604215.localdomain sudo[197340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtlenfnguyuklgnzvutondjmzfqngzfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937564.635973-2030-255594573305745/AnsiballZ_file.py
Feb 01 09:19:24 np0005604215.localdomain sudo[197340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:25 np0005604215.localdomain python3.9[197342]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:25 np0005604215.localdomain sudo[197340]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:25 np0005604215.localdomain sudo[197450]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-stzuuesobmrtqsliirnfxvajwgbegedz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937565.2924843-2030-221534396234801/AnsiballZ_file.py
Feb 01 09:19:25 np0005604215.localdomain sudo[197450]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58714 DF PROTO=TCP SPT=58780 DPT=9101 SEQ=2487244976 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6593B0D0000000001030307) 
Feb 01 09:19:25 np0005604215.localdomain python3.9[197452]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:25 np0005604215.localdomain sudo[197450]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:26 np0005604215.localdomain sudo[197560]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xaaxrcjxpfsqgkqwyyfkcnbttvhmiytx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937566.1427348-2030-31540582408287/AnsiballZ_file.py
Feb 01 09:19:26 np0005604215.localdomain sudo[197560]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:26 np0005604215.localdomain python3.9[197562]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:26 np0005604215.localdomain sudo[197560]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:26 np0005604215.localdomain sudo[197670]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eorpcvherzdnnsdtxrehxmhbfsfhkbdv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937566.7250586-2030-133257173346866/AnsiballZ_file.py
Feb 01 09:19:26 np0005604215.localdomain sudo[197670]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:27 np0005604215.localdomain python3.9[197672]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:27 np0005604215.localdomain sudo[197670]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:27 np0005604215.localdomain sudo[197780]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhpsoizaxizgdpwsqptmkrcuhrtzdnhz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937567.358828-2030-130513248397197/AnsiballZ_file.py
Feb 01 09:19:27 np0005604215.localdomain sudo[197780]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:27 np0005604215.localdomain python3.9[197782]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:27 np0005604215.localdomain sudo[197780]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:28 np0005604215.localdomain sudo[197890]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqhsddqaooarwjaeqkvkifgoftldmihm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937568.0299945-2030-187152399214876/AnsiballZ_file.py
Feb 01 09:19:28 np0005604215.localdomain sudo[197890]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:28 np0005604215.localdomain python3.9[197892]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:28 np0005604215.localdomain sudo[197890]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:28 np0005604215.localdomain sudo[198000]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kowqrddzpofziyadfktlkqsfckdldzdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937568.6515355-2030-98389837681170/AnsiballZ_file.py
Feb 01 09:19:28 np0005604215.localdomain sudo[198000]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:29 np0005604215.localdomain python3.9[198002]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:29 np0005604215.localdomain sudo[198000]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:29 np0005604215.localdomain sudo[198110]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yulkbwnbnjuqebuypvoembahbcryjtdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937569.2416816-2030-242984356350068/AnsiballZ_file.py
Feb 01 09:19:29 np0005604215.localdomain sudo[198110]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:29 np0005604215.localdomain python3.9[198112]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:29 np0005604215.localdomain sudo[198110]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39928 DF PROTO=TCP SPT=37178 DPT=9882 SEQ=6424101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6594C3D0000000001030307) 
Feb 01 09:19:30 np0005604215.localdomain sudo[198220]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmjvrjekiodfeyeouplcchdntkxmroyk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937569.8063068-2030-44828712338626/AnsiballZ_file.py
Feb 01 09:19:30 np0005604215.localdomain sudo[198220]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:30 np0005604215.localdomain python3.9[198222]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:30 np0005604215.localdomain sudo[198220]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:30 np0005604215.localdomain sudo[198330]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqqpgmatbugooxocfqurujoociupsaxu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937570.382248-2030-253772571106111/AnsiballZ_file.py
Feb 01 09:19:30 np0005604215.localdomain sudo[198330]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:30 np0005604215.localdomain python3.9[198332]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:30 np0005604215.localdomain sudo[198330]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39929 DF PROTO=TCP SPT=37178 DPT=9882 SEQ=6424101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659504D0000000001030307) 
Feb 01 09:19:31 np0005604215.localdomain sudo[198440]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ncaxhogcwcudhjywhxglxhbhcealkwha ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937571.0156517-2030-22676387714722/AnsiballZ_file.py
Feb 01 09:19:31 np0005604215.localdomain sudo[198440]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:31 np0005604215.localdomain python3.9[198442]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:31 np0005604215.localdomain sudo[198440]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:31 np0005604215.localdomain sudo[198550]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fseujmekddkhkpsjzbcitffhviqmmshg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937571.7012455-2030-130529133263414/AnsiballZ_file.py
Feb 01 09:19:31 np0005604215.localdomain sudo[198550]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:32 np0005604215.localdomain python3.9[198552]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:32 np0005604215.localdomain sudo[198550]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:32 np0005604215.localdomain sudo[198660]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zppwgamihzomidclmpvzufrqhiiegxpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937572.2906911-2030-169672247022540/AnsiballZ_file.py
Feb 01 09:19:32 np0005604215.localdomain sudo[198660]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:32 np0005604215.localdomain python3.9[198662]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:32 np0005604215.localdomain sudo[198660]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39930 DF PROTO=TCP SPT=37178 DPT=9882 SEQ=6424101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659584D0000000001030307) 
Feb 01 09:19:33 np0005604215.localdomain sudo[198770]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bqyckhcetxsjxqkwdkpvofqvpbjsiusi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937572.905612-2030-274625699623313/AnsiballZ_file.py
Feb 01 09:19:33 np0005604215.localdomain sudo[198770]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:33 np0005604215.localdomain python3.9[198772]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:33 np0005604215.localdomain sudo[198770]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:34 np0005604215.localdomain sudo[198880]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yuveohkvrillhodqfubivuefokosbylk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937573.9669535-2328-130868539584551/AnsiballZ_stat.py
Feb 01 09:19:34 np0005604215.localdomain sudo[198880]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:34 np0005604215.localdomain python3.9[198882]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:34 np0005604215.localdomain sudo[198880]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:34 np0005604215.localdomain sudo[198968]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pefgipxtfuziklerpbpxgosksmhokrst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937573.9669535-2328-130868539584551/AnsiballZ_copy.py
Feb 01 09:19:34 np0005604215.localdomain sudo[198968]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:34 np0005604215.localdomain python3.9[198970]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937573.9669535-2328-130868539584551/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:35 np0005604215.localdomain sudo[198968]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:35 np0005604215.localdomain sudo[199078]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwlnshrlluhcmibapubtkgyhggtbdhjw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937575.1464984-2328-159459720077710/AnsiballZ_stat.py
Feb 01 09:19:35 np0005604215.localdomain sudo[199078]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:35 np0005604215.localdomain python3.9[199080]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:35 np0005604215.localdomain sudo[199078]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:35 np0005604215.localdomain sudo[199166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cuhpvdkkrrrfjkawvyxdlpnodgvvaoil ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937575.1464984-2328-159459720077710/AnsiballZ_copy.py
Feb 01 09:19:35 np0005604215.localdomain sudo[199166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:36 np0005604215.localdomain python3.9[199168]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937575.1464984-2328-159459720077710/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:36 np0005604215.localdomain sudo[199166]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8408 DF PROTO=TCP SPT=44364 DPT=9102 SEQ=2839907613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65965CD0000000001030307) 
Feb 01 09:19:36 np0005604215.localdomain sudo[199276]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zcshayrlcddznpagqbrtxacysnhcfpzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937576.318512-2328-188162334281914/AnsiballZ_stat.py
Feb 01 09:19:36 np0005604215.localdomain sudo[199276]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:36 np0005604215.localdomain python3.9[199278]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:36 np0005604215.localdomain sudo[199276]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:37 np0005604215.localdomain sudo[199364]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-avwtgmqokmyhgrlffnniobughmnbrelj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937576.318512-2328-188162334281914/AnsiballZ_copy.py
Feb 01 09:19:37 np0005604215.localdomain sudo[199364]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:37 np0005604215.localdomain python3.9[199366]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937576.318512-2328-188162334281914/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:37 np0005604215.localdomain sudo[199364]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:38 np0005604215.localdomain sudo[199474]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elvvqqfmhichwhumqyztmoayqedpxkwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937577.662323-2328-72492885746653/AnsiballZ_stat.py
Feb 01 09:19:38 np0005604215.localdomain sudo[199474]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:38 np0005604215.localdomain python3.9[199476]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:38 np0005604215.localdomain sudo[199474]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:38 np0005604215.localdomain sudo[199562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-unieczbgbwsowxcleibwpljrqbfkbbke ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937577.662323-2328-72492885746653/AnsiballZ_copy.py
Feb 01 09:19:38 np0005604215.localdomain sudo[199562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:38 np0005604215.localdomain python3.9[199564]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937577.662323-2328-72492885746653/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:38 np0005604215.localdomain sudo[199562]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:39 np0005604215.localdomain sudo[199672]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yqyqmskxkappyhivlnxgufrkfmqsvcjt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937578.957737-2328-24903482110414/AnsiballZ_stat.py
Feb 01 09:19:39 np0005604215.localdomain sudo[199672]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:39 np0005604215.localdomain python3.9[199674]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:39 np0005604215.localdomain sudo[199672]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10847 DF PROTO=TCP SPT=46148 DPT=9100 SEQ=1988782609 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659724D0000000001030307) 
Feb 01 09:19:39 np0005604215.localdomain sudo[199760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ovxiwpvjxloruhdkepuxmcevpnmzkbft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937578.957737-2328-24903482110414/AnsiballZ_copy.py
Feb 01 09:19:39 np0005604215.localdomain sudo[199760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:39 np0005604215.localdomain python3.9[199762]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937578.957737-2328-24903482110414/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:39 np0005604215.localdomain sudo[199760]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:40 np0005604215.localdomain sudo[199870]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uvdfdjjohbktxaowbskfnpisquxbhlho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937580.0829058-2328-91860248564206/AnsiballZ_stat.py
Feb 01 09:19:40 np0005604215.localdomain sudo[199870]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:40 np0005604215.localdomain python3.9[199872]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:40 np0005604215.localdomain sudo[199870]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:40 np0005604215.localdomain sudo[199958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhhbfuaffapnoejrtoqgdczjkkejqifh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937580.0829058-2328-91860248564206/AnsiballZ_copy.py
Feb 01 09:19:40 np0005604215.localdomain sudo[199958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:41 np0005604215.localdomain python3.9[199960]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937580.0829058-2328-91860248564206/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:41 np0005604215.localdomain sudo[199958]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:41 np0005604215.localdomain sudo[200068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bmfftxjlfyjmjxptdrkepnvkvxwfrdct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937581.2322366-2328-170818026424784/AnsiballZ_stat.py
Feb 01 09:19:41 np0005604215.localdomain sudo[200068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:41 np0005604215.localdomain python3.9[200070]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:41 np0005604215.localdomain sudo[200068]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:19:41.735 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:19:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:19:41.736 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:19:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:19:41.736 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:19:42 np0005604215.localdomain sudo[200156]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-goqtaaivonuaymyanzeaeqktrdftgqwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937581.2322366-2328-170818026424784/AnsiballZ_copy.py
Feb 01 09:19:42 np0005604215.localdomain sudo[200156]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:19:42 np0005604215.localdomain systemd[1]: tmp-crun.BAI8TF.mount: Deactivated successfully.
Feb 01 09:19:42 np0005604215.localdomain podman[200159]: 2026-02-01 09:19:42.195049207 +0000 UTC m=+0.086426979 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:19:42 np0005604215.localdomain podman[200159]: 2026-02-01 09:19:42.285757735 +0000 UTC m=+0.177135527 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:19:42 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:19:42 np0005604215.localdomain python3.9[200158]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937581.2322366-2328-170818026424784/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:42 np0005604215.localdomain sudo[200156]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:42 np0005604215.localdomain sudo[200290]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qzfbcvwfnrgykpzjxuffkzsnevttcxvk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937582.4921122-2328-188115481102160/AnsiballZ_stat.py
Feb 01 09:19:42 np0005604215.localdomain sudo[200290]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:42 np0005604215.localdomain python3.9[200292]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:42 np0005604215.localdomain sudo[200290]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56094 DF PROTO=TCP SPT=47266 DPT=9101 SEQ=137913282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6597F8E0000000001030307) 
Feb 01 09:19:43 np0005604215.localdomain sudo[200378]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grrrsjhdxykzfakblamogmvfamgvxkwq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937582.4921122-2328-188115481102160/AnsiballZ_copy.py
Feb 01 09:19:43 np0005604215.localdomain sudo[200378]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:43 np0005604215.localdomain python3.9[200380]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937582.4921122-2328-188115481102160/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:43 np0005604215.localdomain sudo[200378]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:43 np0005604215.localdomain sudo[200488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfjspwquuffejfphuthatvyiialqgphm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937583.6608582-2328-39857604497030/AnsiballZ_stat.py
Feb 01 09:19:43 np0005604215.localdomain sudo[200488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:44 np0005604215.localdomain python3.9[200490]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:44 np0005604215.localdomain sudo[200488]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:44 np0005604215.localdomain sudo[200576]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htnbzqoorhmkhnddgiqvucgztbkbqjmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937583.6608582-2328-39857604497030/AnsiballZ_copy.py
Feb 01 09:19:44 np0005604215.localdomain sudo[200576]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:44 np0005604215.localdomain python3.9[200578]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937583.6608582-2328-39857604497030/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:44 np0005604215.localdomain sudo[200576]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:19:44 np0005604215.localdomain systemd[1]: tmp-crun.7UlvA5.mount: Deactivated successfully.
Feb 01 09:19:44 np0005604215.localdomain podman[200597]: 2026-02-01 09:19:44.871053291 +0000 UTC m=+0.082345942 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS)
Feb 01 09:19:44 np0005604215.localdomain podman[200597]: 2026-02-01 09:19:44.900390199 +0000 UTC m=+0.111682820 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:19:44 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:19:45 np0005604215.localdomain sudo[200704]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nseypefjdbvnekxpyoxtviwmkagdyllg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937584.811555-2328-76833092473147/AnsiballZ_stat.py
Feb 01 09:19:45 np0005604215.localdomain sudo[200704]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:45 np0005604215.localdomain python3.9[200706]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:45 np0005604215.localdomain sudo[200704]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39932 DF PROTO=TCP SPT=37178 DPT=9882 SEQ=6424101 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659890D0000000001030307) 
Feb 01 09:19:45 np0005604215.localdomain sudo[200792]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmatzitmxjdncthajecioglimzrzmulw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937584.811555-2328-76833092473147/AnsiballZ_copy.py
Feb 01 09:19:45 np0005604215.localdomain sudo[200792]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:45 np0005604215.localdomain python3.9[200794]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937584.811555-2328-76833092473147/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:45 np0005604215.localdomain sudo[200792]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:46 np0005604215.localdomain sudo[200902]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztlqzgsipzykwixwjwarepkminpyxqxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937586.0166574-2328-28537287082178/AnsiballZ_stat.py
Feb 01 09:19:46 np0005604215.localdomain sudo[200902]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:46 np0005604215.localdomain python3.9[200904]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:46 np0005604215.localdomain sudo[200902]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:46 np0005604215.localdomain sudo[200990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpzwrsovmxzmvfoaszndbakgqpvyjwjn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937586.0166574-2328-28537287082178/AnsiballZ_copy.py
Feb 01 09:19:46 np0005604215.localdomain sudo[200990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:47 np0005604215.localdomain python3.9[200992]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937586.0166574-2328-28537287082178/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:47 np0005604215.localdomain sudo[200990]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:47 np0005604215.localdomain sudo[201100]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-trbruhtlrhunkezlhihrgaferxfacuzh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937587.2215085-2328-218006800649132/AnsiballZ_stat.py
Feb 01 09:19:47 np0005604215.localdomain sudo[201100]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:47 np0005604215.localdomain python3.9[201102]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:47 np0005604215.localdomain sudo[201100]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:48 np0005604215.localdomain sudo[201188]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxdrvyytestgrecvvtwbfycnspoclfws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937587.2215085-2328-218006800649132/AnsiballZ_copy.py
Feb 01 09:19:48 np0005604215.localdomain sudo[201188]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:48 np0005604215.localdomain python3.9[201190]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937587.2215085-2328-218006800649132/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:48 np0005604215.localdomain sudo[201188]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=8410 DF PROTO=TCP SPT=44364 DPT=9102 SEQ=2839907613 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659950D0000000001030307) 
Feb 01 09:19:49 np0005604215.localdomain sudo[201298]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ducykbsbwvdolfjnhjadbiczunxbmdfw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937588.6736314-2328-128116854492691/AnsiballZ_stat.py
Feb 01 09:19:49 np0005604215.localdomain sudo[201298]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:49 np0005604215.localdomain python3.9[201300]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:49 np0005604215.localdomain sudo[201298]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:49 np0005604215.localdomain sudo[201386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kmmzoduyxdotwetbzgbjllifdmijxblz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937588.6736314-2328-128116854492691/AnsiballZ_copy.py
Feb 01 09:19:49 np0005604215.localdomain sudo[201386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:49 np0005604215.localdomain python3.9[201388]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937588.6736314-2328-128116854492691/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:49 np0005604215.localdomain sudo[201386]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:50 np0005604215.localdomain sudo[201496]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-upzryspsdnqsptrxvcoggsnyuhdcqfbk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937589.923061-2328-2423269523355/AnsiballZ_stat.py
Feb 01 09:19:50 np0005604215.localdomain sudo[201496]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:50 np0005604215.localdomain python3.9[201498]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:19:50 np0005604215.localdomain sudo[201496]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:50 np0005604215.localdomain sudo[201584]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulybrcolkbmklrdmdhzrxhkcmnscwcpc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937589.923061-2328-2423269523355/AnsiballZ_copy.py
Feb 01 09:19:50 np0005604215.localdomain sudo[201584]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:51 np0005604215.localdomain python3.9[201586]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937589.923061-2328-2423269523355/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:19:51 np0005604215.localdomain sudo[201584]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10849 DF PROTO=TCP SPT=46148 DPT=9100 SEQ=1988782609 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659A30D0000000001030307) 
Feb 01 09:19:52 np0005604215.localdomain python3.9[201694]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail
                                                            ls -lRZ /run/libvirt | grep -E ':container_\S+_t'
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:19:53 np0005604215.localdomain sudo[201805]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vmhvdguurzcoxuzcwmqxuelpnlnltcqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937592.678934-2945-222162325212721/AnsiballZ_seboolean.py
Feb 01 09:19:53 np0005604215.localdomain sudo[201805]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:53 np0005604215.localdomain python3.9[201807]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Feb 01 09:19:53 np0005604215.localdomain sudo[201805]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:54 np0005604215.localdomain sudo[201915]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ehilyjmkttttsmrrkizjfrprjyulryen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937594.5245402-2975-222602956032450/AnsiballZ_systemd.py
Feb 01 09:19:54 np0005604215.localdomain sudo[201915]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:55 np0005604215.localdomain python3.9[201917]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:19:55 np0005604215.localdomain systemd-rc-local-generator[201941]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:19:55 np0005604215.localdomain systemd-sysv-generator[201944]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:19:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56096 DF PROTO=TCP SPT=47266 DPT=9101 SEQ=137913282 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659AF0D0000000001030307) 
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: Starting libvirt logging daemon socket...
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: Listening on libvirt logging daemon socket.
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: Starting libvirt logging daemon admin socket...
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: Listening on libvirt logging daemon admin socket.
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: Starting libvirt logging daemon...
Feb 01 09:19:55 np0005604215.localdomain systemd[1]: Started libvirt logging daemon.
Feb 01 09:19:55 np0005604215.localdomain sudo[201915]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:56 np0005604215.localdomain sudo[202066]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sntlsvsztbrhicifxsxnggejiffroulm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937596.3363879-2975-9510621264756/AnsiballZ_systemd.py
Feb 01 09:19:56 np0005604215.localdomain sudo[202066]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:56 np0005604215.localdomain python3.9[202068]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:19:56 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:19:57 np0005604215.localdomain systemd-sysv-generator[202098]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:19:57 np0005604215.localdomain systemd-rc-local-generator[202093]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: Starting libvirt nodedev daemon socket...
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: Listening on libvirt nodedev daemon socket.
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: Starting libvirt nodedev daemon admin socket...
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: Starting libvirt nodedev daemon read-only socket...
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: Listening on libvirt nodedev daemon admin socket.
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: Started libvirt nodedev daemon.
Feb 01 09:19:57 np0005604215.localdomain sudo[202066]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 01 09:19:57 np0005604215.localdomain sudo[202244]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxkypztppyjejviwtnpmafektseyfffx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937597.4825473-2975-66619930513238/AnsiballZ_systemd.py
Feb 01 09:19:57 np0005604215.localdomain sudo[202244]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Feb 01 09:19:57 np0005604215.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Feb 01 09:19:58 np0005604215.localdomain python3.9[202247]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:19:58 np0005604215.localdomain systemd-rc-local-generator[202278]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:19:58 np0005604215.localdomain systemd-sysv-generator[202283]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: Starting libvirt proxy daemon socket...
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: Listening on libvirt proxy daemon socket.
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: Starting libvirt proxy daemon admin socket...
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: Starting libvirt proxy daemon read-only socket...
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: Listening on libvirt proxy daemon admin socket.
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: Listening on libvirt proxy daemon read-only socket.
Feb 01 09:19:58 np0005604215.localdomain systemd[1]: Started libvirt proxy daemon.
Feb 01 09:19:58 np0005604215.localdomain sudo[202244]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:58 np0005604215.localdomain setroubleshoot[202105]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l bc542b42-7abb-4b5c-9fa2-16a0c9696397
Feb 01 09:19:58 np0005604215.localdomain setroubleshoot[202105]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
Feb 01 09:19:58 np0005604215.localdomain setroubleshoot[202105]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l bc542b42-7abb-4b5c-9fa2-16a0c9696397
Feb 01 09:19:58 np0005604215.localdomain setroubleshoot[202105]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.
                                                                 
                                                                 *****  Plugin dac_override (91.4 confidence) suggests   **********************
                                                                 
                                                                 If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system
                                                                 Then turn on full auditing to get path information about the offending file and generate the error again.
                                                                 Do
                                                                 
                                                                 Turn on full auditing
                                                                 # auditctl -w /etc/shadow -p w
                                                                 Try to recreate AVC. Then execute
                                                                 # ausearch -m avc -ts recent
                                                                 If you see PATH record check ownership/permissions on file, and fix it,
                                                                 otherwise report as a bugzilla.
                                                                 
                                                                 *****  Plugin catchall (9.59 confidence) suggests   **************************
                                                                 
                                                                 If you believe that virtlogd should have the dac_read_search capability by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd
                                                                 # semodule -X 300 -i my-virtlogd.pp
                                                                 
Feb 01 09:19:58 np0005604215.localdomain sudo[202423]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uwhglcbizgbkatjevrmumpevlgcivovd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937598.573276-2975-108169486599805/AnsiballZ_systemd.py
Feb 01 09:19:58 np0005604215.localdomain sudo[202423]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:19:59 np0005604215.localdomain python3.9[202425]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:19:59 np0005604215.localdomain systemd-sysv-generator[202455]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:19:59 np0005604215.localdomain systemd-rc-local-generator[202448]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Listening on libvirt locking daemon socket.
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Starting libvirt QEMU daemon socket...
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Starting Virtual Machine and Container Registration Service...
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Listening on libvirt QEMU daemon socket.
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Starting libvirt QEMU daemon admin socket...
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Starting libvirt QEMU daemon read-only socket...
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Listening on libvirt QEMU daemon admin socket.
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Started Virtual Machine and Container Registration Service.
Feb 01 09:19:59 np0005604215.localdomain systemd[1]: Started libvirt QEMU daemon.
Feb 01 09:19:59 np0005604215.localdomain sudo[202423]: pam_unix(sudo:session): session closed for user root
Feb 01 09:19:59 np0005604215.localdomain sudo[202597]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dllbcuacwqaojuihkcuhayreejszpulh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937599.6800613-2975-1758667176810/AnsiballZ_systemd.py
Feb 01 09:19:59 np0005604215.localdomain sudo[202597]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2742 DF PROTO=TCP SPT=51952 DPT=9882 SEQ=3964749355 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659C16C0000000001030307) 
Feb 01 09:20:00 np0005604215.localdomain python3.9[202599]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:20:00 np0005604215.localdomain systemd-rc-local-generator[202624]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:20:00 np0005604215.localdomain systemd-sysv-generator[202629]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: Starting libvirt secret daemon socket...
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: Listening on libvirt secret daemon socket.
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: Starting libvirt secret daemon admin socket...
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: Starting libvirt secret daemon read-only socket...
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: Listening on libvirt secret daemon admin socket.
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: Listening on libvirt secret daemon read-only socket.
Feb 01 09:20:00 np0005604215.localdomain systemd[1]: Started libvirt secret daemon.
Feb 01 09:20:00 np0005604215.localdomain sudo[202597]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2743 DF PROTO=TCP SPT=51952 DPT=9882 SEQ=3964749355 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659C58D0000000001030307) 
Feb 01 09:20:01 np0005604215.localdomain sudo[202768]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puohuqaaxgjrkhhzriklckkvthkufgmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937601.0313735-3086-52323616231257/AnsiballZ_file.py
Feb 01 09:20:01 np0005604215.localdomain sudo[202768]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:01 np0005604215.localdomain python3.9[202770]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:01 np0005604215.localdomain sudo[202768]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:02 np0005604215.localdomain sudo[202878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-euylzuhoacdanhkypjbtkcdvdptvockv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937601.7566912-3111-58172828045516/AnsiballZ_find.py
Feb 01 09:20:02 np0005604215.localdomain sudo[202878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:02 np0005604215.localdomain python3.9[202880]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 01 09:20:02 np0005604215.localdomain sudo[202878]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:02 np0005604215.localdomain sudo[202988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkadsgjazdzucrjfkbdkntjxabtpakpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937602.473686-3135-136009700192023/AnsiballZ_command.py
Feb 01 09:20:02 np0005604215.localdomain sudo[202988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:03 np0005604215.localdomain python3.9[202990]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail;
                                                            echo ceph
                                                            awk -F '=' '/fsid/ {print $2}' /var/lib/openstack/config/ceph/ceph.conf | xargs
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:20:03 np0005604215.localdomain sudo[202988]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2744 DF PROTO=TCP SPT=51952 DPT=9882 SEQ=3964749355 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659CD8D0000000001030307) 
Feb 01 09:20:03 np0005604215.localdomain python3.9[203102]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.keyring'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 01 09:20:04 np0005604215.localdomain python3.9[203210]: ansible-ansible.legacy.stat Invoked with path=/tmp/secret.xml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:05 np0005604215.localdomain python3.9[203296]: ansible-ansible.legacy.copy Invoked with dest=/tmp/secret.xml mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937604.3187819-3192-95176039703118/.source.xml follow=False _original_basename=secret.xml.j2 checksum=8e79ccae86c93336b3974fdc11794b13702e9d6a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:06 np0005604215.localdomain sudo[203404]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxursbwzlweeqcttnwyzbftzsyvqomxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937605.6490304-3237-261496456482504/AnsiballZ_command.py
Feb 01 09:20:06 np0005604215.localdomain sudo[203404]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:06 np0005604215.localdomain python3.9[203406]: ansible-ansible.legacy.command Invoked with _raw_params=virsh secret-undefine 33fac0b9-80c7-560f-918a-c92d3021ca1e
                                                            virsh secret-define --file /tmp/secret.xml
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:20:06 np0005604215.localdomain polkitd[1029]: Registered Authentication Agent for unix-process:203408:969039 (system bus name :1.2835 [pkttyagent --process 203408 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Feb 01 09:20:06 np0005604215.localdomain polkitd[1029]: Unregistered Authentication Agent for unix-process:203408:969039 (system bus name :1.2835, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Feb 01 09:20:06 np0005604215.localdomain polkitd[1029]: Registered Authentication Agent for unix-process:203407:969038 (system bus name :1.2836 [pkttyagent --process 203407 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Feb 01 09:20:06 np0005604215.localdomain polkitd[1029]: Unregistered Authentication Agent for unix-process:203407:969038 (system bus name :1.2836, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Feb 01 09:20:06 np0005604215.localdomain sudo[203404]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5758 DF PROTO=TCP SPT=33652 DPT=9102 SEQ=4144422978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659DB0D0000000001030307) 
Feb 01 09:20:06 np0005604215.localdomain python3.9[203526]: ansible-ansible.builtin.file Invoked with path=/tmp/secret.xml state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:07 np0005604215.localdomain sudo[203634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ywfadqjivzcqzjgvckiiyawfwxvtnxnx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937607.2334182-3284-228926732427777/AnsiballZ_command.py
Feb 01 09:20:07 np0005604215.localdomain sudo[203634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:08 np0005604215.localdomain sudo[203634]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:08 np0005604215.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Feb 01 09:20:08 np0005604215.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 01 09:20:09 np0005604215.localdomain sudo[203693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:20:09 np0005604215.localdomain sudo[203693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:20:09 np0005604215.localdomain sudo[203693]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:09 np0005604215.localdomain sudo[203727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:20:09 np0005604215.localdomain sudo[203727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:20:09 np0005604215.localdomain sudo[203781]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdbnjjignalakcguxiwfpxvqwlegpdow ; FSID=33fac0b9-80c7-560f-918a-c92d3021ca1e KEY=AQCqA39pAAAAABAAY/1Cx38ClRwclP8OidwwPQ== /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937608.9831853-3308-177199990040423/AnsiballZ_command.py
Feb 01 09:20:09 np0005604215.localdomain sudo[203781]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:09 np0005604215.localdomain polkitd[1029]: Registered Authentication Agent for unix-process:203784:969362 (system bus name :1.2841 [pkttyagent --process 203784 --notify-fd 4 --fallback], object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8)
Feb 01 09:20:09 np0005604215.localdomain polkitd[1029]: Unregistered Authentication Agent for unix-process:203784:969362 (system bus name :1.2841, object path /org/freedesktop/PolicyKit1/AuthenticationAgent, locale C.UTF-8) (disconnected from bus)
Feb 01 09:20:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42509 DF PROTO=TCP SPT=42254 DPT=9100 SEQ=3837395286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659E78D0000000001030307) 
Feb 01 09:20:09 np0005604215.localdomain sudo[203727]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:10 np0005604215.localdomain sudo[203781]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:11 np0005604215.localdomain sudo[203911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:20:11 np0005604215.localdomain sudo[203911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:20:11 np0005604215.localdomain sudo[203911]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:11 np0005604215.localdomain sudo[203947]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-diepwdrworebqvuhqwycwqdqxaflnorq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937610.7580862-3333-127858232448282/AnsiballZ_copy.py
Feb 01 09:20:11 np0005604215.localdomain sudo[203947]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:11 np0005604215.localdomain python3.9[203950]: ansible-ansible.legacy.copy Invoked with dest=/etc/ceph/ceph.conf group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/config/ceph/ceph.conf backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:11 np0005604215.localdomain sudo[203947]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:11 np0005604215.localdomain sudo[204058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bykxkcwpqvrkqpihykxkrzynqzhntxav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937611.5022473-3357-166113692975764/AnsiballZ_stat.py
Feb 01 09:20:11 np0005604215.localdomain sudo[204058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:11 np0005604215.localdomain python3.9[204060]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:11 np0005604215.localdomain sudo[204058]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:12 np0005604215.localdomain sudo[204146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mddrcbkkscgsmhbnhqmavksntmegqwwz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937611.5022473-3357-166113692975764/AnsiballZ_copy.py
Feb 01 09:20:12 np0005604215.localdomain sudo[204146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:20:12 np0005604215.localdomain podman[204149]: 2026-02-01 09:20:12.424962637 +0000 UTC m=+0.076776096 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:20:12 np0005604215.localdomain podman[204149]: 2026-02-01 09:20:12.530786711 +0000 UTC m=+0.182600200 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_managed=true)
Feb 01 09:20:12 np0005604215.localdomain python3.9[204148]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937611.5022473-3357-166113692975764/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=dc5ee7162311c27a6084cbee4052b901d56cb1ba backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:12 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:20:12 np0005604215.localdomain sudo[204146]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23912 DF PROTO=TCP SPT=35626 DPT=9101 SEQ=1688313427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659F48D0000000001030307) 
Feb 01 09:20:13 np0005604215.localdomain sudo[204282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tvutipinlllbfbplvhcwjyhydocbksoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937612.994769-3404-164167030070486/AnsiballZ_file.py
Feb 01 09:20:13 np0005604215.localdomain sudo[204282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:13 np0005604215.localdomain python3.9[204284]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:13 np0005604215.localdomain sudo[204282]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:13 np0005604215.localdomain sshd[204340]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:20:14 np0005604215.localdomain sudo[204394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lsqmhiciqxeatbhiovwpnyqhigdvaaqz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937613.6912537-3429-218143983478707/AnsiballZ_stat.py
Feb 01 09:20:14 np0005604215.localdomain sudo[204394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:14 np0005604215.localdomain python3.9[204396]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:14 np0005604215.localdomain sudo[204394]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:14 np0005604215.localdomain sshd[204340]: Invalid user tecnico from 85.206.171.113 port 33386
Feb 01 09:20:14 np0005604215.localdomain sudo[204451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bktvyjtkrvqwltqgrqpnkcqgfccczldz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937613.6912537-3429-218143983478707/AnsiballZ_file.py
Feb 01 09:20:14 np0005604215.localdomain sudo[204451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:14 np0005604215.localdomain sshd[204340]: Received disconnect from 85.206.171.113 port 33386:11: Bye Bye [preauth]
Feb 01 09:20:14 np0005604215.localdomain sshd[204340]: Disconnected from invalid user tecnico 85.206.171.113 port 33386 [preauth]
Feb 01 09:20:14 np0005604215.localdomain python3.9[204453]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:14 np0005604215.localdomain sudo[204451]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:15 np0005604215.localdomain sudo[204561]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-psxttrtofnhscnjmhludhpzxwltlhlrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937614.9117708-3464-138752857059010/AnsiballZ_stat.py
Feb 01 09:20:15 np0005604215.localdomain sudo[204561]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:20:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2746 DF PROTO=TCP SPT=51952 DPT=9882 SEQ=3964749355 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA659FD0D0000000001030307) 
Feb 01 09:20:15 np0005604215.localdomain systemd[1]: tmp-crun.crOu9p.mount: Deactivated successfully.
Feb 01 09:20:15 np0005604215.localdomain podman[204564]: 2026-02-01 09:20:15.300483489 +0000 UTC m=+0.093121028 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true)
Feb 01 09:20:15 np0005604215.localdomain podman[204564]: 2026-02-01 09:20:15.332733558 +0000 UTC m=+0.125371127 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 01 09:20:15 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:20:15 np0005604215.localdomain python3.9[204563]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:15 np0005604215.localdomain sudo[204561]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:15 np0005604215.localdomain sudo[204634]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zfhjuaaxevjdvihptenwdolbqoztzdfa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937614.9117708-3464-138752857059010/AnsiballZ_file.py
Feb 01 09:20:15 np0005604215.localdomain sudo[204634]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:15 np0005604215.localdomain python3.9[204636]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.2w8x5h1a recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:15 np0005604215.localdomain sudo[204634]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:16 np0005604215.localdomain sudo[204744]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-adoooipxwfpzulckfisirldhylfxpzhm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937616.133062-3501-127717319734320/AnsiballZ_stat.py
Feb 01 09:20:16 np0005604215.localdomain sudo[204744]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:16 np0005604215.localdomain python3.9[204746]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:16 np0005604215.localdomain sudo[204744]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:16 np0005604215.localdomain sudo[204801]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfrguzuxruniemxpqrzaquljjivrudep ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937616.133062-3501-127717319734320/AnsiballZ_file.py
Feb 01 09:20:16 np0005604215.localdomain sudo[204801]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:17 np0005604215.localdomain python3.9[204803]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:17 np0005604215.localdomain sudo[204801]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:17 np0005604215.localdomain sudo[204911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-izrvbnzkfesyribmverxgbmqthrlqhbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937617.4572515-3540-262020796073086/AnsiballZ_command.py
Feb 01 09:20:17 np0005604215.localdomain sudo[204911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:17 np0005604215.localdomain python3.9[204913]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:20:18 np0005604215.localdomain sudo[204911]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:18 np0005604215.localdomain sudo[205022]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-neqrtxfznxntqhqfjhpyqxrktzhguzab ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937618.2594848-3564-208992288562586/AnsiballZ_edpm_nftables_from_files.py
Feb 01 09:20:18 np0005604215.localdomain sudo[205022]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5760 DF PROTO=TCP SPT=33652 DPT=9102 SEQ=4144422978 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A0B0E0000000001030307) 
Feb 01 09:20:18 np0005604215.localdomain python3[205024]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 01 09:20:18 np0005604215.localdomain sudo[205022]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:19 np0005604215.localdomain sudo[205132]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tkqjvjaemkhfcllwrrdhabftbcsxfjds ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937619.1912372-3588-85314157157605/AnsiballZ_stat.py
Feb 01 09:20:19 np0005604215.localdomain sudo[205132]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:19 np0005604215.localdomain python3.9[205134]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:19 np0005604215.localdomain sudo[205132]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:20 np0005604215.localdomain sudo[205189]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gmegjutnloqssohzoonzuwetptbikbod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937619.1912372-3588-85314157157605/AnsiballZ_file.py
Feb 01 09:20:20 np0005604215.localdomain sudo[205189]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:20 np0005604215.localdomain python3.9[205191]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:20 np0005604215.localdomain sudo[205189]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:20 np0005604215.localdomain sudo[205299]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dkubyqwcwqkripmxvmepfhypqphxulsb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937620.487486-3623-221950494658288/AnsiballZ_stat.py
Feb 01 09:20:20 np0005604215.localdomain sudo[205299]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:21 np0005604215.localdomain python3.9[205301]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:21 np0005604215.localdomain sudo[205299]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:21 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42511 DF PROTO=TCP SPT=42254 DPT=9100 SEQ=3837395286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A170E0000000001030307) 
Feb 01 09:20:22 np0005604215.localdomain sudo[205389]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dcqfjsxxacewsmziztcjtemdascoyaom ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937620.487486-3623-221950494658288/AnsiballZ_copy.py
Feb 01 09:20:22 np0005604215.localdomain sudo[205389]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:22 np0005604215.localdomain python3.9[205391]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937620.487486-3623-221950494658288/.source.nft follow=False _original_basename=jump-chain.j2 checksum=3ce353c89bce3b135a0ed688d4e338b2efb15185 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:22 np0005604215.localdomain sudo[205389]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:23 np0005604215.localdomain sudo[205499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fyqbdwbfnjjarowqbywzqffsumjjzzis ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937622.8561513-3669-255789890014040/AnsiballZ_stat.py
Feb 01 09:20:23 np0005604215.localdomain sudo[205499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:23 np0005604215.localdomain python3.9[205501]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:23 np0005604215.localdomain sudo[205499]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:23 np0005604215.localdomain sudo[205556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqrqtevscwxcoilaxdpptolzmzbyzlvv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937622.8561513-3669-255789890014040/AnsiballZ_file.py
Feb 01 09:20:23 np0005604215.localdomain sudo[205556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:23 np0005604215.localdomain python3.9[205558]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:23 np0005604215.localdomain sudo[205556]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:25 np0005604215.localdomain sudo[205666]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjmjjyrjymjgaefcgiyvrihlhxsuldxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937624.7446265-3705-115035283692235/AnsiballZ_stat.py
Feb 01 09:20:25 np0005604215.localdomain sudo[205666]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:25 np0005604215.localdomain python3.9[205668]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:25 np0005604215.localdomain sudo[205666]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=23914 DF PROTO=TCP SPT=35626 DPT=9101 SEQ=1688313427 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A250D0000000001030307) 
Feb 01 09:20:25 np0005604215.localdomain sudo[205723]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mqwqgpnlvyzkgnghoqvvdscxbnqcefpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937624.7446265-3705-115035283692235/AnsiballZ_file.py
Feb 01 09:20:25 np0005604215.localdomain sudo[205723]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:25 np0005604215.localdomain python3.9[205725]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:25 np0005604215.localdomain sudo[205723]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:26 np0005604215.localdomain sudo[205833]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-reatfavvyxyxynnsrqojckuybvndjwkw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937626.0404294-3741-66577289254133/AnsiballZ_stat.py
Feb 01 09:20:26 np0005604215.localdomain sudo[205833]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:26 np0005604215.localdomain python3.9[205835]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:26 np0005604215.localdomain sudo[205833]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:26 np0005604215.localdomain sudo[205923]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhbccwszrnbdjfsiszegiqstvujeancy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937626.0404294-3741-66577289254133/AnsiballZ_copy.py
Feb 01 09:20:27 np0005604215.localdomain sudo[205923]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:27 np0005604215.localdomain python3.9[205925]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769937626.0404294-3741-66577289254133/.source.nft follow=False _original_basename=ruleset.j2 checksum=e2e2635f27347d386f310e86d2b40c40289835bb backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:27 np0005604215.localdomain sudo[205923]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:27 np0005604215.localdomain sudo[206033]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jdfxwcvjkeumzlaiajvfkuklpyqrpfwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937627.405616-3785-254748181046543/AnsiballZ_file.py
Feb 01 09:20:27 np0005604215.localdomain sudo[206033]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:27 np0005604215.localdomain python3.9[206035]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:27 np0005604215.localdomain sudo[206033]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:28 np0005604215.localdomain sudo[206143]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dozmnotgtrqrhzhhzbgoozgfuzezdheb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937628.1300495-3810-47823463214087/AnsiballZ_command.py
Feb 01 09:20:28 np0005604215.localdomain sudo[206143]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:28 np0005604215.localdomain python3.9[206145]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:20:28 np0005604215.localdomain sudo[206143]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:29 np0005604215.localdomain sudo[206256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hkujcolzsataxynsvgmuzimhyndakfal ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937628.8562937-3834-112133165092292/AnsiballZ_blockinfile.py
Feb 01 09:20:29 np0005604215.localdomain sudo[206256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:29 np0005604215.localdomain python3.9[206258]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:29 np0005604215.localdomain sudo[206256]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61744 DF PROTO=TCP SPT=37710 DPT=9882 SEQ=1595016040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A369F0000000001030307) 
Feb 01 09:20:30 np0005604215.localdomain sudo[206366]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oktvchhtjbttghxokdghfrozcnyzvjca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937629.934762-3861-62197380227170/AnsiballZ_command.py
Feb 01 09:20:30 np0005604215.localdomain sudo[206366]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:30 np0005604215.localdomain python3.9[206368]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:20:30 np0005604215.localdomain sudo[206366]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61745 DF PROTO=TCP SPT=37710 DPT=9882 SEQ=1595016040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A3A8D0000000001030307) 
Feb 01 09:20:31 np0005604215.localdomain sudo[206477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-grdrdbxobimwqzgdgiljsrcmuipeeuny ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937630.8296394-3885-146512521145295/AnsiballZ_stat.py
Feb 01 09:20:31 np0005604215.localdomain sudo[206477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:31 np0005604215.localdomain python3.9[206479]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:20:31 np0005604215.localdomain sudo[206477]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:31 np0005604215.localdomain sudo[206589]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vpbgmjqvtzhdmpudewwcdnfufqatpfxn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937631.5545223-3909-256542360697614/AnsiballZ_command.py
Feb 01 09:20:31 np0005604215.localdomain sudo[206589]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:32 np0005604215.localdomain python3.9[206591]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:20:32 np0005604215.localdomain sudo[206589]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:32 np0005604215.localdomain sudo[206703]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qstauddeibpurnxailmkhynlswxtxsbt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937632.3005176-3933-80689123804232/AnsiballZ_file.py
Feb 01 09:20:32 np0005604215.localdomain sudo[206703]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:32 np0005604215.localdomain python3.9[206705]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:32 np0005604215.localdomain sudo[206703]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61746 DF PROTO=TCP SPT=37710 DPT=9882 SEQ=1595016040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A428D0000000001030307) 
Feb 01 09:20:33 np0005604215.localdomain sudo[206813]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ngserenxyvjduqxczhkjpfxizlupngnn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937633.035836-3957-192710607138931/AnsiballZ_stat.py
Feb 01 09:20:33 np0005604215.localdomain sudo[206813]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:33 np0005604215.localdomain python3.9[206815]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:33 np0005604215.localdomain sudo[206813]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:33 np0005604215.localdomain sudo[206901]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obretwgqnoyreimfuwylabjtmojwvubt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937633.035836-3957-192710607138931/AnsiballZ_copy.py
Feb 01 09:20:33 np0005604215.localdomain sudo[206901]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:34 np0005604215.localdomain python3.9[206903]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937633.035836-3957-192710607138931/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:34 np0005604215.localdomain sudo[206901]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:34 np0005604215.localdomain sudo[207011]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jtepopofjirmisummowuagupnlgtskpr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937634.3641233-4002-24455528867895/AnsiballZ_stat.py
Feb 01 09:20:34 np0005604215.localdomain sudo[207011]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:34 np0005604215.localdomain python3.9[207013]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:34 np0005604215.localdomain sudo[207011]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:35 np0005604215.localdomain sudo[207099]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vvcmqvopvbicfftbieuyzswskqvjcpqf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937634.3641233-4002-24455528867895/AnsiballZ_copy.py
Feb 01 09:20:35 np0005604215.localdomain sudo[207099]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:35 np0005604215.localdomain python3.9[207101]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937634.3641233-4002-24455528867895/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:35 np0005604215.localdomain sudo[207099]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37854 DF PROTO=TCP SPT=33554 DPT=9102 SEQ=3480415361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A504D0000000001030307) 
Feb 01 09:20:36 np0005604215.localdomain sudo[207209]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dxilcqbpptryboekmhvrkyunyyrbjaew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937636.4695392-4047-268693152017653/AnsiballZ_stat.py
Feb 01 09:20:36 np0005604215.localdomain sudo[207209]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:36 np0005604215.localdomain python3.9[207211]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:20:36 np0005604215.localdomain sudo[207209]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:37 np0005604215.localdomain sudo[207297]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptpjrymbxxadanshcxtepruqgxktwppf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937636.4695392-4047-268693152017653/AnsiballZ_copy.py
Feb 01 09:20:37 np0005604215.localdomain sudo[207297]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:37 np0005604215.localdomain python3.9[207299]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937636.4695392-4047-268693152017653/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:20:37 np0005604215.localdomain sudo[207297]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:38 np0005604215.localdomain sudo[207407]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-squanhqcbptfsrlqldfohmnanrnivscx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937637.8956-4092-156395928601866/AnsiballZ_systemd.py
Feb 01 09:20:38 np0005604215.localdomain sudo[207407]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:38 np0005604215.localdomain python3.9[207409]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:20:38 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:20:38 np0005604215.localdomain systemd-sysv-generator[207436]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:20:38 np0005604215.localdomain systemd-rc-local-generator[207431]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:20:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:20:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:38 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:38 np0005604215.localdomain systemd[1]: Reached target edpm_libvirt.target.
Feb 01 09:20:38 np0005604215.localdomain sudo[207407]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:39 np0005604215.localdomain sudo[207556]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bfczumcpnyapoigenyzerdcrdwbduiyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937639.4884603-4116-280586684866545/AnsiballZ_systemd.py
Feb 01 09:20:39 np0005604215.localdomain sudo[207556]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43146 DF PROTO=TCP SPT=35024 DPT=9100 SEQ=1318914317 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A5CCD0000000001030307) 
Feb 01 09:20:40 np0005604215.localdomain python3.9[207558]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:20:40 np0005604215.localdomain systemd-rc-local-generator[207580]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:20:40 np0005604215.localdomain systemd-sysv-generator[207586]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:20:40 np0005604215.localdomain systemd-rc-local-generator[207623]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:20:40 np0005604215.localdomain systemd-sysv-generator[207627]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:20:40 np0005604215.localdomain sudo[207556]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:41 np0005604215.localdomain sshd[159022]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:20:41 np0005604215.localdomain systemd[1]: session-52.scope: Deactivated successfully.
Feb 01 09:20:41 np0005604215.localdomain systemd[1]: session-52.scope: Consumed 3min 21.787s CPU time.
Feb 01 09:20:41 np0005604215.localdomain systemd-logind[761]: Session 52 logged out. Waiting for processes to exit.
Feb 01 09:20:41 np0005604215.localdomain systemd-logind[761]: Removed session 52.
Feb 01 09:20:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:20:41.736 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:20:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:20:41.738 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:20:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:20:41.738 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:20:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:20:42 np0005604215.localdomain systemd[1]: tmp-crun.RsZaFf.mount: Deactivated successfully.
Feb 01 09:20:42 np0005604215.localdomain podman[207650]: 2026-02-01 09:20:42.859772127 +0000 UTC m=+0.070927811 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 01 09:20:42 np0005604215.localdomain podman[207650]: 2026-02-01 09:20:42.936747658 +0000 UTC m=+0.147903402 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 01 09:20:42 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:20:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46123 DF PROTO=TCP SPT=42572 DPT=9101 SEQ=1222515291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A69CE0000000001030307) 
Feb 01 09:20:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61748 DF PROTO=TCP SPT=37710 DPT=9882 SEQ=1595016040 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A730E0000000001030307) 
Feb 01 09:20:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:20:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 5433 writes, 23K keys, 5433 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5433 writes, 751 syncs, 7.23 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.006       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.4      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1ab610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55aabc1aa2d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 6.5e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 01 09:20:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:20:45 np0005604215.localdomain systemd[1]: tmp-crun.TSmGFU.mount: Deactivated successfully.
Feb 01 09:20:45 np0005604215.localdomain podman[207675]: 2026-02-01 09:20:45.875268476 +0000 UTC m=+0.092722527 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 01 09:20:45 np0005604215.localdomain podman[207675]: 2026-02-01 09:20:45.878771691 +0000 UTC m=+0.096225722 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 01 09:20:45 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:20:46 np0005604215.localdomain sshd[207693]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:20:46 np0005604215.localdomain sshd[207693]: Accepted publickey for zuul from 192.168.122.30 port 44446 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:20:46 np0005604215.localdomain systemd-logind[761]: New session 53 of user zuul.
Feb 01 09:20:46 np0005604215.localdomain systemd[1]: Started Session 53 of User zuul.
Feb 01 09:20:46 np0005604215.localdomain sshd[207693]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:20:47 np0005604215.localdomain python3.9[207804]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:20:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=37856 DF PROTO=TCP SPT=33554 DPT=9102 SEQ=3480415361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A810E0000000001030307) 
Feb 01 09:20:49 np0005604215.localdomain python3.9[207916]: ansible-ansible.builtin.service_facts Invoked
Feb 01 09:20:49 np0005604215.localdomain network[207933]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 09:20:49 np0005604215.localdomain network[207934]: 'network-scripts' will be removed from distribution in near future.
Feb 01 09:20:49 np0005604215.localdomain network[207935]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 09:20:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:20:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6000.1 total, 600.0 interval
                                                          Cumulative writes: 5223 writes, 23K keys, 5223 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5223 writes, 658 syncs, 7.94 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          
                                                          ** Compaction Stats [default] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      2/0    2.61 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                           Sum      2/0    2.61 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [default] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.1      0.01              0.00         1    0.011       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [default] **
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-0] **
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-1] **
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [m-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [m-2] **
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.57 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Sum      1/0    1.57 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.2      0.01              0.00         1    0.007       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-0] **
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-1] **
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [p-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [p-2] **
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-0] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-0] **
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-1] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-1] **
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      1/0    1.26 KB   0.1      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Sum      1/0    1.26 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [O-2] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.3      0.00              0.00         1    0.004       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb9610#2 capacity: 272.00 MB usage: 0.45 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 2 last_secs: 9e-06 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(1,0.20 KB,7.2928e-05%) FilterBlock(1,0.11 KB,3.92689e-05%) IndexBlock(1,0.14 KB,5.04886e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [O-2] **
                                                          
                                                          ** Compaction Stats [L] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [L] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [L] **
                                                          
                                                          ** Compaction Stats [P] **
                                                          Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           Sum      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                           Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.0      0.00              0.00         0    0.000       0      0       0.0       0.0
                                                          
                                                          ** Compaction Stats [P] **
                                                          Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                          ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                          
                                                          Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                          
                                                          Uptime(secs): 6000.1 total, 4800.0 interval
                                                          Flush(GB): cumulative 0.000, interval 0.000
                                                          AddFile(GB): cumulative 0.000, interval 0.000
                                                          AddFile(Total Files): cumulative 0, interval 0
                                                          AddFile(L0 Files): cumulative 0, interval 0
                                                          AddFile(Keys): cumulative 0, interval 0
                                                          Cumulative compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Interval compaction: 0.00 GB write, 0.00 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                          Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                          Block cache BinnedLRUCache@0x55797eeb82d0#2 capacity: 1.62 GB usage: 2.09 KB table_size: 0 occupancy: 18446744073709551615 collections: 11 last_copies: 8 last_secs: 4.1e-05 secs_since: 0
                                                          Block cache entry stats(count,size,portion): DataBlock(3,1.42 KB,8.34465e-05%) FilterBlock(3,0.33 KB,1.92569e-05%) IndexBlock(3,0.34 KB,2.01739e-05%) Misc(1,0.00 KB,0%)
                                                          
                                                          ** File Read Latency Histogram By Level [P] **
Feb 01 09:20:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43148 DF PROTO=TCP SPT=35024 DPT=9100 SEQ=1318914317 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A8D0D0000000001030307) 
Feb 01 09:20:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:20:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=46125 DF PROTO=TCP SPT=42572 DPT=9101 SEQ=1222515291 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65A990D0000000001030307) 
Feb 01 09:20:55 np0005604215.localdomain sudo[208165]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pyqyyhgsbjjpmcmdpnuwxuiuvqerdljp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937655.3219945-98-156642141649902/AnsiballZ_setup.py
Feb 01 09:20:55 np0005604215.localdomain sudo[208165]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:55 np0005604215.localdomain python3.9[208167]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:20:56 np0005604215.localdomain sudo[208165]: pam_unix(sudo:session): session closed for user root
Feb 01 09:20:56 np0005604215.localdomain sudo[208228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gaekezmujytpzmscqalbjuxnkwtysfyd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937655.3219945-98-156642141649902/AnsiballZ_dnf.py
Feb 01 09:20:56 np0005604215.localdomain sudo[208228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:20:56 np0005604215.localdomain python3.9[208230]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:21:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33076 DF PROTO=TCP SPT=58768 DPT=9882 SEQ=1104122674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65AABCD0000000001030307) 
Feb 01 09:21:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33077 DF PROTO=TCP SPT=58768 DPT=9882 SEQ=1104122674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65AAFCD0000000001030307) 
Feb 01 09:21:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33078 DF PROTO=TCP SPT=58768 DPT=9882 SEQ=1104122674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65AB7CE0000000001030307) 
Feb 01 09:21:03 np0005604215.localdomain sudo[208228]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:06 np0005604215.localdomain sudo[208340]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kpqxpdspnctsmgiqeszkwqevcamratov ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937665.007214-134-69690255632107/AnsiballZ_stat.py
Feb 01 09:21:06 np0005604215.localdomain sudo[208340]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:06 np0005604215.localdomain python3.9[208342]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:21:06 np0005604215.localdomain sudo[208340]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31360 DF PROTO=TCP SPT=44308 DPT=9102 SEQ=3300110845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65AC54D0000000001030307) 
Feb 01 09:21:06 np0005604215.localdomain sudo[208452]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cnktsmjwfhxhjlsuzakmfnpxefxnjrof ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937666.636781-159-134087774133961/AnsiballZ_copy.py
Feb 01 09:21:06 np0005604215.localdomain sudo[208452]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:07 np0005604215.localdomain python3.9[208454]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi mode=preserve remote_src=True src=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi/ backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:07 np0005604215.localdomain sudo[208452]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:07 np0005604215.localdomain sudo[208562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bonekxishasgikgznwqqxvrkrnjwnkpz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937667.45734-183-143399084951999/AnsiballZ_command.py
Feb 01 09:21:07 np0005604215.localdomain sudo[208562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:08 np0005604215.localdomain python3.9[208564]: ansible-ansible.legacy.command Invoked with _raw_params=mv "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi" "/var/lib/config-data/puppet-generated/iscsid/etc/iscsi.adopted" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:21:08 np0005604215.localdomain sudo[208562]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:08 np0005604215.localdomain sudo[208673]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fiaftunhuhjhejkrkgjsoaqrxjipnfyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937668.2870045-207-78157631642413/AnsiballZ_command.py
Feb 01 09:21:08 np0005604215.localdomain sudo[208673]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:08 np0005604215.localdomain python3.9[208675]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:21:08 np0005604215.localdomain sudo[208673]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:09 np0005604215.localdomain sudo[208784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjigwmktzwdabqtmrlgqkmtdgjxaltnf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937668.9941359-231-100019918441121/AnsiballZ_command.py
Feb 01 09:21:09 np0005604215.localdomain sudo[208784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:09 np0005604215.localdomain python3.9[208786]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:21:09 np0005604215.localdomain sudo[208784]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63306 DF PROTO=TCP SPT=56072 DPT=9100 SEQ=3317580477 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65AD1CD0000000001030307) 
Feb 01 09:21:10 np0005604215.localdomain sudo[208895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ptiaakjqpqfqoifippcgpxwolwmbweri ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937669.8151655-258-216262066706828/AnsiballZ_stat.py
Feb 01 09:21:10 np0005604215.localdomain sudo[208895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:10 np0005604215.localdomain python3.9[208897]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:21:10 np0005604215.localdomain sudo[208895]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:11 np0005604215.localdomain sudo[208955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:21:11 np0005604215.localdomain sudo[208955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:21:11 np0005604215.localdomain sudo[208955]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:11 np0005604215.localdomain sudo[208989]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:21:11 np0005604215.localdomain sudo[208989]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:21:11 np0005604215.localdomain sudo[209043]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hfbybdjadgqolzwzhlrtmovvpkmifffh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937670.8085089-290-56144132324933/AnsiballZ_lineinfile.py
Feb 01 09:21:11 np0005604215.localdomain sudo[209043]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:11 np0005604215.localdomain python3.9[209045]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:11 np0005604215.localdomain sudo[209043]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:11 np0005604215.localdomain sudo[208989]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:12 np0005604215.localdomain sudo[209184]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hzklhquymyztwhusycaeahoewowfmnet ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937671.809398-317-145418840215551/AnsiballZ_systemd_service.py
Feb 01 09:21:12 np0005604215.localdomain sudo[209184]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:12 np0005604215.localdomain sudo[209187]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:21:12 np0005604215.localdomain sudo[209187]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:21:12 np0005604215.localdomain sudo[209187]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:12 np0005604215.localdomain python3.9[209186]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:21:12 np0005604215.localdomain systemd[1]: Listening on Open-iSCSI iscsid Socket.
Feb 01 09:21:12 np0005604215.localdomain sudo[209184]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47621 DF PROTO=TCP SPT=48686 DPT=9101 SEQ=2534758677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65ADF0D0000000001030307) 
Feb 01 09:21:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:21:13 np0005604215.localdomain podman[209226]: 2026-02-01 09:21:13.876695393 +0000 UTC m=+0.087761738 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 01 09:21:13 np0005604215.localdomain podman[209226]: 2026-02-01 09:21:13.920858106 +0000 UTC m=+0.131924471 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:21:13 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:21:14 np0005604215.localdomain sudo[209341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epeogupfxkqnqslkncvezlmloqjwdgdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937674.1032162-342-30303358001981/AnsiballZ_systemd_service.py
Feb 01 09:21:14 np0005604215.localdomain sudo[209341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:14 np0005604215.localdomain python3.9[209343]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:21:14 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:21:14 np0005604215.localdomain systemd-rc-local-generator[209368]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:21:14 np0005604215.localdomain systemd-sysv-generator[209375]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:21:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:21:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:15 np0005604215.localdomain systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Feb 01 09:21:15 np0005604215.localdomain systemd[1]: Starting Open-iSCSI...
Feb 01 09:21:15 np0005604215.localdomain iscsid[209384]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi
Feb 01 09:21:15 np0005604215.localdomain iscsid[209384]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a string with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier].
Feb 01 09:21:15 np0005604215.localdomain iscsid[209384]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6.
Feb 01 09:21:15 np0005604215.localdomain iscsid[209384]: If using hardware iscsi like qla4xxx this message can be ignored.
Feb 01 09:21:15 np0005604215.localdomain iscsid[209384]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi
Feb 01 09:21:15 np0005604215.localdomain iscsid[209384]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf
Feb 01 09:21:15 np0005604215.localdomain iscsid[209384]: iscsid: can't open iscsid.ipc_auth_uid configuration file /etc/iscsi/iscsid.conf
Feb 01 09:21:15 np0005604215.localdomain systemd[1]: Started Open-iSCSI.
Feb 01 09:21:15 np0005604215.localdomain systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Feb 01 09:21:15 np0005604215.localdomain systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Feb 01 09:21:15 np0005604215.localdomain sudo[209341]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=33080 DF PROTO=TCP SPT=58768 DPT=9882 SEQ=1104122674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65AE70D0000000001030307) 
Feb 01 09:21:16 np0005604215.localdomain python3.9[209493]: ansible-ansible.builtin.service_facts Invoked
Feb 01 09:21:16 np0005604215.localdomain network[209510]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 09:21:16 np0005604215.localdomain network[209511]: 'network-scripts' will be removed from distribution in near future.
Feb 01 09:21:16 np0005604215.localdomain network[209512]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 09:21:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:21:16 np0005604215.localdomain podman[209518]: 2026-02-01 09:21:16.344432016 +0000 UTC m=+0.069963775 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 01 09:21:16 np0005604215.localdomain podman[209518]: 2026-02-01 09:21:16.37860605 +0000 UTC m=+0.104137869 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Feb 01 09:21:16 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:21:17 np0005604215.localdomain systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Feb 01 09:21:18 np0005604215.localdomain systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Feb 01 09:21:18 np0005604215.localdomain systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service.
Feb 01 09:21:18 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:21:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31362 DF PROTO=TCP SPT=44308 DPT=9102 SEQ=3300110845 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65AF50D0000000001030307) 
Feb 01 09:21:19 np0005604215.localdomain setroubleshoot[209544]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi. For complete SELinux messages run: sealert -l 5698bc1e-e2fe-4086-9a1f-bb7435d28bc8
Feb 01 09:21:19 np0005604215.localdomain setroubleshoot[209544]: SELinux is preventing /usr/sbin/iscsid from search access on the directory iscsi.
                                                                 
                                                                 *****  Plugin catchall (100. confidence) suggests   **************************
                                                                 
                                                                 If you believe that iscsid should be allowed search access on the iscsi directory by default.
                                                                 Then you should report this as a bug.
                                                                 You can generate a local policy module to allow this access.
                                                                 Do
                                                                 allow this access for now by executing:
                                                                 # ausearch -c 'iscsid' --raw | audit2allow -M my-iscsid
                                                                 # semodule -X 300 -i my-iscsid.pp
                                                                 
Feb 01 09:21:21 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63308 DF PROTO=TCP SPT=56072 DPT=9100 SEQ=3317580477 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B010D0000000001030307) 
Feb 01 09:21:21 np0005604215.localdomain sudo[209778]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxzrzbtunupmxbejclmaytrbbdzimfoa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937681.57883-410-111414127599870/AnsiballZ_dnf.py
Feb 01 09:21:21 np0005604215.localdomain sudo[209778]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:22 np0005604215.localdomain python3.9[209780]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:21:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47623 DF PROTO=TCP SPT=48686 DPT=9101 SEQ=2534758677 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B0F0D0000000001030307) 
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:21:26 np0005604215.localdomain systemd-rc-local-generator[209826]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:21:26 np0005604215.localdomain systemd-sysv-generator[209830]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: run-r62452d3adea44937a329029c597e48b9.service: Deactivated successfully.
Feb 01 09:21:26 np0005604215.localdomain systemd[1]: run-reacb2bd551814c8baef46cd863ab2523.service: Deactivated successfully.
Feb 01 09:21:27 np0005604215.localdomain sudo[209778]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:28 np0005604215.localdomain sudo[210071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-daqpkvrkkikwrkzydzmzmvrqtksqgvee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937687.8918917-438-61213306484111/AnsiballZ_file.py
Feb 01 09:21:28 np0005604215.localdomain sudo[210071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:28 np0005604215.localdomain python3.9[210073]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 09:21:28 np0005604215.localdomain sudo[210071]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:29 np0005604215.localdomain sudo[210181]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bggbehrfiwxtzwndwmsbnlcbgqkssknn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937688.7010536-461-148827543211733/AnsiballZ_modprobe.py
Feb 01 09:21:29 np0005604215.localdomain sudo[210181]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:29 np0005604215.localdomain python3.9[210183]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 01 09:21:29 np0005604215.localdomain sudo[210181]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:29 np0005604215.localdomain systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@1.service: Deactivated successfully.
Feb 01 09:21:29 np0005604215.localdomain systemd[1]: setroubleshootd.service: Deactivated successfully.
Feb 01 09:21:29 np0005604215.localdomain sudo[210296]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlumexmnwtjbjcvxhbcfhsqxpapqsabu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937689.6352932-486-153305824394531/AnsiballZ_stat.py
Feb 01 09:21:29 np0005604215.localdomain sudo[210296]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61713 DF PROTO=TCP SPT=33556 DPT=9882 SEQ=3908121571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B20FD0000000001030307) 
Feb 01 09:21:30 np0005604215.localdomain python3.9[210298]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:21:30 np0005604215.localdomain sudo[210296]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:30 np0005604215.localdomain sudo[210384]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzjhgoqeqkxkueybvsifpztreqviprtq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937689.6352932-486-153305824394531/AnsiballZ_copy.py
Feb 01 09:21:30 np0005604215.localdomain sudo[210384]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:30 np0005604215.localdomain python3.9[210386]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937689.6352932-486-153305824394531/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:30 np0005604215.localdomain sudo[210384]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61714 DF PROTO=TCP SPT=33556 DPT=9882 SEQ=3908121571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B250D0000000001030307) 
Feb 01 09:21:31 np0005604215.localdomain sudo[210494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vxykssqhsrngjvbsohykwuadoxizyjvi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937691.0253425-533-58125014878805/AnsiballZ_lineinfile.py
Feb 01 09:21:31 np0005604215.localdomain sudo[210494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:31 np0005604215.localdomain python3.9[210496]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:31 np0005604215.localdomain sudo[210494]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:32 np0005604215.localdomain sudo[210604]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aektbkflessoairylndgtmbrwieashzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937691.7289376-557-173750233978735/AnsiballZ_systemd.py
Feb 01 09:21:32 np0005604215.localdomain sudo[210604]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:32 np0005604215.localdomain python3.9[210606]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:21:32 np0005604215.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 01 09:21:32 np0005604215.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 01 09:21:32 np0005604215.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 01 09:21:32 np0005604215.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 01 09:21:32 np0005604215.localdomain systemd-modules-load[210610]: Module 'msr' is built in
Feb 01 09:21:32 np0005604215.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 01 09:21:32 np0005604215.localdomain sudo[210604]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61715 DF PROTO=TCP SPT=33556 DPT=9882 SEQ=3908121571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B2D0D0000000001030307) 
Feb 01 09:21:34 np0005604215.localdomain sudo[210718]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwptpcaropxrinvlasoddktmvgugjihs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937692.9712462-581-121580693564050/AnsiballZ_command.py
Feb 01 09:21:34 np0005604215.localdomain sudo[210718]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:34 np0005604215.localdomain python3.9[210720]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:21:34 np0005604215.localdomain sudo[210718]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:35 np0005604215.localdomain sshd[210777]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:21:35 np0005604215.localdomain sudo[210831]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejrkbmymwvdycjxixqkuzhhswvwdebus ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937695.0113394-611-266055216437690/AnsiballZ_stat.py
Feb 01 09:21:35 np0005604215.localdomain sudo[210831]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:35 np0005604215.localdomain python3.9[210833]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:21:35 np0005604215.localdomain sudo[210831]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:35 np0005604215.localdomain sshd[210777]: Invalid user teste from 85.206.171.113 port 57544
Feb 01 09:21:35 np0005604215.localdomain sshd[210777]: Received disconnect from 85.206.171.113 port 57544:11: Bye Bye [preauth]
Feb 01 09:21:35 np0005604215.localdomain sshd[210777]: Disconnected from invalid user teste 85.206.171.113 port 57544 [preauth]
Feb 01 09:21:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6988 DF PROTO=TCP SPT=45186 DPT=9102 SEQ=882411208 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B3A8D0000000001030307) 
Feb 01 09:21:36 np0005604215.localdomain sudo[210941]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fxsrbmrunzigknklqgwpfqaauaqyzsai ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937696.3369985-638-50303793189658/AnsiballZ_stat.py
Feb 01 09:21:36 np0005604215.localdomain sudo[210941]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:36 np0005604215.localdomain python3.9[210943]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:21:36 np0005604215.localdomain sudo[210941]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:37 np0005604215.localdomain sudo[211029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecmvpnqmdvmsdrqqyomgpjclxeehxglc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937696.3369985-638-50303793189658/AnsiballZ_copy.py
Feb 01 09:21:37 np0005604215.localdomain sudo[211029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:37 np0005604215.localdomain python3.9[211031]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937696.3369985-638-50303793189658/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:37 np0005604215.localdomain sudo[211029]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:37 np0005604215.localdomain sudo[211139]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hixevysakibylshvbexhrbzlvpptteme ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937697.589035-684-96581282162400/AnsiballZ_command.py
Feb 01 09:21:37 np0005604215.localdomain sudo[211139]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:38 np0005604215.localdomain python3.9[211141]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:21:38 np0005604215.localdomain sudo[211139]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:38 np0005604215.localdomain sudo[211250]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-chddvpsykwibgvykvbddtaykklqfvcam ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937698.3136551-707-181235251621284/AnsiballZ_lineinfile.py
Feb 01 09:21:38 np0005604215.localdomain sudo[211250]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:38 np0005604215.localdomain python3.9[211252]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:38 np0005604215.localdomain sudo[211250]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:39 np0005604215.localdomain sudo[211360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xluanctkfjrqvjahcvhjxcykckhokuda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937699.0827656-731-155838789835165/AnsiballZ_replace.py
Feb 01 09:21:39 np0005604215.localdomain sudo[211360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:39 np0005604215.localdomain python3.9[211362]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:39 np0005604215.localdomain sudo[211360]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19980 DF PROTO=TCP SPT=59126 DPT=9100 SEQ=3658720964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B470D0000000001030307) 
Feb 01 09:21:40 np0005604215.localdomain sudo[211470]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhiniyxseoojqsacrtaptlaqtcsrjszo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937699.9473748-757-214665195450206/AnsiballZ_replace.py
Feb 01 09:21:40 np0005604215.localdomain sudo[211470]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:40 np0005604215.localdomain python3.9[211472]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:40 np0005604215.localdomain sudo[211470]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:41 np0005604215.localdomain sudo[211580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uusskvogbsaaoiqhgssjbzgntlwtywca ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937700.7434928-782-37286338368643/AnsiballZ_lineinfile.py
Feb 01 09:21:41 np0005604215.localdomain sudo[211580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:41 np0005604215.localdomain python3.9[211582]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:41 np0005604215.localdomain sudo[211580]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:41 np0005604215.localdomain sudo[211690]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ulkrbsfkdnyxgdgmobwelknbzcuynpkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937701.3853135-782-181088456454340/AnsiballZ_lineinfile.py
Feb 01 09:21:41 np0005604215.localdomain sudo[211690]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:21:41.737 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:21:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:21:41.738 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:21:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:21:41.738 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:21:41 np0005604215.localdomain python3.9[211692]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:41 np0005604215.localdomain sudo[211690]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:42 np0005604215.localdomain sudo[211800]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ciyamqnqqgjtrcfnnqsqxzwdrkevtqxl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937701.9945338-782-280681306740981/AnsiballZ_lineinfile.py
Feb 01 09:21:42 np0005604215.localdomain sudo[211800]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:42 np0005604215.localdomain python3.9[211802]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:42 np0005604215.localdomain sudo[211800]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2691 DF PROTO=TCP SPT=53186 DPT=9101 SEQ=945611142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B544D0000000001030307) 
Feb 01 09:21:43 np0005604215.localdomain sudo[211910]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmxdrhobkseywutrfpwpfmlcdgusdggq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937702.6654973-782-47045513830483/AnsiballZ_lineinfile.py
Feb 01 09:21:43 np0005604215.localdomain sudo[211910]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:43 np0005604215.localdomain python3.9[211912]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:43 np0005604215.localdomain sudo[211910]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:44 np0005604215.localdomain sudo[212020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swonsfwdaqbztafgzlidtzvdsezfmdsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937703.7439747-870-308316977139/AnsiballZ_stat.py
Feb 01 09:21:44 np0005604215.localdomain sudo[212020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:21:44 np0005604215.localdomain systemd[1]: tmp-crun.pMXc6u.mount: Deactivated successfully.
Feb 01 09:21:44 np0005604215.localdomain podman[212023]: 2026-02-01 09:21:44.183236504 +0000 UTC m=+0.121415897 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 01 09:21:44 np0005604215.localdomain python3.9[212022]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:21:44 np0005604215.localdomain podman[212023]: 2026-02-01 09:21:44.252752176 +0000 UTC m=+0.190931569 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 01 09:21:44 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:21:44 np0005604215.localdomain sudo[212020]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:44 np0005604215.localdomain sudo[212158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwlvsluzrnadxwblyiiizoyhnrdyzvcf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937704.5722885-894-117018860604667/AnsiballZ_command.py
Feb 01 09:21:44 np0005604215.localdomain sudo[212158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:45 np0005604215.localdomain python3.9[212160]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/true _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:21:45 np0005604215.localdomain sudo[212158]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61717 DF PROTO=TCP SPT=33556 DPT=9882 SEQ=3908121571 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B5D0D0000000001030307) 
Feb 01 09:21:45 np0005604215.localdomain sudo[212269]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdgjjwyvkhgsnhrzmakgetzrpmmmpxid ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937705.3913462-921-52535613925781/AnsiballZ_systemd_service.py
Feb 01 09:21:45 np0005604215.localdomain sudo[212269]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:45 np0005604215.localdomain python3.9[212271]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:21:45 np0005604215.localdomain systemd[1]: Listening on multipathd control socket.
Feb 01 09:21:46 np0005604215.localdomain sudo[212269]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:46 np0005604215.localdomain sudo[212383]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jvfdydzwldrwttrpczvmcgzdcfessvuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937706.3353164-945-151595224700373/AnsiballZ_systemd_service.py
Feb 01 09:21:46 np0005604215.localdomain sudo[212383]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:46 np0005604215.localdomain python3.9[212385]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:21:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:21:47 np0005604215.localdomain systemd[1]: Starting Wait for udev To Complete Device Initialization...
Feb 01 09:21:47 np0005604215.localdomain podman[212387]: 2026-02-01 09:21:47.040421068 +0000 UTC m=+0.063429630 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:21:47 np0005604215.localdomain udevadm[212402]: systemd-udev-settle.service is deprecated. Please fix multipathd.service not to pull it in.
Feb 01 09:21:47 np0005604215.localdomain systemd[1]: Finished Wait for udev To Complete Device Initialization.
Feb 01 09:21:47 np0005604215.localdomain systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 01 09:21:47 np0005604215.localdomain podman[212387]: 2026-02-01 09:21:47.074610252 +0000 UTC m=+0.097618784 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 01 09:21:47 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:21:47 np0005604215.localdomain multipathd[212410]: --------start up--------
Feb 01 09:21:47 np0005604215.localdomain multipathd[212410]: read /etc/multipath.conf
Feb 01 09:21:47 np0005604215.localdomain multipathd[212410]: path checkers start up
Feb 01 09:21:47 np0005604215.localdomain systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 01 09:21:47 np0005604215.localdomain sudo[212383]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:48 np0005604215.localdomain sudo[212525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwfoukxwtwyyqjjlrauualzemnhlpfpe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937708.5682793-981-272727554241603/AnsiballZ_file.py
Feb 01 09:21:48 np0005604215.localdomain sudo[212525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6990 DF PROTO=TCP SPT=45186 DPT=9102 SEQ=882411208 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B6B0D0000000001030307) 
Feb 01 09:21:49 np0005604215.localdomain python3.9[212527]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 09:21:49 np0005604215.localdomain sudo[212525]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:49 np0005604215.localdomain sudo[212635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyaorpayltrthupcezavuspsaqqnmjfl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937709.2818763-1004-5398284321161/AnsiballZ_modprobe.py
Feb 01 09:21:49 np0005604215.localdomain sudo[212635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:49 np0005604215.localdomain python3.9[212637]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 01 09:21:49 np0005604215.localdomain sudo[212635]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:50 np0005604215.localdomain sudo[212754]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmuojlohmxyzjnyraxouifaqrhbgigwm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937710.390496-1028-60450042753356/AnsiballZ_stat.py
Feb 01 09:21:50 np0005604215.localdomain sudo[212754]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:50 np0005604215.localdomain python3.9[212756]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:21:50 np0005604215.localdomain sudo[212754]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:51 np0005604215.localdomain sudo[212842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tunyavpsbuquttjuokbhcitikhoozqwx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937710.390496-1028-60450042753356/AnsiballZ_copy.py
Feb 01 09:21:51 np0005604215.localdomain sudo[212842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:51 np0005604215.localdomain python3.9[212844]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937710.390496-1028-60450042753356/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:51 np0005604215.localdomain sudo[212842]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19982 DF PROTO=TCP SPT=59126 DPT=9100 SEQ=3658720964 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B770D0000000001030307) 
Feb 01 09:21:52 np0005604215.localdomain sudo[212952]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kwytjnymhkiymvixrqcskrzzjyrfewnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937711.8121152-1076-158218447244078/AnsiballZ_lineinfile.py
Feb 01 09:21:52 np0005604215.localdomain sudo[212952]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:52 np0005604215.localdomain python3.9[212954]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:21:52 np0005604215.localdomain sudo[212952]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:52 np0005604215.localdomain sudo[213062]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brwzyixlfgictvngrykwlrpnzqnjmxtx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937712.5353405-1101-187759468229968/AnsiballZ_systemd.py
Feb 01 09:21:52 np0005604215.localdomain sudo[213062]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:53 np0005604215.localdomain python3.9[213064]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:21:53 np0005604215.localdomain systemd[1]: systemd-modules-load.service: Deactivated successfully.
Feb 01 09:21:53 np0005604215.localdomain systemd[1]: Stopped Load Kernel Modules.
Feb 01 09:21:53 np0005604215.localdomain systemd[1]: Stopping Load Kernel Modules...
Feb 01 09:21:53 np0005604215.localdomain systemd[1]: Starting Load Kernel Modules...
Feb 01 09:21:53 np0005604215.localdomain systemd-modules-load[213068]: Module 'msr' is built in
Feb 01 09:21:53 np0005604215.localdomain systemd[1]: Finished Load Kernel Modules.
Feb 01 09:21:53 np0005604215.localdomain sudo[213062]: pam_unix(sudo:session): session closed for user root
Feb 01 09:21:54 np0005604215.localdomain sudo[213176]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fhwpuxictssqtehyagljvhuizoozsksy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937713.7396939-1124-79230955947079/AnsiballZ_dnf.py
Feb 01 09:21:54 np0005604215.localdomain sudo[213176]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:21:54 np0005604215.localdomain python3.9[213178]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:21:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2693 DF PROTO=TCP SPT=53186 DPT=9101 SEQ=945611142 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B850D0000000001030307) 
Feb 01 09:21:57 np0005604215.localdomain systemd[1]: virtnodedevd.service: Deactivated successfully.
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:21:58 np0005604215.localdomain systemd-rc-local-generator[213215]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:21:58 np0005604215.localdomain systemd-sysv-generator[213218]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:21:58 np0005604215.localdomain systemd-sysv-generator[213253]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:21:58 np0005604215.localdomain systemd-rc-local-generator[213248]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:58 np0005604215.localdomain systemd[1]: virtproxyd.service: Deactivated successfully.
Feb 01 09:21:58 np0005604215.localdomain systemd-logind[761]: Watching system buttons on /dev/input/event0 (Power Button)
Feb 01 09:21:58 np0005604215.localdomain systemd-logind[761]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Feb 01 09:21:58 np0005604215.localdomain lvm[213304]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 01 09:21:58 np0005604215.localdomain lvm[213304]: VG ceph_vg1 finished
Feb 01 09:21:58 np0005604215.localdomain lvm[213303]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 01 09:21:58 np0005604215.localdomain lvm[213303]: VG ceph_vg0 finished
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: Starting man-db-cache-update.service...
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:21:59 np0005604215.localdomain systemd-rc-local-generator[213349]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:21:59 np0005604215.localdomain systemd-sysv-generator[213354]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:21:59 np0005604215.localdomain systemd[1]: Queuing reload/restart jobs for marked units…
Feb 01 09:22:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55400 DF PROTO=TCP SPT=54230 DPT=9882 SEQ=465484690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B962E0000000001030307) 
Feb 01 09:22:00 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Deactivated successfully.
Feb 01 09:22:00 np0005604215.localdomain systemd[1]: Finished man-db-cache-update.service.
Feb 01 09:22:00 np0005604215.localdomain systemd[1]: man-db-cache-update.service: Consumed 1.263s CPU time.
Feb 01 09:22:00 np0005604215.localdomain systemd[1]: run-ra1a3255d90394862aee02f1c58a61ad4.service: Deactivated successfully.
Feb 01 09:22:00 np0005604215.localdomain sudo[213176]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:00 np0005604215.localdomain sudo[214608]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jpbvklhrhzzunlochojuecctgvznvksi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937720.6021605-1148-50485760930286/AnsiballZ_systemd_service.py
Feb 01 09:22:00 np0005604215.localdomain sudo[214608]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55401 DF PROTO=TCP SPT=54230 DPT=9882 SEQ=465484690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65B9A4E0000000001030307) 
Feb 01 09:22:01 np0005604215.localdomain python3.9[214610]: ansible-ansible.builtin.systemd_service Invoked with name=multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:22:01 np0005604215.localdomain multipathd[212410]: exit (signal)
Feb 01 09:22:01 np0005604215.localdomain multipathd[212410]: --------shut down-------
Feb 01 09:22:01 np0005604215.localdomain systemd[1]: Stopping Device-Mapper Multipath Device Controller...
Feb 01 09:22:01 np0005604215.localdomain systemd[1]: multipathd.service: Deactivated successfully.
Feb 01 09:22:01 np0005604215.localdomain systemd[1]: Stopped Device-Mapper Multipath Device Controller.
Feb 01 09:22:01 np0005604215.localdomain systemd[1]: Starting Device-Mapper Multipath Device Controller...
Feb 01 09:22:01 np0005604215.localdomain multipathd[214616]: --------start up--------
Feb 01 09:22:01 np0005604215.localdomain multipathd[214616]: read /etc/multipath.conf
Feb 01 09:22:01 np0005604215.localdomain multipathd[214616]: path checkers start up
Feb 01 09:22:01 np0005604215.localdomain systemd[1]: Started Device-Mapper Multipath Device Controller.
Feb 01 09:22:01 np0005604215.localdomain sudo[214608]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:02 np0005604215.localdomain python3.9[214731]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:22:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55402 DF PROTO=TCP SPT=54230 DPT=9882 SEQ=465484690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65BA24E0000000001030307) 
Feb 01 09:22:03 np0005604215.localdomain sudo[214843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aikzrfkxxkpajgnlwhiaajfjmnmyoxda ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937722.896465-1201-32255702814417/AnsiballZ_file.py
Feb 01 09:22:03 np0005604215.localdomain sudo[214843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:03 np0005604215.localdomain python3.9[214845]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:03 np0005604215.localdomain sudo[214843]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:04 np0005604215.localdomain sudo[214953]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcmmgkcammkizvdhsdctowzrgpohunnw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937723.908488-1233-66952139938003/AnsiballZ_systemd_service.py
Feb 01 09:22:04 np0005604215.localdomain sudo[214953]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:04 np0005604215.localdomain python3.9[214955]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:22:04 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:22:04 np0005604215.localdomain systemd-rc-local-generator[214980]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:22:04 np0005604215.localdomain systemd-sysv-generator[214987]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:22:04 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:04 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:04 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:04 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:04 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:22:04 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:04 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:04 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:04 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:04 np0005604215.localdomain sudo[214953]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:05 np0005604215.localdomain python3.9[215100]: ansible-ansible.builtin.service_facts Invoked
Feb 01 09:22:05 np0005604215.localdomain network[215117]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 09:22:05 np0005604215.localdomain network[215118]: 'network-scripts' will be removed from distribution in near future.
Feb 01 09:22:05 np0005604215.localdomain network[215119]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 09:22:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57262 DF PROTO=TCP SPT=57006 DPT=9102 SEQ=3281454821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65BAFCD0000000001030307) 
Feb 01 09:22:06 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:22:09 np0005604215.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 01 09:22:09 np0005604215.localdomain systemd[1]: virtqemud.service: Deactivated successfully.
Feb 01 09:22:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36907 DF PROTO=TCP SPT=36858 DPT=9100 SEQ=1360949448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65BBC4D0000000001030307) 
Feb 01 09:22:10 np0005604215.localdomain sudo[215352]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pakexlqrlxtvxlrrkodwgpumqaybypsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937729.9285412-1290-131733109713109/AnsiballZ_systemd_service.py
Feb 01 09:22:10 np0005604215.localdomain sudo[215352]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:10 np0005604215.localdomain python3.9[215354]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:22:11 np0005604215.localdomain sudo[215352]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:12 np0005604215.localdomain sudo[215463]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enejsltiyghuhrqtubtcdpzxehsszmod ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937731.704657-1290-146754149014875/AnsiballZ_systemd_service.py
Feb 01 09:22:12 np0005604215.localdomain sudo[215463]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:12 np0005604215.localdomain python3.9[215465]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:22:12 np0005604215.localdomain sudo[215463]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:12 np0005604215.localdomain sudo[215538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:22:12 np0005604215.localdomain sudo[215538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:22:12 np0005604215.localdomain sudo[215538]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:12 np0005604215.localdomain sudo[215573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:22:12 np0005604215.localdomain sudo[215573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:22:12 np0005604215.localdomain sudo[215609]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bzeollkcocirpybguiqniatgqkzdpmdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937732.5002244-1290-193750616622389/AnsiballZ_systemd_service.py
Feb 01 09:22:12 np0005604215.localdomain sudo[215609]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31662 DF PROTO=TCP SPT=33722 DPT=9101 SEQ=2442379317 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65BC94D0000000001030307) 
Feb 01 09:22:13 np0005604215.localdomain python3.9[215612]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:22:13 np0005604215.localdomain sudo[215609]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:13 np0005604215.localdomain sudo[215573]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:13 np0005604215.localdomain sudo[215752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exhlxlwdsyzrewgxdkuzsihizwmnbmop ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937733.2612302-1290-66927039709750/AnsiballZ_systemd_service.py
Feb 01 09:22:13 np0005604215.localdomain sudo[215752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:13 np0005604215.localdomain python3.9[215754]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:22:13 np0005604215.localdomain sudo[215752]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:22:14 np0005604215.localdomain podman[215784]: 2026-02-01 09:22:14.872535404 +0000 UTC m=+0.084207229 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:22:14 np0005604215.localdomain podman[215784]: 2026-02-01 09:22:14.938686946 +0000 UTC m=+0.150358751 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_controller)
Feb 01 09:22:14 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:22:15 np0005604215.localdomain sudo[215887]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ozsylxsjkvaaoznxcezwxydtmrvyvgra ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937734.7940288-1290-212032498562815/AnsiballZ_systemd_service.py
Feb 01 09:22:15 np0005604215.localdomain sudo[215887]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:15 np0005604215.localdomain python3.9[215889]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:22:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55404 DF PROTO=TCP SPT=54230 DPT=9882 SEQ=465484690 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65BD30D0000000001030307) 
Feb 01 09:22:15 np0005604215.localdomain sudo[215891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:22:15 np0005604215.localdomain sudo[215891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:22:15 np0005604215.localdomain sudo[215891]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:16 np0005604215.localdomain sudo[215887]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:22:17 np0005604215.localdomain sudo[216016]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cwnazbrnohhipsgcfcyljixlhwucchgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937737.5114834-1290-193289532678473/AnsiballZ_systemd_service.py
Feb 01 09:22:17 np0005604215.localdomain sudo[216016]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:17 np0005604215.localdomain podman[216017]: 2026-02-01 09:22:17.870151808 +0000 UTC m=+0.082438826 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 01 09:22:17 np0005604215.localdomain podman[216017]: 2026-02-01 09:22:17.879645575 +0000 UTC m=+0.091932593 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 01 09:22:17 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:22:18 np0005604215.localdomain python3.9[216028]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:22:18 np0005604215.localdomain sudo[216016]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:18 np0005604215.localdomain sudo[216146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wpqinnkxgsrixaoqzhduuwljcobowtqr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937738.2850049-1290-246324086318716/AnsiballZ_systemd_service.py
Feb 01 09:22:18 np0005604215.localdomain sudo[216146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57264 DF PROTO=TCP SPT=57006 DPT=9102 SEQ=3281454821 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65BDF0D0000000001030307) 
Feb 01 09:22:18 np0005604215.localdomain python3.9[216148]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:22:18 np0005604215.localdomain sudo[216146]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:19 np0005604215.localdomain sudo[216257]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gkqexxtwbmsggjdoyvjixohdowlqyaib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937739.0275593-1290-167355303123773/AnsiballZ_systemd_service.py
Feb 01 09:22:19 np0005604215.localdomain sudo[216257]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:19 np0005604215.localdomain python3.9[216259]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:22:19 np0005604215.localdomain sudo[216257]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:20 np0005604215.localdomain sudo[216368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ixyllqahkbitioaijlkokrymbggiiemf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937740.1116676-1468-29795013936700/AnsiballZ_file.py
Feb 01 09:22:20 np0005604215.localdomain sudo[216368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:20 np0005604215.localdomain python3.9[216370]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:20 np0005604215.localdomain sudo[216368]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:21 np0005604215.localdomain sudo[216478]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ynizvdzrhupnctygkfwcpbrszokbpsnq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937740.7020555-1468-17823114153628/AnsiballZ_file.py
Feb 01 09:22:21 np0005604215.localdomain sudo[216478]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:21 np0005604215.localdomain python3.9[216480]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:21 np0005604215.localdomain sudo[216478]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:21 np0005604215.localdomain sudo[216588]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lpanutbckaxwbpdjsyxfdxgtnqerlllp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937741.3537774-1468-102261008521553/AnsiballZ_file.py
Feb 01 09:22:21 np0005604215.localdomain sudo[216588]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:21 np0005604215.localdomain python3.9[216590]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:21 np0005604215.localdomain sudo[216588]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:22 np0005604215.localdomain sudo[216698]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tosiiqghiahzaqvcsjecennieajrrodo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937741.9854407-1468-27683414531194/AnsiballZ_file.py
Feb 01 09:22:22 np0005604215.localdomain sudo[216698]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36909 DF PROTO=TCP SPT=36858 DPT=9100 SEQ=1360949448 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65BED0D0000000001030307) 
Feb 01 09:22:22 np0005604215.localdomain python3.9[216700]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:22 np0005604215.localdomain sudo[216698]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:22 np0005604215.localdomain sudo[216808]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blsfszgkecwwxwwhwqkirhamkwdoegtf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937742.5916545-1468-203451359710951/AnsiballZ_file.py
Feb 01 09:22:22 np0005604215.localdomain sudo[216808]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:23 np0005604215.localdomain python3.9[216810]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:23 np0005604215.localdomain sudo[216808]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:23 np0005604215.localdomain sudo[216918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hhtagdpghdnlkkavmyxhhcuxjxykpuvz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937743.2393253-1468-50878829870504/AnsiballZ_file.py
Feb 01 09:22:23 np0005604215.localdomain sudo[216918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:23 np0005604215.localdomain python3.9[216920]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:23 np0005604215.localdomain sudo[216918]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:24 np0005604215.localdomain sudo[217028]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nygyecnrvoqilfkuvkzwtrjanbwxkjsw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937743.8562627-1468-61290872950151/AnsiballZ_file.py
Feb 01 09:22:24 np0005604215.localdomain sudo[217028]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:24 np0005604215.localdomain python3.9[217030]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:24 np0005604215.localdomain sudo[217028]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:24 np0005604215.localdomain sudo[217138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ujiuwpkwnziiyosvjaqypwqkdwieegve ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937744.5443974-1468-256151831366531/AnsiballZ_file.py
Feb 01 09:22:24 np0005604215.localdomain sudo[217138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:25 np0005604215.localdomain python3.9[217140]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:25 np0005604215.localdomain sudo[217138]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31664 DF PROTO=TCP SPT=33722 DPT=9101 SEQ=2442379317 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65BF90D0000000001030307) 
Feb 01 09:22:25 np0005604215.localdomain sudo[217248]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elybeaqfobjwyizmagdyiiyyozzbdmsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937745.2888978-1639-224943908790683/AnsiballZ_file.py
Feb 01 09:22:25 np0005604215.localdomain sudo[217248]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:25 np0005604215.localdomain python3.9[217250]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:25 np0005604215.localdomain sudo[217248]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:26 np0005604215.localdomain sudo[217358]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhosmhihfwjazbmyebmjspxttjgcowgq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937745.8573065-1639-31534513222257/AnsiballZ_file.py
Feb 01 09:22:26 np0005604215.localdomain sudo[217358]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:26 np0005604215.localdomain python3.9[217360]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:26 np0005604215.localdomain sudo[217358]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:26 np0005604215.localdomain sudo[217468]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gsdwbkzfpczyexgyqworxaskbptqkvnr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937746.4443583-1639-71731252874647/AnsiballZ_file.py
Feb 01 09:22:26 np0005604215.localdomain sudo[217468]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:26 np0005604215.localdomain python3.9[217470]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:26 np0005604215.localdomain sudo[217468]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:27 np0005604215.localdomain sudo[217578]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-abutmrqvpwddkkrphahlgmqmwrcwyxbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937747.030091-1639-102038473460394/AnsiballZ_file.py
Feb 01 09:22:27 np0005604215.localdomain sudo[217578]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:27 np0005604215.localdomain python3.9[217580]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:27 np0005604215.localdomain sudo[217578]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:27 np0005604215.localdomain sudo[217688]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nnqucdtxtsvrapilsyfemksbeikjoryh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937747.6411152-1639-103330976438575/AnsiballZ_file.py
Feb 01 09:22:27 np0005604215.localdomain sudo[217688]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:28 np0005604215.localdomain python3.9[217690]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:28 np0005604215.localdomain sudo[217688]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:28 np0005604215.localdomain sudo[217798]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lssobjeagbanxxscnhtdlyzfrheecjkq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937748.43521-1639-211056436898838/AnsiballZ_file.py
Feb 01 09:22:28 np0005604215.localdomain sudo[217798]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:28 np0005604215.localdomain python3.9[217800]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:28 np0005604215.localdomain sudo[217798]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:29 np0005604215.localdomain sudo[217908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjdimuozktiomfdasvpjkoztyrgxtgsf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937749.5938463-1639-129463939904339/AnsiballZ_file.py
Feb 01 09:22:29 np0005604215.localdomain sudo[217908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64378 DF PROTO=TCP SPT=57108 DPT=9882 SEQ=1145317472 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C0B5D0000000001030307) 
Feb 01 09:22:30 np0005604215.localdomain python3.9[217910]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:30 np0005604215.localdomain sudo[217908]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:30 np0005604215.localdomain sudo[218018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlknhegwarspqzccfnotungfqnwzkboj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937750.2206855-1639-130710778243580/AnsiballZ_file.py
Feb 01 09:22:30 np0005604215.localdomain sudo[218018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:30 np0005604215.localdomain python3.9[218020]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:22:30 np0005604215.localdomain sudo[218018]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64379 DF PROTO=TCP SPT=57108 DPT=9882 SEQ=1145317472 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C0F4D0000000001030307) 
Feb 01 09:22:32 np0005604215.localdomain sudo[218128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vprrmzxxwcutvoutgfxdwyvsehsruyxc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937751.1552427-1812-18892520927098/AnsiballZ_command.py
Feb 01 09:22:32 np0005604215.localdomain sudo[218128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:32 np0005604215.localdomain python3.9[218130]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:22:32 np0005604215.localdomain sudo[218128]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64380 DF PROTO=TCP SPT=57108 DPT=9882 SEQ=1145317472 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C174D0000000001030307) 
Feb 01 09:22:33 np0005604215.localdomain python3.9[218240]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 01 09:22:34 np0005604215.localdomain sudo[218348]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oxgdpxhxmttuiaznycyurmnrxvmdtqar ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937753.8854694-1866-279327091418968/AnsiballZ_systemd_service.py
Feb 01 09:22:34 np0005604215.localdomain sudo[218348]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:34 np0005604215.localdomain python3.9[218350]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:22:34 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:22:34 np0005604215.localdomain systemd-sysv-generator[218377]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:22:34 np0005604215.localdomain systemd-rc-local-generator[218372]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:22:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:22:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:34 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:22:34 np0005604215.localdomain sudo[218348]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:35 np0005604215.localdomain sudo[218494]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eiqctujfovvivofjrngraaoguimhsqoo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937755.2692306-1891-277034489640692/AnsiballZ_command.py
Feb 01 09:22:35 np0005604215.localdomain sudo[218494]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:35 np0005604215.localdomain python3.9[218496]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:22:35 np0005604215.localdomain sudo[218494]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:36 np0005604215.localdomain sudo[218605]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yjxisputekcxtkzajckjixxhnnbhbfan ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937755.8855724-1891-104283241173299/AnsiballZ_command.py
Feb 01 09:22:36 np0005604215.localdomain sudo[218605]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:36 np0005604215.localdomain python3.9[218607]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:22:36 np0005604215.localdomain sudo[218605]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39107 DF PROTO=TCP SPT=46486 DPT=9102 SEQ=2244556675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C250D0000000001030307) 
Feb 01 09:22:36 np0005604215.localdomain sudo[218716]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iupuepxnbxnqqqavhwwzxfmgwzbdnazj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937756.4550989-1891-93747165024414/AnsiballZ_command.py
Feb 01 09:22:36 np0005604215.localdomain sudo[218716]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:36 np0005604215.localdomain python3.9[218718]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:22:37 np0005604215.localdomain sudo[218716]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:37 np0005604215.localdomain sudo[218827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfkmetpgqrearbxzhvnuojbggprwmyas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937757.1135788-1891-260555348163160/AnsiballZ_command.py
Feb 01 09:22:37 np0005604215.localdomain sudo[218827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:37 np0005604215.localdomain python3.9[218829]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:22:37 np0005604215.localdomain sudo[218827]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:37 np0005604215.localdomain sudo[218938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clnjbjltfsvxtwtqabpqfzvbpntxekhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937757.6942337-1891-142471874852391/AnsiballZ_command.py
Feb 01 09:22:37 np0005604215.localdomain sudo[218938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:38 np0005604215.localdomain python3.9[218940]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:22:38 np0005604215.localdomain sudo[218938]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:38 np0005604215.localdomain sudo[219049]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zxsuerkkbfgirxzjhhohuowyasoesybu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937758.2942276-1891-181736242055722/AnsiballZ_command.py
Feb 01 09:22:38 np0005604215.localdomain sudo[219049]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:38 np0005604215.localdomain python3.9[219051]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:22:38 np0005604215.localdomain sudo[219049]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:39 np0005604215.localdomain sudo[219160]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnidcepnjndmxhfeaiqytvcomvvdzvcj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937758.9066799-1891-99522033160725/AnsiballZ_command.py
Feb 01 09:22:39 np0005604215.localdomain sudo[219160]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:39 np0005604215.localdomain python3.9[219162]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:22:39 np0005604215.localdomain sudo[219160]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4924 DF PROTO=TCP SPT=52242 DPT=9100 SEQ=366248767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C318D0000000001030307) 
Feb 01 09:22:39 np0005604215.localdomain sudo[219271]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhgwzeviutqnkadrkuwdvvbfxfxyjqbi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937759.5451922-1891-183986076506134/AnsiballZ_command.py
Feb 01 09:22:39 np0005604215.localdomain sudo[219271]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:39 np0005604215.localdomain python3.9[219273]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:22:41 np0005604215.localdomain sudo[219271]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:22:41.738 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:22:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:22:41.740 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:22:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:22:41.740 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:22:42 np0005604215.localdomain sudo[219382]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mxrgvjozgvrckgezjnvcvnqallixoidx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937762.3381417-2098-14137865082498/AnsiballZ_file.py
Feb 01 09:22:42 np0005604215.localdomain sudo[219382]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:42 np0005604215.localdomain python3.9[219384]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:22:42 np0005604215.localdomain sudo[219382]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43009 DF PROTO=TCP SPT=34352 DPT=9101 SEQ=4279428959 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C3E8D0000000001030307) 
Feb 01 09:22:43 np0005604215.localdomain sudo[219492]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gbfgcsvpcfzbhwkknfsgwkvvvumzzpsl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937762.991664-2098-92853245362259/AnsiballZ_file.py
Feb 01 09:22:43 np0005604215.localdomain sudo[219492]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:43 np0005604215.localdomain python3.9[219494]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:22:43 np0005604215.localdomain sudo[219492]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:44 np0005604215.localdomain sudo[219602]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-clafovbbzwedixhahlzptldzwkacckpx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937764.001801-2098-203913991970501/AnsiballZ_file.py
Feb 01 09:22:44 np0005604215.localdomain sudo[219602]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:44 np0005604215.localdomain python3.9[219604]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:22:44 np0005604215.localdomain sudo[219602]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:45 np0005604215.localdomain sudo[219712]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mifyhkpnsttosobsihrujhexzyknlsku ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937765.0039933-2163-75387932456226/AnsiballZ_file.py
Feb 01 09:22:45 np0005604215.localdomain sudo[219712]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:22:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64382 DF PROTO=TCP SPT=57108 DPT=9882 SEQ=1145317472 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C470D0000000001030307) 
Feb 01 09:22:45 np0005604215.localdomain podman[219715]: 2026-02-01 09:22:45.383384265 +0000 UTC m=+0.087424237 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 01 09:22:45 np0005604215.localdomain podman[219715]: 2026-02-01 09:22:45.449502367 +0000 UTC m=+0.153542359 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_controller)
Feb 01 09:22:45 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:22:45 np0005604215.localdomain python3.9[219714]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:22:45 np0005604215.localdomain sudo[219712]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:45 np0005604215.localdomain sudo[219848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wkuwmejjiyyjctvefenpjnvxqhijyxws ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937765.6098194-2163-264278327875191/AnsiballZ_file.py
Feb 01 09:22:45 np0005604215.localdomain sudo[219848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:46 np0005604215.localdomain python3.9[219850]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:22:46 np0005604215.localdomain sudo[219848]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:46 np0005604215.localdomain sudo[219958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qimhkzoezokudfqvvapmcvycmygkuqes ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937766.187972-2163-126655350813898/AnsiballZ_file.py
Feb 01 09:22:46 np0005604215.localdomain sudo[219958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:46 np0005604215.localdomain python3.9[219960]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:22:46 np0005604215.localdomain sudo[219958]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:47 np0005604215.localdomain sudo[220068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnhgcblihwfuiqsuhddhqqqanopqigjq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937766.790263-2163-161726472808512/AnsiballZ_file.py
Feb 01 09:22:47 np0005604215.localdomain sudo[220068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:47 np0005604215.localdomain python3.9[220070]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:22:47 np0005604215.localdomain sudo[220068]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:47 np0005604215.localdomain sudo[220178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-favvzwtxgjdrtlhyczcrkcppqtkmwhox ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937767.464052-2163-238366527904933/AnsiballZ_file.py
Feb 01 09:22:47 np0005604215.localdomain sudo[220178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:47 np0005604215.localdomain python3.9[220180]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:22:47 np0005604215.localdomain sudo[220178]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:48 np0005604215.localdomain sudo[220288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ttilzfdummugqhoruijzkcqifqdnwcba ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937768.1717975-2163-74273774707260/AnsiballZ_file.py
Feb 01 09:22:48 np0005604215.localdomain sudo[220288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:22:48 np0005604215.localdomain systemd[1]: tmp-crun.Ep6wEX.mount: Deactivated successfully.
Feb 01 09:22:48 np0005604215.localdomain podman[220291]: 2026-02-01 09:22:48.544409815 +0000 UTC m=+0.093731258 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3)
Feb 01 09:22:48 np0005604215.localdomain podman[220291]: 2026-02-01 09:22:48.549069926 +0000 UTC m=+0.098391399 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:22:48 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:22:48 np0005604215.localdomain python3.9[220290]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:22:48 np0005604215.localdomain sudo[220288]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39109 DF PROTO=TCP SPT=46486 DPT=9102 SEQ=2244556675 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C550D0000000001030307) 
Feb 01 09:22:49 np0005604215.localdomain sudo[220414]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cjuhsdmrkibjpezaptajtzezhgifpeva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937768.8268652-2163-60061981378635/AnsiballZ_file.py
Feb 01 09:22:49 np0005604215.localdomain sudo[220414]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:49 np0005604215.localdomain python3.9[220416]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:22:49 np0005604215.localdomain sudo[220414]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:51 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=4926 DF PROTO=TCP SPT=52242 DPT=9100 SEQ=366248767 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C610D0000000001030307) 
Feb 01 09:22:54 np0005604215.localdomain sshd[220434]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:22:54 np0005604215.localdomain sshd[220434]: Invalid user tom from 85.206.171.113 port 35436
Feb 01 09:22:54 np0005604215.localdomain sshd[220434]: Received disconnect from 85.206.171.113 port 35436:11: Bye Bye [preauth]
Feb 01 09:22:54 np0005604215.localdomain sshd[220434]: Disconnected from invalid user tom 85.206.171.113 port 35436 [preauth]
Feb 01 09:22:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43011 DF PROTO=TCP SPT=34352 DPT=9101 SEQ=4279428959 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C6F0E0000000001030307) 
Feb 01 09:22:56 np0005604215.localdomain sudo[220526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xjyrgxquhceftxzkjihoummkhynaxeru ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937775.7150178-2488-252384199472/AnsiballZ_getent.py
Feb 01 09:22:56 np0005604215.localdomain sudo[220526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:56 np0005604215.localdomain python3.9[220528]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 01 09:22:56 np0005604215.localdomain sudo[220526]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:57 np0005604215.localdomain sudo[220637]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kliklziolnsxdkipowyccdosiywpvkcx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937777.0104713-2513-196546397713899/AnsiballZ_group.py
Feb 01 09:22:57 np0005604215.localdomain sudo[220637]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:57 np0005604215.localdomain python3.9[220639]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 01 09:22:57 np0005604215.localdomain groupadd[220640]: group added to /etc/group: name=nova, GID=42436
Feb 01 09:22:57 np0005604215.localdomain groupadd[220640]: group added to /etc/gshadow: name=nova
Feb 01 09:22:57 np0005604215.localdomain groupadd[220640]: new group: name=nova, GID=42436
Feb 01 09:22:57 np0005604215.localdomain sudo[220637]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:58 np0005604215.localdomain sudo[220753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbfsiobijafqhgcafokcqhcwcobegiib ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937777.971566-2536-203240874684012/AnsiballZ_user.py
Feb 01 09:22:58 np0005604215.localdomain sudo[220753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:22:58 np0005604215.localdomain python3.9[220755]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604215.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 01 09:22:58 np0005604215.localdomain useradd[220757]: new user: name=nova, UID=42436, GID=42436, home=/home/nova, shell=/bin/sh, from=/dev/pts/0
Feb 01 09:22:58 np0005604215.localdomain useradd[220757]: add 'nova' to group 'libvirt'
Feb 01 09:22:58 np0005604215.localdomain useradd[220757]: add 'nova' to shadow group 'libvirt'
Feb 01 09:22:58 np0005604215.localdomain sudo[220753]: pam_unix(sudo:session): session closed for user root
Feb 01 09:22:59 np0005604215.localdomain sshd[220781]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:22:59 np0005604215.localdomain sshd[220781]: Accepted publickey for zuul from 192.168.122.30 port 56342 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:22:59 np0005604215.localdomain systemd-logind[761]: New session 54 of user zuul.
Feb 01 09:22:59 np0005604215.localdomain systemd[1]: Started Session 54 of User zuul.
Feb 01 09:22:59 np0005604215.localdomain sshd[220781]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:23:00 np0005604215.localdomain sshd[220784]: Received disconnect from 192.168.122.30 port 56342:11: disconnected by user
Feb 01 09:23:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43881 DF PROTO=TCP SPT=45998 DPT=9882 SEQ=1733190088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C808C0000000001030307) 
Feb 01 09:23:00 np0005604215.localdomain sshd[220784]: Disconnected from user zuul 192.168.122.30 port 56342
Feb 01 09:23:00 np0005604215.localdomain sshd[220781]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:23:00 np0005604215.localdomain systemd[1]: session-54.scope: Deactivated successfully.
Feb 01 09:23:00 np0005604215.localdomain systemd-logind[761]: Session 54 logged out. Waiting for processes to exit.
Feb 01 09:23:00 np0005604215.localdomain systemd-logind[761]: Removed session 54.
Feb 01 09:23:00 np0005604215.localdomain python3.9[220892]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:23:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43882 DF PROTO=TCP SPT=45998 DPT=9882 SEQ=1733190088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C848D0000000001030307) 
Feb 01 09:23:01 np0005604215.localdomain python3.9[220978]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937780.2323384-2612-110943075201031/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:23:02 np0005604215.localdomain python3.9[221086]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:23:02 np0005604215.localdomain python3.9[221141]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:23:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43883 DF PROTO=TCP SPT=45998 DPT=9882 SEQ=1733190088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C8C8E0000000001030307) 
Feb 01 09:23:03 np0005604215.localdomain python3.9[221249]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:23:03 np0005604215.localdomain python3.9[221335]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937782.6842198-2612-254015808744796/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:23:04 np0005604215.localdomain python3.9[221443]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:23:04 np0005604215.localdomain python3.9[221529]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937783.8872705-2612-113418659411933/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=f97201355591685d5a25f9693d35e9cd6d9ded96 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:23:05 np0005604215.localdomain python3.9[221637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:23:06 np0005604215.localdomain python3.9[221723]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937785.0461545-2612-200071512002209/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:23:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14505 DF PROTO=TCP SPT=35086 DPT=9102 SEQ=155959959 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65C9A0E0000000001030307) 
Feb 01 09:23:07 np0005604215.localdomain python3.9[221831]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:23:08 np0005604215.localdomain python3.9[221917]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937786.166917-2612-236257260370081/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:23:09 np0005604215.localdomain sudo[222025]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zrfpgyghbqwfpuetlleqlskrarormgfn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937788.5925386-2860-274263132997223/AnsiballZ_file.py
Feb 01 09:23:09 np0005604215.localdomain sudo[222025]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63998 DF PROTO=TCP SPT=56596 DPT=9100 SEQ=596335886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65CA68E0000000001030307) 
Feb 01 09:23:09 np0005604215.localdomain python3.9[222027]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:23:09 np0005604215.localdomain sudo[222025]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:10 np0005604215.localdomain sudo[222135]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qsglwphlqpovmdizhbxrfayuncfrjank ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937790.0956202-2885-63964437366256/AnsiballZ_copy.py
Feb 01 09:23:10 np0005604215.localdomain sudo[222135]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:10 np0005604215.localdomain python3.9[222137]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:23:10 np0005604215.localdomain sudo[222135]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:11 np0005604215.localdomain sudo[222245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vsqnqeujaadojwhhorimqdixabysnonj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937790.8058624-2909-135888951754243/AnsiballZ_stat.py
Feb 01 09:23:11 np0005604215.localdomain sudo[222245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:11 np0005604215.localdomain python3.9[222247]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:23:11 np0005604215.localdomain sudo[222245]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27010 DF PROTO=TCP SPT=47044 DPT=9101 SEQ=4075716098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65CB3CE0000000001030307) 
Feb 01 09:23:13 np0005604215.localdomain sudo[222357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcteakmysrhcbguosknkdtbfpvzvfmgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937793.4635627-2936-256986208510242/AnsiballZ_file.py
Feb 01 09:23:13 np0005604215.localdomain sudo[222357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:13 np0005604215.localdomain python3.9[222359]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:23:13 np0005604215.localdomain sudo[222357]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:14 np0005604215.localdomain python3.9[222467]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:23:15 np0005604215.localdomain python3.9[222577]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:23:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43885 DF PROTO=TCP SPT=45998 DPT=9882 SEQ=1733190088 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65CBD0E0000000001030307) 
Feb 01 09:23:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:23:15 np0005604215.localdomain podman[222664]: 2026-02-01 09:23:15.871925021 +0000 UTC m=+0.087577091 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:23:15 np0005604215.localdomain sudo[222677]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:23:15 np0005604215.localdomain sudo[222677]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:23:15 np0005604215.localdomain sudo[222677]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:15 np0005604215.localdomain python3.9[222663]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937794.988809-2987-235251796245736/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=aff5546b44cf4461a7541a94e4cce1332c9b58b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:23:15 np0005604215.localdomain podman[222664]: 2026-02-01 09:23:15.957682495 +0000 UTC m=+0.173334605 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:23:15 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:23:16 np0005604215.localdomain sudo[222707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:23:16 np0005604215.localdomain sudo[222707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:23:16 np0005604215.localdomain python3.9[222846]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:23:16 np0005604215.localdomain sudo[222707]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:17 np0005604215.localdomain python3.9[222949]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937796.157896-3033-72625758565637/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:23:17 np0005604215.localdomain sudo[223021]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:23:17 np0005604215.localdomain sudo[223021]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:23:17 np0005604215.localdomain sudo[223021]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:18 np0005604215.localdomain sudo[223075]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ldrcnfmeerfskannmnxwylyoecdvehjm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937797.6227376-3083-17877225421342/AnsiballZ_container_config_data.py
Feb 01 09:23:18 np0005604215.localdomain sudo[223075]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:18 np0005604215.localdomain python3.9[223077]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Feb 01 09:23:18 np0005604215.localdomain sudo[223075]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:23:18 np0005604215.localdomain systemd[1]: tmp-crun.qtk6FC.mount: Deactivated successfully.
Feb 01 09:23:18 np0005604215.localdomain podman[223111]: 2026-02-01 09:23:18.87447632 +0000 UTC m=+0.083933560 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:23:18 np0005604215.localdomain podman[223111]: 2026-02-01 09:23:18.908785785 +0000 UTC m=+0.118243015 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:23:18 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:23:19 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=14507 DF PROTO=TCP SPT=35086 DPT=9102 SEQ=155959959 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65CCB0E0000000001030307) 
Feb 01 09:23:19 np0005604215.localdomain sudo[223203]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rsrxpmxmohxlatvxrzbfybzsfcmmtadc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937798.7766244-3116-259543746042436/AnsiballZ_container_config_hash.py
Feb 01 09:23:19 np0005604215.localdomain sudo[223203]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:19 np0005604215.localdomain python3.9[223205]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:23:19 np0005604215.localdomain sudo[223203]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:20 np0005604215.localdomain sudo[223313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-armbhtlhtknlxjisjanjqyjkfmmbenbq ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937799.9527185-3147-23569975595548/AnsiballZ_edpm_container_manage.py
Feb 01 09:23:20 np0005604215.localdomain sudo[223313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:20 np0005604215.localdomain python3[223315]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:23:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64000 DF PROTO=TCP SPT=56596 DPT=9100 SEQ=596335886 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65CD70E0000000001030307) 
Feb 01 09:23:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27012 DF PROTO=TCP SPT=47044 DPT=9101 SEQ=4075716098 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65CE30E0000000001030307) 
Feb 01 09:23:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18676 DF PROTO=TCP SPT=42360 DPT=9882 SEQ=830503638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65CF5BD0000000001030307) 
Feb 01 09:23:30 np0005604215.localdomain podman[223330]: 2026-02-01 09:23:20.797914777 +0000 UTC m=+0.044097316 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 01 09:23:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18677 DF PROTO=TCP SPT=42360 DPT=9882 SEQ=830503638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65CF9CD0000000001030307) 
Feb 01 09:23:31 np0005604215.localdomain podman[223398]: 
Feb 01 09:23:31 np0005604215.localdomain podman[223398]: 2026-02-01 09:23:31.20312572 +0000 UTC m=+0.081439333 container create 01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute_init, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible)
Feb 01 09:23:31 np0005604215.localdomain podman[223398]: 2026-02-01 09:23:31.168472054 +0000 UTC m=+0.046785717 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 01 09:23:31 np0005604215.localdomain python3[223315]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Feb 01 09:23:31 np0005604215.localdomain sudo[223313]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:32 np0005604215.localdomain sudo[223543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msrkmbsecqsjqcwjzvykpucumdagkjxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937811.7153313-3170-65772955220756/AnsiballZ_stat.py
Feb 01 09:23:32 np0005604215.localdomain sudo[223543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:32 np0005604215.localdomain python3.9[223545]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:23:32 np0005604215.localdomain sudo[223543]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18678 DF PROTO=TCP SPT=42360 DPT=9882 SEQ=830503638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D01CD0000000001030307) 
Feb 01 09:23:33 np0005604215.localdomain sudo[223655]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqjdzfrmlkguroctftgzjtpetsrvpvah ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937813.2330654-3206-92584656062333/AnsiballZ_container_config_data.py
Feb 01 09:23:33 np0005604215.localdomain sudo[223655]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:33 np0005604215.localdomain python3.9[223657]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Feb 01 09:23:33 np0005604215.localdomain sudo[223655]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:34 np0005604215.localdomain sudo[223765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-saonubisdmaiikwyvtishonfcghcyydd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937814.2400188-3239-78650548964281/AnsiballZ_container_config_hash.py
Feb 01 09:23:34 np0005604215.localdomain sudo[223765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:34 np0005604215.localdomain python3.9[223767]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:23:34 np0005604215.localdomain sudo[223765]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:35 np0005604215.localdomain sudo[223875]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szgespbhnyrydinrkbunqxaloueurugf ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937815.172083-3269-149720889450696/AnsiballZ_edpm_container_manage.py
Feb 01 09:23:35 np0005604215.localdomain sudo[223875]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:35 np0005604215.localdomain python3[223877]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:23:36 np0005604215.localdomain python3[223877]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",
                                                                    "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:31:38.534497001Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1214548351,
                                                                    "VirtualSize": 1214548351,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",
                                                                              "sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",
                                                                              "sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:12:56.089921987Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:34.524252589Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:37.262239859Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:39.234075496Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:42.686286019Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:54.133364958Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:10.283411186Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:19.407054412Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:42.656365894Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:37.451289936Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.151652427Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532191009Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532298572Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:44.609081717Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 01 09:23:36 np0005604215.localdomain podman[223928]: 2026-02-01 09:23:36.11811738 +0000 UTC m=+0.092786600 container remove 1543f157ce4423d7729f0ac3a9fa2d7a0f71c1b7ad555c21cc9ce4ed5abc095e (image=registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1, name=nova_compute, config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'LIBGUESTFS_BACKEND': 'direct', 'TRIPLEO_CONFIG_HASH': '848fbaed99314033c0982eb0cffd8af7-1296029e90a465a2201c8dc6f8be17e7'}, 'healthcheck': {'test': '/openstack/healthcheck 5672'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-nova-compute:17.1', 'ipc': 'host', 'net': 'host', 'privileged': True, 'restart': 'always', 'start_order': 3, 'ulimit': ['nofile=131072', 'memlock=67108864'], 'user': 'nova', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/log/containers/nova:/var/log/nova', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro', '/var/lib/kolla/config_files/nova_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/nova_libvirt:/var/lib/kolla/config_files/src:ro', '/var/lib/config-data/puppet-generated/iscsid/etc/iscsi:/var/lib/kolla/config_files/src-iscsid:ro', '/var/lib/tripleo-config/ceph:/var/lib/kolla/config_files/src-ceph:z', '/dev:/dev', '/lib/modules:/lib/modules:ro', '/run:/run', '/run/nova:/run/nova:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/libvirt:/var/lib/libvirt:shared', '/sys/class/net:/sys/class/net', '/sys/bus/pci:/sys/bus/pci', '/boot:/boot:ro', '/var/lib/nova:/var/lib/nova:shared']}, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=nova_compute, maintainer=OpenStack TripleO Team, vendor=Red Hat, Inc., io.k8s.description=Red Hat OpenStack Platform 17.1 nova-compute, summary=Red Hat OpenStack Platform 17.1 nova-compute, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-nova-compute, distribution-scope=public, release=1766032510, url=https://www.redhat.com, description=Red Hat OpenStack Platform 17.1 nova-compute, name=rhosp-rhel9/openstack-nova-compute, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, com.redhat.component=openstack-nova-compute-container, org.opencontainers.image.revision=fb12bb7868525ee01e6b8233454fd15b039a7ffe, io.openshift.expose-services=, managed_by=tripleo_ansible, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, config_id=tripleo_step5, io.buildah.version=1.41.5, tcib_managed=true, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2026-01-12T23:32:04Z, version=17.1.13, io.k8s.display-name=Red Hat OpenStack Platform 17.1 nova-compute, build-date=2026-01-12T23:32:04Z, batch=17.1_20260112.1, vcs-ref=fb12bb7868525ee01e6b8233454fd15b039a7ffe)
Feb 01 09:23:36 np0005604215.localdomain python3[223877]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force nova_compute
Feb 01 09:23:36 np0005604215.localdomain podman[223941]: 
Feb 01 09:23:36 np0005604215.localdomain podman[223941]: 2026-02-01 09:23:36.218554581 +0000 UTC m=+0.083474995 container create 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, container_name=nova_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 01 09:23:36 np0005604215.localdomain podman[223941]: 2026-02-01 09:23:36.179754739 +0000 UTC m=+0.044675203 image pull  quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 01 09:23:36 np0005604215.localdomain python3[223877]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath --volume /etc/multipath.conf:/etc/multipath.conf:ro,Z --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume 
/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Feb 01 09:23:36 np0005604215.localdomain sudo[223875]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48804 DF PROTO=TCP SPT=55492 DPT=9102 SEQ=3823228251 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D0F4D0000000001030307) 
Feb 01 09:23:36 np0005604215.localdomain sudo[224086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlvbhljmoejfixubjmecoenvrhwdvgbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937816.636462-3292-149214022085644/AnsiballZ_stat.py
Feb 01 09:23:36 np0005604215.localdomain sudo[224086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:37 np0005604215.localdomain python3.9[224088]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:23:37 np0005604215.localdomain sudo[224086]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:37 np0005604215.localdomain sudo[224198]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xypgnxhlqvbjtvzoputrarqzafmvgwqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937817.4558518-3319-164700577221948/AnsiballZ_file.py
Feb 01 09:23:37 np0005604215.localdomain sudo[224198]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:37 np0005604215.localdomain python3.9[224200]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:23:37 np0005604215.localdomain sudo[224198]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:38 np0005604215.localdomain sudo[224307]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tqxsaycyufsskoprukcehqcuifoquoko ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937817.9893043-3319-234522244925028/AnsiballZ_copy.py
Feb 01 09:23:38 np0005604215.localdomain sudo[224307]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:38 np0005604215.localdomain python3.9[224309]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769937817.9893043-3319-234522244925028/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:23:38 np0005604215.localdomain sudo[224307]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:38 np0005604215.localdomain sudo[224362]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rhysvpxbxswvzdqhyxptsitzhmjfxruc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937817.9893043-3319-234522244925028/AnsiballZ_systemd.py
Feb 01 09:23:38 np0005604215.localdomain sudo[224362]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:39 np0005604215.localdomain python3.9[224364]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:23:39 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:23:39 np0005604215.localdomain systemd-rc-local-generator[224389]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:23:39 np0005604215.localdomain systemd-sysv-generator[224392]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:23:39 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:39 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:39 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:39 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:39 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:23:39 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:39 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:39 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:39 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:39 np0005604215.localdomain sudo[224362]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63622 DF PROTO=TCP SPT=52826 DPT=9100 SEQ=1783760803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D1BCE0000000001030307) 
Feb 01 09:23:39 np0005604215.localdomain sudo[224453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sebfxibubzylujhvxtecbdacmzcvxext ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937817.9893043-3319-234522244925028/AnsiballZ_systemd.py
Feb 01 09:23:39 np0005604215.localdomain sudo[224453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:40 np0005604215.localdomain python3.9[224455]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:23:40 np0005604215.localdomain systemd-rc-local-generator[224480]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:23:40 np0005604215.localdomain systemd-sysv-generator[224486]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: Starting nova_compute container...
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:23:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:40 np0005604215.localdomain podman[224496]: 2026-02-01 09:23:40.639497973 +0000 UTC m=+0.126666953 container init 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=nova_compute, org.label-schema.build-date=20260127, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:23:40 np0005604215.localdomain podman[224496]: 2026-02-01 09:23:40.649180898 +0000 UTC m=+0.136349878 container start 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=nova_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 01 09:23:40 np0005604215.localdomain podman[224496]: nova_compute
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: + sudo -E kolla_set_configs
Feb 01 09:23:40 np0005604215.localdomain systemd[1]: Started nova_compute container.
Feb 01 09:23:40 np0005604215.localdomain sudo[224453]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Validating config file
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying service configuration files
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Deleting /etc/ceph
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Creating directory /etc/ceph
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /etc/ceph
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Writing out command to execute
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: ++ cat /run_command
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: + CMD=nova-compute
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: + ARGS=
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: + sudo kolla_copy_cacerts
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: + [[ ! -n '' ]]
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: + . kolla_extend_start
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: Running command: 'nova-compute'
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: + echo 'Running command: '\''nova-compute'\'''
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: + umask 0022
Feb 01 09:23:40 np0005604215.localdomain nova_compute[224510]: + exec nova-compute
Feb 01 09:23:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:23:41.739 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:23:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:23:41.739 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:23:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:23:41.740 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:23:42 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:42.445 224514 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 01 09:23:42 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:42.445 224514 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 01 09:23:42 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:42.445 224514 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 01 09:23:42 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:42.445 224514 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 01 09:23:42 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:42.560 224514 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:23:42 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:42.582 224514 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.022s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:23:42 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:42.582 224514 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 01 09:23:42 np0005604215.localdomain python3.9[224634]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.013 224514 INFO nova.virt.driver [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.128 224514 INFO nova.compute.provider_config [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.135 224514 WARNING nova.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.136 224514 DEBUG oslo_concurrency.lockutils [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.136 224514 DEBUG oslo_concurrency.lockutils [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.136 224514 DEBUG oslo_concurrency.lockutils [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.136 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.136 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.137 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.137 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.137 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.137 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.137 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.137 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.137 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.137 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.138 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.138 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.138 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.138 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.138 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.138 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.138 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.139 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.139 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.139 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] console_host                   = np0005604215.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.139 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.139 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.139 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.139 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.139 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.140 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.140 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.140 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.140 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.140 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.140 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.140 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.141 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.141 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.141 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.141 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.141 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.141 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.141 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] host                           = np0005604215.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.142 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.142 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.142 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.142 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.142 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.142 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.142 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.143 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.143 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.143 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.143 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.143 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.143 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.143 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.143 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.144 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.144 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.144 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.144 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.144 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.144 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.144 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.145 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.145 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.145 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.145 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.145 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.145 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.145 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.145 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.145 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.146 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.146 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.146 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.146 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.146 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.146 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.146 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.147 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.147 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.147 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.147 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] my_block_storage_ip            = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.147 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] my_ip                          = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.147 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.147 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.147 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.148 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.148 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.148 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.148 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.148 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.148 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.148 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.149 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.149 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.149 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.149 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.149 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.149 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.149 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.149 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.150 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.150 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.150 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.150 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.150 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.150 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.150 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.150 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.151 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.151 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.151 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.151 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.151 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.151 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.151 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.151 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.152 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.152 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.152 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.152 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.152 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.152 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.152 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.153 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.153 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.153 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.153 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.153 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.153 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.153 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.153 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.154 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.154 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.154 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63466 DF PROTO=TCP SPT=52822 DPT=9101 SEQ=2959385660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D290D0000000001030307) 
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.154 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.154 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.154 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.154 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.155 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.155 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.155 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.155 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.155 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.155 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.155 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.155 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.156 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.156 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.156 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.156 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.156 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.156 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.156 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.157 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.157 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.157 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.157 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.157 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.157 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.157 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.158 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.158 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.158 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.158 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.158 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.158 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.158 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.159 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.159 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.159 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.159 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.159 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.159 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.159 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.159 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.160 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.160 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.160 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.160 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.160 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.160 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.160 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.161 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.161 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.161 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.161 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.161 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.161 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.161 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.161 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.162 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.162 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.162 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.162 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.162 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.162 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.162 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.163 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.163 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.163 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.163 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.163 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.163 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.163 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.164 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.164 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.164 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.164 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.164 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.164 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.164 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.165 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.165 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.165 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.165 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.165 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.165 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.165 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.165 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.166 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.166 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.166 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.166 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.166 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.166 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.166 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.167 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.167 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.167 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.167 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.167 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.167 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.167 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.168 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.168 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.168 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.168 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.168 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.168 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.169 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.169 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.169 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.169 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.169 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.169 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.170 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.170 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.170 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.170 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.170 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.170 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.170 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.170 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.171 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.171 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.171 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.171 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.171 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.171 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.171 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.172 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.172 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.172 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.172 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.172 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.173 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.173 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.173 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.173 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.173 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.173 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.173 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.174 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.174 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.174 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.174 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.174 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.174 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.174 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.174 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.175 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.175 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.175 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.175 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.175 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.175 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.175 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.176 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.176 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.176 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.176 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.176 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.176 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.176 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.177 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.177 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.177 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.177 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.177 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.177 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.177 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.178 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.178 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.178 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.178 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.178 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.178 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.178 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.179 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.179 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.179 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.179 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.179 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.179 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.180 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.180 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.180 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.180 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.180 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.180 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.180 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.181 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.181 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.181 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.181 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.181 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.181 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.181 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.182 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.182 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.182 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.182 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.182 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.182 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.182 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.182 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.183 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.183 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.183 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.183 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.183 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.183 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.183 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.184 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.184 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.184 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.184 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.184 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.184 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.184 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.185 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.185 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.185 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.185 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.185 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.185 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.185 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.186 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.186 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.186 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.186 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.186 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.186 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.186 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.187 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.187 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.187 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.187 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.187 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.187 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.187 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.188 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.188 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.188 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.188 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.188 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.188 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.188 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.188 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.189 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.189 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.189 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.189 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.189 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.189 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.189 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.190 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.190 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.190 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.190 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.190 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.190 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.190 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.191 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.191 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.191 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.191 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.191 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.191 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.191 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.192 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.192 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.192 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.192 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.192 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.192 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.192 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.192 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.193 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.193 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.193 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.193 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.193 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.193 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.193 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.193 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.194 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.194 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.194 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.194 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.194 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.194 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.194 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.195 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.195 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.195 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.195 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.195 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.195 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.195 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.195 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.196 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.196 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.196 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.196 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.196 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.196 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.196 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.197 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.197 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.197 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.197 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.197 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.197 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.197 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.197 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.198 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.198 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.198 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.198 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.198 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.198 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.198 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.199 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.199 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.199 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.199 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.199 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.199 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.199 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.200 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.200 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.200 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.200 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.200 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.200 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.200 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.200 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.201 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.201 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.201 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.201 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.201 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.201 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.201 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.202 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.202 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.202 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.202 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.202 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.202 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.202 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.202 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.203 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.203 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.203 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.203 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.203 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.203 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.203 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.204 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.204 224514 WARNING oslo_config.cfg [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: and ``live_migration_inbound_addr`` respectively.
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: ).  Its value may be silently ignored in the future.
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.204 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.204 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.204 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.204 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.205 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.205 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.205 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.205 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.205 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.205 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.205 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.206 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.206 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.206 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.206 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.206 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.206 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.206 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.207 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.rbd_secret_uuid        = 33fac0b9-80c7-560f-918a-c92d3021ca1e log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.207 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.207 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.207 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.207 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.207 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.207 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.207 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.208 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.208 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.208 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.208 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.208 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.208 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.208 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.209 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.209 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.209 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.209 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.209 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.209 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.209 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.210 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.210 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.210 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.210 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.210 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.210 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.210 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.211 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.211 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.211 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.211 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.211 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.211 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.211 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.212 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.212 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.212 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.212 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.212 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.212 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.212 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.213 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.213 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.213 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.213 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.213 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.213 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.213 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.213 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.214 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.214 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.214 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.214 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.214 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.214 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.214 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.215 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.215 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.215 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.215 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.215 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.215 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.215 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.215 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.216 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.216 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.216 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.216 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.216 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.216 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.216 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.217 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.217 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.217 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.217 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.217 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.217 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.217 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.218 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.218 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.218 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.218 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.218 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.218 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.218 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.218 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.219 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.219 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.219 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.219 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.219 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.219 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.219 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.219 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.220 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.220 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.220 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.220 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.220 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.220 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.220 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.221 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.221 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.221 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.221 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.221 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.221 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.221 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.221 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.222 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.222 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.222 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.222 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.222 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.222 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.222 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.223 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.223 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.223 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.223 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.223 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.223 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.224 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.224 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.224 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.224 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.224 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.224 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.224 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.224 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.225 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.225 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.225 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.225 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.225 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.225 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.225 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.226 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.226 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.226 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.226 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.226 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.226 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.226 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.227 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.227 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.227 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.227 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.227 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.227 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.227 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.227 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.228 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.228 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.228 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.228 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.228 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.228 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.228 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.229 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.229 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.229 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.229 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.229 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.229 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.229 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.230 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.230 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.230 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.230 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.230 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.230 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.230 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.231 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.231 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.231 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.231 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.231 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.231 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.232 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.232 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.232 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.232 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.232 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.232 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.232 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.232 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.233 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.233 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.233 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.233 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.233 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.233 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.233 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.233 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.234 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.234 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.234 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.234 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.234 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.234 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.234 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.235 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.235 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.235 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.235 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.235 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.235 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.235 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.235 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.236 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.236 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.236 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.236 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.236 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.236 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.236 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.237 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.237 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.237 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.237 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.237 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.237 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.237 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.237 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.238 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.238 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.238 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.238 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.238 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.238 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.239 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.239 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.239 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.239 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.239 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.239 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.239 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.240 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.240 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.240 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.240 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.240 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.240 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.240 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.241 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.241 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.241 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.241 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.241 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.241 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.241 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.242 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.242 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.242 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.242 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.242 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.242 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.242 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.242 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.243 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.243 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.243 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.243 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.243 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.243 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.243 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.244 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.244 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.244 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.244 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.244 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.244 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.244 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.245 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.245 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.245 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.245 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.245 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.245 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.245 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.245 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.246 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.246 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.246 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.246 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.246 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.246 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.246 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.247 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.247 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.247 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.247 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.247 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.247 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.247 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.247 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.248 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.248 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.248 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.248 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.248 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.248 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.249 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.249 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.249 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.249 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.250 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.250 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.250 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.251 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.251 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.251 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.251 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.251 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.251 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.251 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.251 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.252 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.252 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.252 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.252 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.252 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.252 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.252 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.253 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.253 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.253 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.253 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.253 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.253 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.253 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.253 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.254 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.254 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.254 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.254 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.254 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.254 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.254 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.255 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.255 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.255 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.255 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.255 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.255 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.255 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.256 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.256 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.256 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.256 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.256 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.256 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.256 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.256 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.257 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.257 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.257 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.257 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.257 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.257 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.257 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.258 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.258 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.258 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.258 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.258 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.258 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.258 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.258 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.259 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.259 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.259 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.259 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.259 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.259 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.259 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.260 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.260 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.260 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.260 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.260 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.260 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.260 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.260 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.261 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.261 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.261 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.261 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.261 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.261 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.261 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.262 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.262 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.262 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.262 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.262 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.262 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.262 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.263 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.263 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.263 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.263 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.263 224514 DEBUG oslo_service.service [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.264 224514 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.284 224514 INFO nova.virt.node [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Determined node identity d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from /var/lib/nova/compute_id
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.284 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.284 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.285 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.285 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 01 09:23:43 np0005604215.localdomain systemd[1]: Started libvirt QEMU daemon.
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.355 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f7418dd4d00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.359 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f7418dd4d00> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.360 224514 INFO nova.virt.libvirt.driver [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Connection event '1' reason 'None'
Feb 01 09:23:43 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:43.372 224514 DEBUG nova.virt.libvirt.volume.mount [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 01 09:23:43 np0005604215.localdomain python3.9[224795]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.286 224514 INFO nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Libvirt host capabilities <capabilities>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <host>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <uuid>b72fb799-3472-4728-b6e2-ec98d2bbb61b</uuid>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <cpu>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <arch>x86_64</arch>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model>EPYC-Rome-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <vendor>AMD</vendor>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <microcode version='16777317'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <signature family='23' model='49' stepping='0'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='x2apic'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='tsc-deadline'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='osxsave'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='hypervisor'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='tsc_adjust'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='spec-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='stibp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='arch-capabilities'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='cmp_legacy'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='topoext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='virt-ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='lbrv'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='tsc-scale'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='vmcb-clean'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='pause-filter'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='pfthreshold'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='svme-addr-chk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='rdctl-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='skip-l1dfl-vmentry'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='mds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature name='pschange-mc-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <pages unit='KiB' size='4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <pages unit='KiB' size='2048'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <pages unit='KiB' size='1048576'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </cpu>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <power_management>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <suspend_mem/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <suspend_disk/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <suspend_hybrid/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </power_management>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <iommu support='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <migration_features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <live/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <uri_transports>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <uri_transport>tcp</uri_transport>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <uri_transport>rdma</uri_transport>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </uri_transports>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </migration_features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <topology>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <cells num='1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <cell id='0'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:           <memory unit='KiB'>16116604</memory>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:           <pages unit='KiB' size='4'>4029151</pages>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:           <pages unit='KiB' size='2048'>0</pages>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:           <distances>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:             <sibling id='0' value='10'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:           </distances>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:           <cpus num='8'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:           </cpus>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         </cell>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </cells>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </topology>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <cache>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </cache>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <secmodel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model>selinux</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <doi>0</doi>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </secmodel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <secmodel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model>dac</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <doi>0</doi>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </secmodel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </host>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <guest>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <os_type>hvm</os_type>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <arch name='i686'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <wordsize>32</wordsize>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <domain type='qemu'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <domain type='kvm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </arch>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <pae/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <nonpae/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <acpi default='on' toggle='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <apic default='on' toggle='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <cpuselection/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <deviceboot/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <disksnapshot default='on' toggle='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <externalSnapshot/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </guest>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <guest>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <os_type>hvm</os_type>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <arch name='x86_64'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <wordsize>64</wordsize>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <domain type='qemu'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <domain type='kvm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </arch>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <acpi default='on' toggle='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <apic default='on' toggle='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <cpuselection/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <deviceboot/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <disksnapshot default='on' toggle='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <externalSnapshot/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </guest>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: </capabilities>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.296 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.317 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: <domainCapabilities>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <domain>kvm</domain>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <arch>i686</arch>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <vcpu max='240'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <iothreads supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <os supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <enum name='firmware'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <loader supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>rom</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pflash</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='readonly'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>yes</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>no</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='secure'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>no</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </loader>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </os>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <cpu>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>on</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>off</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='maximum' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='maximumMigratable'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>on</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>off</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='host-model' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <vendor>AMD</vendor>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='x2apic'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='stibp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='succor'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='lbrv'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='custom' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='ClearwaterForest'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ddpd-u'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sha512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm3'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ddpd-u'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sha512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm3'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Dhyana-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Turin'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbpb'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbpb'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-128'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-256'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-128'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-256'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='KnightsMill'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512er'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512pf'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='KnightsMill-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512er'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512pf'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tbm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tbm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='athlon'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='athlon-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='core2duo'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='core2duo-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='coreduo'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='coreduo-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='n270'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='n270-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='phenom'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='phenom-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </cpu>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <memoryBacking supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <enum name='sourceType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>file</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>anonymous</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>memfd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </memoryBacking>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <devices>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <disk supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='diskDevice'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>disk</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>cdrom</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>floppy</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>lun</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='bus'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>ide</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>fdc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>scsi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>sata</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-non-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </disk>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <graphics supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vnc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>egl-headless</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dbus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </graphics>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <video supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='modelType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vga</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>cirrus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>none</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>bochs</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>ramfb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </video>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <hostdev supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='mode'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>subsystem</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='startupPolicy'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>default</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>mandatory</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>requisite</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>optional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='subsysType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pci</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>scsi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='capsType'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='pciBackend'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </hostdev>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <rng supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-non-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>random</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>egd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>builtin</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </rng>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <filesystem supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='driverType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>path</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>handle</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtiofs</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </filesystem>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <tpm supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tpm-tis</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tpm-crb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>emulator</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>external</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendVersion'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>2.0</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </tpm>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <redirdev supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='bus'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </redirdev>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <channel supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pty</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>unix</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </channel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <crypto supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>qemu</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>builtin</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </crypto>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <interface supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>default</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>passt</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </interface>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <panic supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>isa</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>hyperv</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </panic>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <console supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>null</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pty</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dev</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>file</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pipe</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>stdio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>udp</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tcp</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>unix</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>qemu-vdagent</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dbus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </console>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </devices>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <gic supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <vmcoreinfo supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <genid supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <backingStoreInput supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <backup supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <async-teardown supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <s390-pv supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <ps2 supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <tdx supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <sev supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <sgx supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <hyperv supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='features'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>relaxed</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vapic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>spinlocks</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vpindex</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>runtime</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>synic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>stimer</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>reset</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vendor_id</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>frequencies</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>reenlightenment</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tlbflush</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>ipi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>avic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>emsr_bitmap</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>xmm_input</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <defaults>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <spinlocks>4095</spinlocks>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <stimer_direct>on</stimer_direct>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </defaults>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </hyperv>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <launchSecurity supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: </domainCapabilities>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.327 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: <domainCapabilities>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <domain>kvm</domain>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <arch>i686</arch>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <vcpu max='1024'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <iothreads supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <os supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <enum name='firmware'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <loader supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>rom</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pflash</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='readonly'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>yes</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>no</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='secure'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>no</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </loader>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </os>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <cpu>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>on</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>off</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='maximum' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='maximumMigratable'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>on</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>off</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='host-model' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <vendor>AMD</vendor>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='x2apic'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='stibp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='succor'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='lbrv'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='custom' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='ClearwaterForest'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ddpd-u'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sha512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm3'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ddpd-u'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sha512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm3'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Dhyana-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Turin'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbpb'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbpb'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-128'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-256'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-128'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-256'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='KnightsMill'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512er'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512pf'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='KnightsMill-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512er'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512pf'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tbm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tbm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='athlon'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='athlon-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='core2duo'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='core2duo-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='coreduo'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='coreduo-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='n270'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='n270-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='phenom'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='phenom-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </cpu>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <memoryBacking supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <enum name='sourceType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>file</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>anonymous</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>memfd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </memoryBacking>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <devices>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <disk supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='diskDevice'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>disk</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>cdrom</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>floppy</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>lun</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='bus'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>fdc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>scsi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>sata</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-non-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </disk>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <graphics supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vnc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>egl-headless</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dbus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </graphics>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <video supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='modelType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vga</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>cirrus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>none</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>bochs</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>ramfb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </video>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <hostdev supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='mode'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>subsystem</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='startupPolicy'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>default</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>mandatory</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>requisite</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>optional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='subsysType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pci</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>scsi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='capsType'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='pciBackend'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </hostdev>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <rng supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-non-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>random</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>egd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>builtin</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </rng>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <filesystem supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='driverType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>path</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>handle</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtiofs</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </filesystem>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <tpm supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tpm-tis</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tpm-crb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>emulator</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>external</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendVersion'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>2.0</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </tpm>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <redirdev supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='bus'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </redirdev>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <channel supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pty</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>unix</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </channel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <crypto supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>qemu</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>builtin</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </crypto>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <interface supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>default</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>passt</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </interface>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <panic supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>isa</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>hyperv</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </panic>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <console supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>null</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pty</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dev</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>file</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pipe</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>stdio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>udp</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tcp</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>unix</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>qemu-vdagent</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dbus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </console>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </devices>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <gic supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <vmcoreinfo supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <genid supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <backingStoreInput supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <backup supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <async-teardown supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <s390-pv supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <ps2 supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <tdx supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <sev supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <sgx supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <hyperv supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='features'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>relaxed</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vapic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>spinlocks</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vpindex</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>runtime</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>synic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>stimer</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>reset</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vendor_id</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>frequencies</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>reenlightenment</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tlbflush</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>ipi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>avic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>emsr_bitmap</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>xmm_input</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <defaults>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <spinlocks>4095</spinlocks>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <stimer_direct>on</stimer_direct>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </defaults>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </hyperv>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <launchSecurity supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: </domainCapabilities>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.399 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.405 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: <domainCapabilities>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <domain>kvm</domain>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <arch>x86_64</arch>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <vcpu max='240'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <iothreads supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <os supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <enum name='firmware'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <loader supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>rom</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pflash</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='readonly'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>yes</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>no</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='secure'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>no</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </loader>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </os>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <cpu>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>on</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>off</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='maximum' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='maximumMigratable'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>on</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>off</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='host-model' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <vendor>AMD</vendor>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='x2apic'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='stibp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='succor'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='lbrv'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='custom' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='ClearwaterForest'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ddpd-u'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sha512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm3'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ddpd-u'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sha512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm3'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Dhyana-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Turin'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbpb'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbpb'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-128'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-256'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-128'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-256'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='KnightsMill'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512er'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512pf'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='KnightsMill-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512er'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512pf'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tbm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tbm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='athlon'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='athlon-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='core2duo'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='core2duo-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='coreduo'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='coreduo-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='n270'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='n270-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='phenom'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='phenom-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </cpu>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <memoryBacking supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <enum name='sourceType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>file</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>anonymous</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>memfd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </memoryBacking>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <devices>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <disk supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='diskDevice'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>disk</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>cdrom</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>floppy</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>lun</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='bus'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>ide</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>fdc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>scsi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>sata</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-non-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </disk>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <graphics supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vnc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>egl-headless</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dbus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </graphics>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <video supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='modelType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vga</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>cirrus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>none</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>bochs</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>ramfb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </video>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <hostdev supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='mode'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>subsystem</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='startupPolicy'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>default</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>mandatory</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>requisite</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>optional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='subsysType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pci</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>scsi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='capsType'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='pciBackend'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </hostdev>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <rng supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-non-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>random</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>egd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>builtin</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </rng>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <filesystem supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='driverType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>path</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>handle</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtiofs</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </filesystem>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <tpm supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tpm-tis</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tpm-crb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>emulator</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>external</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendVersion'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>2.0</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </tpm>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <redirdev supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='bus'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </redirdev>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <channel supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pty</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>unix</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </channel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <crypto supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>qemu</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>builtin</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </crypto>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <interface supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>default</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>passt</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </interface>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <panic supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>isa</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>hyperv</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </panic>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <console supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>null</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pty</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dev</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>file</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pipe</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>stdio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>udp</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tcp</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>unix</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>qemu-vdagent</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dbus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </console>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </devices>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <gic supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <vmcoreinfo supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <genid supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <backingStoreInput supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <backup supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <async-teardown supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <s390-pv supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <ps2 supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <tdx supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <sev supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <sgx supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <hyperv supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='features'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>relaxed</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vapic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>spinlocks</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vpindex</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>runtime</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>synic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>stimer</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>reset</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vendor_id</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>frequencies</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>reenlightenment</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tlbflush</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>ipi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>avic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>emsr_bitmap</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>xmm_input</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <defaults>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <spinlocks>4095</spinlocks>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <stimer_direct>on</stimer_direct>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </defaults>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </hyperv>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <launchSecurity supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: </domainCapabilities>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.463 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: <domainCapabilities>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <domain>kvm</domain>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <arch>x86_64</arch>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <vcpu max='1024'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <iothreads supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <os supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <enum name='firmware'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>efi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <loader supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>rom</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pflash</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='readonly'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>yes</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>no</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='secure'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>yes</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>no</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </loader>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </os>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <cpu>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>on</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>off</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='maximum' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='maximumMigratable'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>on</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>off</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='host-model' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <vendor>AMD</vendor>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='x2apic'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='stibp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='succor'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='lbrv'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <mode name='custom' supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Broadwell-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='ClearwaterForest'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ddpd-u'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sha512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm3'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ddpd-u'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sha512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm3'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sm4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Cooperlake-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Denverton-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Dhyana-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Turin'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbpb'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amd-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='auto-ibrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='perfmon-v2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbpb'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='stibp-always-on'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='EPYC-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-128'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-256'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-128'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-256'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx10-512'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='prefetchiti'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Haswell-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='IvyBridge-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='KnightsMill'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512er'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512pf'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='KnightsMill-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512er'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512pf'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tbm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fma4'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tbm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xop'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='amx-tile'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-bf16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-fp16'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bitalg'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrc'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fzrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='la57'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='taa-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='SierraForest-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ifma'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cmpccxadd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fbsdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='fsrs'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ibrs-all'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='intel-psfd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='lam'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mcdt-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pbrsb-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='psdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rfds-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='serialize'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vaes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='hle'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='rtm'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512bw'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512cd'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512dq'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512f'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='avx512vl'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='invpcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pcid'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='pku'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='mpx'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v2'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v3'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='core-capability'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='split-lock-detect'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='Snowridge-v4'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='cldemote'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='erms'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='gfni'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdir64b'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='movdiri'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='xsaves'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='athlon'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='athlon-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='core2duo'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='core2duo-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='coreduo'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='coreduo-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='n270'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='n270-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='ss'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='phenom'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <blockers model='phenom-v1'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnow'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <feature name='3dnowext'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </blockers>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </mode>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </cpu>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <memoryBacking supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <enum name='sourceType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>file</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>anonymous</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <value>memfd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </memoryBacking>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <devices>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <disk supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='diskDevice'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>disk</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>cdrom</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>floppy</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>lun</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='bus'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>fdc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>scsi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>sata</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-non-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </disk>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <graphics supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vnc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>egl-headless</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dbus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </graphics>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <video supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='modelType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vga</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>cirrus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>none</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>bochs</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>ramfb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </video>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <hostdev supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='mode'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>subsystem</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='startupPolicy'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>default</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>mandatory</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>requisite</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>optional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='subsysType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pci</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>scsi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='capsType'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='pciBackend'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </hostdev>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <rng supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtio-non-transitional</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>random</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>egd</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>builtin</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </rng>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <filesystem supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='driverType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>path</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>handle</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>virtiofs</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </filesystem>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <tpm supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tpm-tis</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tpm-crb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>emulator</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>external</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendVersion'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>2.0</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </tpm>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <redirdev supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='bus'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>usb</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </redirdev>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <channel supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pty</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>unix</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </channel>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <crypto supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>qemu</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendModel'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>builtin</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </crypto>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <interface supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='backendType'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>default</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>passt</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </interface>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <panic supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='model'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>isa</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>hyperv</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </panic>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <console supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='type'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>null</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vc</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pty</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dev</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>file</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>pipe</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>stdio</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>udp</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tcp</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>unix</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>qemu-vdagent</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>dbus</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </console>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </devices>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   <features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <gic supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <vmcoreinfo supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <genid supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <backingStoreInput supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <backup supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <async-teardown supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <s390-pv supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <ps2 supported='yes'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <tdx supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <sev supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <sgx supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <hyperv supported='yes'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <enum name='features'>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>relaxed</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vapic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>spinlocks</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vpindex</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>runtime</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>synic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>stimer</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>reset</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>vendor_id</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>frequencies</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>reenlightenment</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>tlbflush</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>ipi</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>avic</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>emsr_bitmap</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <value>xmm_input</value>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </enum>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       <defaults>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <spinlocks>4095</spinlocks>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <stimer_direct>on</stimer_direct>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:       </defaults>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     </hyperv>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:     <launchSecurity supported='no'/>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:   </features>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: </domainCapabilities>
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.514 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.515 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.517 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.518 224514 INFO nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Secure Boot support detected
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.520 224514 INFO nova.virt.libvirt.driver [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.520 224514 INFO nova.virt.libvirt.driver [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.531 224514 DEBUG nova.virt.libvirt.driver [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.575 224514 INFO nova.virt.node [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Determined node identity d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from /var/lib/nova/compute_id
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.595 224514 DEBUG nova.compute.manager [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Verified node d5eeed9a-e4d0-4244-8d4e-39e5c8263590 matches my host np0005604215.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 01 09:23:44 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:44.668 224514 INFO nova.compute.manager [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 01 09:23:45 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:45.101 224514 INFO nova.service [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Updating service version for nova-compute on np0005604215.localdomain from 57 to 66
Feb 01 09:23:45 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:45.144 224514 DEBUG oslo_concurrency.lockutils [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:23:45 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:45.145 224514 DEBUG oslo_concurrency.lockutils [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:23:45 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:45.145 224514 DEBUG oslo_concurrency.lockutils [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:23:45 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:45.145 224514 DEBUG nova.compute.resource_tracker [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:23:45 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:45.145 224514 DEBUG oslo_concurrency.processutils [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:23:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18680 DF PROTO=TCP SPT=42360 DPT=9882 SEQ=830503638 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D310E0000000001030307) 
Feb 01 09:23:45 np0005604215.localdomain python3.9[224915]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:23:45 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:45.594 224514 DEBUG oslo_concurrency.processutils [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:23:45 np0005604215.localdomain systemd[1]: Started libvirt nodedev daemon.
Feb 01 09:23:45 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:45.950 224514 WARNING nova.virt.libvirt.driver [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:23:45 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:45.952 224514 DEBUG nova.compute.resource_tracker [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=13613MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:23:45 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:45.953 224514 DEBUG oslo_concurrency.lockutils [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:23:45 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:45.953 224514 DEBUG oslo_concurrency.lockutils [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.093 224514 DEBUG nova.compute.resource_tracker [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.093 224514 DEBUG nova.compute.resource_tracker [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:23:46 np0005604215.localdomain sudo[225068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dffmbkpmpwksbhbrkpkptxsnnsfgokzg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937825.6619282-3500-123352311905504/AnsiballZ_podman_container.py
Feb 01 09:23:46 np0005604215.localdomain sudo[225068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.146 224514 DEBUG nova.scheduler.client.report [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.170 224514 DEBUG nova.scheduler.client.report [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.170 224514 DEBUG nova.compute.provider_tree [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.184 224514 DEBUG nova.scheduler.client.report [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.205 224514 DEBUG nova.scheduler.client.report [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: HW_CPU_X86_BMI,COMPUTE_TRUSTED_CERTS,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSSE3,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_AMD_SVM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_USB,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_ACCELERATORS,HW_CPU_X86_SSE4A,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_IMAGE_TYPE_ISO,HW_CPU_X86_F16C,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SSE41,HW_CPU_X86_CLMUL,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SVM,COMPUTE_NODE,COMPUTE_NET_VIF_MODEL_LAN9118,HW_CPU_X86_AVX,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_MMX,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_ABM,HW_CPU_X86_AVX2,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_SSE42,HW_CPU_X86_FMA3,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE2,COMPUTE_STORAGE_BUS_IDE,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_GRAPHICS_MODEL_CIRRUS,HW_CPU_X86_BMI2,HW_CPU_X86_AESNI _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.225 224514 DEBUG oslo_concurrency.processutils [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:23:46 np0005604215.localdomain podman[225071]: 2026-02-01 09:23:46.227453034 +0000 UTC m=+0.086506627 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:23:46 np0005604215.localdomain podman[225071]: 2026-02-01 09:23:46.295973863 +0000 UTC m=+0.155027436 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:23:46 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:23:46 np0005604215.localdomain python3.9[225070]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 01 09:23:46 np0005604215.localdomain sudo[225068]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:46 np0005604215.localdomain systemd-journald[47940]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 122.2 (407 of 333 items), suggesting rotation.
Feb 01 09:23:46 np0005604215.localdomain systemd-journald[47940]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 01 09:23:46 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 09:23:46 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.642 224514 DEBUG oslo_concurrency.processutils [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.417s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.649 224514 DEBUG nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.649 224514 INFO nova.virt.libvirt.host [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] kernel doesn't support AMD SEV
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.651 224514 DEBUG nova.compute.provider_tree [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.651 224514 DEBUG nova.virt.libvirt.driver [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.681 224514 DEBUG nova.scheduler.client.report [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.775 224514 DEBUG nova.compute.provider_tree [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Updating resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 generation from 2 to 3 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.819 224514 DEBUG nova.compute.resource_tracker [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.820 224514 DEBUG oslo_concurrency.lockutils [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.867s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.820 224514 DEBUG nova.service [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.912 224514 DEBUG nova.service [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 01 09:23:46 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:46.912 224514 DEBUG nova.servicegroup.drivers.db [None req-4842977a-3d23-43c6-a834-f3e97ae6307a - - - - - -] DB_Driver: join new ServiceGroup member np0005604215.localdomain to the compute group, service = <Service: host=np0005604215.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 01 09:23:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48806 DF PROTO=TCP SPT=55492 DPT=9102 SEQ=3823228251 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D3F0D0000000001030307) 
Feb 01 09:23:49 np0005604215.localdomain sudo[225249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbvggpqmikuochtrkfuzkqbygkkzfqrf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937829.3096247-3524-188434908942464/AnsiballZ_systemd.py
Feb 01 09:23:49 np0005604215.localdomain sudo[225249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:23:49 np0005604215.localdomain systemd[1]: tmp-crun.xu90Tr.mount: Deactivated successfully.
Feb 01 09:23:49 np0005604215.localdomain podman[225252]: 2026-02-01 09:23:49.691328964 +0000 UTC m=+0.069747607 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 01 09:23:49 np0005604215.localdomain podman[225252]: 2026-02-01 09:23:49.719867073 +0000 UTC m=+0.098285726 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:23:49 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:23:49 np0005604215.localdomain python3.9[225251]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:23:49 np0005604215.localdomain systemd[1]: Stopping nova_compute container...
Feb 01 09:23:50 np0005604215.localdomain systemd[1]: tmp-crun.CJvFoo.mount: Deactivated successfully.
Feb 01 09:23:51 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:51.030 224514 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 01 09:23:51 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:51.032 224514 DEBUG oslo_concurrency.lockutils [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:23:51 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:51.033 224514 DEBUG oslo_concurrency.lockutils [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:23:51 np0005604215.localdomain nova_compute[224510]: 2026-02-01 09:23:51.033 224514 DEBUG oslo_concurrency.lockutils [None req-8068517e-d2d4-4816-a849-5dd48d7eee84 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:23:51 np0005604215.localdomain systemd[1]: libpod-6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b.scope: Deactivated successfully.
Feb 01 09:23:51 np0005604215.localdomain virtqemud[224673]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Feb 01 09:23:51 np0005604215.localdomain virtqemud[224673]: hostname: np0005604215.localdomain
Feb 01 09:23:51 np0005604215.localdomain virtqemud[224673]: End of file while reading data: Input/output error
Feb 01 09:23:51 np0005604215.localdomain systemd[1]: libpod-6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b.scope: Consumed 3.803s CPU time.
Feb 01 09:23:51 np0005604215.localdomain podman[225273]: 2026-02-01 09:23:51.417459736 +0000 UTC m=+1.442966633 container died 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:23:51 np0005604215.localdomain systemd[1]: tmp-crun.Zlc5wd.mount: Deactivated successfully.
Feb 01 09:23:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538-merged.mount: Deactivated successfully.
Feb 01 09:23:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b-userdata-shm.mount: Deactivated successfully.
Feb 01 09:23:51 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63624 DF PROTO=TCP SPT=52826 DPT=9100 SEQ=1783760803 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D4B0D0000000001030307) 
Feb 01 09:23:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=63468 DF PROTO=TCP SPT=52822 DPT=9101 SEQ=2959385660 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D590D0000000001030307) 
Feb 01 09:23:55 np0005604215.localdomain podman[225273]: 2026-02-01 09:23:55.736053746 +0000 UTC m=+5.761560593 container cleanup 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.build-date=20260127, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:23:55 np0005604215.localdomain podman[225273]: nova_compute
Feb 01 09:23:55 np0005604215.localdomain podman[225567]: error opening file `/run/crun/6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b/status`: No such file or directory
Feb 01 09:23:55 np0005604215.localdomain podman[225556]: 2026-02-01 09:23:55.827656669 +0000 UTC m=+0.063829277 container cleanup 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=edpm, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']})
Feb 01 09:23:55 np0005604215.localdomain podman[225556]: nova_compute
Feb 01 09:23:55 np0005604215.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 01 09:23:55 np0005604215.localdomain systemd[1]: Stopped nova_compute container.
Feb 01 09:23:55 np0005604215.localdomain systemd[1]: Starting nova_compute container...
Feb 01 09:23:55 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:23:55 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:55 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:55 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:55 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:55 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:55 np0005604215.localdomain podman[225571]: 2026-02-01 09:23:55.978309431 +0000 UTC m=+0.121401131 container init 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 01 09:23:55 np0005604215.localdomain podman[225571]: 2026-02-01 09:23:55.987396417 +0000 UTC m=+0.130488107 container start 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=nova_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:23:55 np0005604215.localdomain podman[225571]: nova_compute
Feb 01 09:23:55 np0005604215.localdomain nova_compute[225585]: + sudo -E kolla_set_configs
Feb 01 09:23:55 np0005604215.localdomain systemd[1]: Started nova_compute container.
Feb 01 09:23:56 np0005604215.localdomain sudo[225249]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Validating config file
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying service configuration files
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Deleting /etc/ceph
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Creating directory /etc/ceph
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /etc/ceph
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Writing out command to execute
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: ++ cat /run_command
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: + CMD=nova-compute
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: + ARGS=
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: + sudo kolla_copy_cacerts
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: + [[ ! -n '' ]]
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: + . kolla_extend_start
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: Running command: 'nova-compute'
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: + echo 'Running command: '\''nova-compute'\'''
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: + umask 0022
Feb 01 09:23:56 np0005604215.localdomain nova_compute[225585]: + exec nova-compute
Feb 01 09:23:57 np0005604215.localdomain sudo[225705]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pemderbqozbjwnbggvjwnehaxntqrvgv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937836.7399573-3551-272994001682666/AnsiballZ_podman_container.py
Feb 01 09:23:57 np0005604215.localdomain sudo[225705]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:23:57 np0005604215.localdomain python3.9[225707]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 01 09:23:57 np0005604215.localdomain systemd[1]: Started libpod-conmon-01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d.scope.
Feb 01 09:23:57 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:23:57 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02df9e1e28dd9ec9663cf49666c784b8876dc545b7721fca6e88de98c1c0890/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:57 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02df9e1e28dd9ec9663cf49666c784b8876dc545b7721fca6e88de98c1c0890/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:57 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02df9e1e28dd9ec9663cf49666c784b8876dc545b7721fca6e88de98c1c0890/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 09:23:57 np0005604215.localdomain podman[225733]: 2026-02-01 09:23:57.596601387 +0000 UTC m=+0.136971856 container init 01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:23:57 np0005604215.localdomain podman[225733]: 2026-02-01 09:23:57.613671027 +0000 UTC m=+0.154041496 container start 01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3)
Feb 01 09:23:57 np0005604215.localdomain python3.9[225707]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Applying nova statedir ownership
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/fc52238ffcbdcb325c6bf3fe6412477fc4bdb6cd9151f39289b74f25e08e0db9
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/d301d14069645d8c23fee2987984776b3e88a570e1aa96d6cf3e31fa880385fd
Feb 01 09:23:57 np0005604215.localdomain nova_compute_init[225755]: INFO:nova_statedir:Nova statedir ownership complete
Feb 01 09:23:57 np0005604215.localdomain systemd[1]: libpod-01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d.scope: Deactivated successfully.
Feb 01 09:23:57 np0005604215.localdomain podman[225754]: 2026-02-01 09:23:57.685765724 +0000 UTC m=+0.056175843 container died 01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init)
Feb 01 09:23:57 np0005604215.localdomain podman[225767]: 2026-02-01 09:23:57.762659438 +0000 UTC m=+0.074499072 container cleanup 01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:23:57 np0005604215.localdomain systemd[1]: libpod-conmon-01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d.scope: Deactivated successfully.
Feb 01 09:23:57 np0005604215.localdomain systemd[1]: tmp-crun.p8KUQI.mount: Deactivated successfully.
Feb 01 09:23:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-a02df9e1e28dd9ec9663cf49666c784b8876dc545b7721fca6e88de98c1c0890-merged.mount: Deactivated successfully.
Feb 01 09:23:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d-userdata-shm.mount: Deactivated successfully.
Feb 01 09:23:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:57.844 225589 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 01 09:23:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:57.844 225589 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 01 09:23:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:57.844 225589 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 01 09:23:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:57.844 225589 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 01 09:23:57 np0005604215.localdomain sudo[225705]: pam_unix(sudo:session): session closed for user root
Feb 01 09:23:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:57.963 225589 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:23:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:57.986 225589 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:23:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:57.986 225589 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 01 09:23:58 np0005604215.localdomain sshd[207693]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:23:58 np0005604215.localdomain systemd-logind[761]: Session 53 logged out. Waiting for processes to exit.
Feb 01 09:23:58 np0005604215.localdomain systemd[1]: session-53.scope: Deactivated successfully.
Feb 01 09:23:58 np0005604215.localdomain systemd[1]: session-53.scope: Consumed 1min 55.991s CPU time.
Feb 01 09:23:58 np0005604215.localdomain systemd-logind[761]: Removed session 53.
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.467 225589 INFO nova.virt.driver [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.577 225589 INFO nova.compute.provider_config [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.585 225589 WARNING nova.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.: nova.exception.TooOldComputeService: Current Nova version does not support computes older than Yoga but the minimum compute service level in your cell is 57 and the oldest supported service level is 61.
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.585 225589 DEBUG oslo_concurrency.lockutils [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.586 225589 DEBUG oslo_concurrency.lockutils [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.586 225589 DEBUG oslo_concurrency.lockutils [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.586 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.586 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.586 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.587 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.587 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.587 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.587 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.587 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.587 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.587 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.588 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.588 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.588 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.588 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.588 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.588 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.588 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.589 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.589 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.589 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] console_host                   = np0005604215.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.589 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.589 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.589 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.589 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.590 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.590 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.590 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.590 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.590 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.590 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.591 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.591 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.591 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.591 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.591 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.591 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.591 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.591 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.592 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] host                           = np0005604215.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.592 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.592 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.592 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.592 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.592 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.593 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.593 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.593 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.593 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.593 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.593 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.594 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.594 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.594 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.594 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.594 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.594 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.594 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.595 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.595 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.595 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.595 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.595 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.595 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.595 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.595 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.596 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.596 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.596 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.596 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.596 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.596 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.596 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.597 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.597 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.597 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.597 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.597 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.597 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.598 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.598 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.598 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] my_block_storage_ip            = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.598 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] my_ip                          = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.598 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.598 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.598 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.599 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.599 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.599 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.599 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.599 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.599 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.599 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.600 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.600 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.600 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.600 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.600 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.600 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.600 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.600 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.601 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.601 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.601 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.601 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.601 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.601 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.601 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.602 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.602 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.602 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.602 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.603 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.604 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.604 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.605 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.605 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.605 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.606 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.606 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.606 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.607 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.607 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.607 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.607 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.608 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.608 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.608 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.609 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.609 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.609 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.610 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.610 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.610 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.611 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.611 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.611 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.611 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.612 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.612 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.612 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.613 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.613 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.613 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.614 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.614 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.614 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.614 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.615 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.615 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.616 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.616 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.617 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.617 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.617 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.618 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.618 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.618 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.619 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.619 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.619 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.620 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.620 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.620 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.621 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.621 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.621 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.622 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.622 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.622 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.623 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.623 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.623 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.624 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.624 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.624 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.625 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.625 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.625 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.626 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.626 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.626 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.627 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.627 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.627 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.628 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.628 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.628 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.628 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.629 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.629 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.630 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.630 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.630 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.631 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.631 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.631 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.632 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.632 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.632 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.633 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.633 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.633 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.634 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.634 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.634 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.635 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.635 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.635 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.635 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.636 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.636 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.636 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.637 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.637 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.637 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.638 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.638 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.639 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.639 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.639 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.640 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.640 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.640 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.641 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.641 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.641 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.642 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.642 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.642 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.642 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.642 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.643 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.643 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.643 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.643 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.643 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.644 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.644 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.644 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.644 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.644 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.645 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.645 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.645 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.645 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.645 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.646 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.646 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.646 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.646 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.647 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.647 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.647 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.647 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.647 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.647 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.648 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.648 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.648 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.648 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.648 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.649 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.649 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.649 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.649 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.649 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.650 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.650 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.650 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.650 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.650 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.651 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.651 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.651 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.651 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.652 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.652 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.652 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.652 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.652 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.653 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.653 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.653 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.653 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.653 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.654 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.654 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.654 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.654 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.654 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.655 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.655 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.655 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.655 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.655 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.655 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.656 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.656 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.656 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.656 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.656 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.657 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.657 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.657 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.657 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.657 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.657 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.658 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.658 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.658 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.658 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.658 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.659 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.659 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.659 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.659 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.659 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.659 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.660 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.660 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.660 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.660 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.660 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.661 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.661 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.661 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.661 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.661 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.661 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.662 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.662 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.662 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.662 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.662 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.663 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.663 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.663 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.663 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.663 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.663 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.664 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.664 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.664 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.664 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.665 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.665 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.665 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.665 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.665 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.666 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.666 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.666 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.666 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.666 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.667 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.667 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.667 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.667 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.668 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.668 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.668 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.668 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.668 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.669 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.669 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.669 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.669 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.669 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.670 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.670 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.670 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.670 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.670 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.671 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.671 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.671 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.671 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.671 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.671 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.672 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.672 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.672 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.672 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.672 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.673 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.673 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.673 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.673 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.673 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.673 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.674 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.674 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.674 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.674 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.674 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.674 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.674 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.675 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.675 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.675 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.675 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.675 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.675 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.675 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.675 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.676 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.676 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.676 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.676 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.676 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.676 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.676 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.676 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.677 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.677 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.677 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.677 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.677 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.677 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.677 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.677 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.678 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.678 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.678 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.678 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.678 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.678 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.678 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.679 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.679 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.679 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.679 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.679 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.679 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.679 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.679 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.680 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.680 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.680 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.680 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.680 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.680 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.680 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.680 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.681 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.681 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.681 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.681 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.681 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.681 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.681 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.682 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.682 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.682 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.682 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.682 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.682 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.682 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.683 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.683 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.683 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.683 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.683 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.683 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.683 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.684 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.684 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.684 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.684 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.684 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.684 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.684 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.684 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.685 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.685 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.685 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.685 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.685 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.685 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.685 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.686 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.686 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.686 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.686 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.686 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.686 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.686 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.686 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.687 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.687 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.687 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.687 225589 WARNING oslo_config.cfg [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: and ``live_migration_inbound_addr`` respectively.
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: ).  Its value may be silently ignored in the future.
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.687 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.687 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.688 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.688 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.688 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.688 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.688 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.688 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.688 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.689 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.689 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.689 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.689 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.689 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.689 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.689 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.689 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.690 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.690 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.rbd_secret_uuid        = 33fac0b9-80c7-560f-918a-c92d3021ca1e log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.690 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.690 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.690 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.690 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.690 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.691 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.691 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.691 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.691 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.691 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.691 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.691 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.692 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.692 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.693 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.693 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.693 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.693 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.693 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.693 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.693 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.694 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.694 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.694 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.694 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.694 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.694 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.694 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.695 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.695 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.695 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.695 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.695 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.695 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.695 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.696 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.696 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.696 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.696 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.696 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.696 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.696 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.696 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.697 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.697 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.697 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.697 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.697 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.697 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.697 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.698 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.698 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.698 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.698 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.698 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.698 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.698 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.698 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.699 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.699 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.699 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.699 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.699 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.699 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.699 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.699 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.700 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.700 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.700 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.700 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.700 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.700 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.700 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.701 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.701 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.701 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.701 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.701 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.701 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.701 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.701 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.702 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.702 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.702 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.702 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.702 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.702 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.702 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.702 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.703 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.703 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.703 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.703 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.703 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.703 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.703 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.704 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.704 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.704 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.704 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.704 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.704 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.704 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.704 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.705 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.705 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.705 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.705 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.705 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.705 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.705 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.706 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.706 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.706 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.706 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.706 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.706 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.706 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.706 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.707 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.707 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.707 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.707 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.707 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.707 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.707 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.708 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.708 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.708 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.708 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.708 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.708 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.708 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.709 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.709 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.709 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.709 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.709 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.709 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.709 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.710 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.710 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.710 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.710 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.710 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.710 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.710 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.710 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.711 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.711 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.711 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.711 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.711 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.711 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.711 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.711 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.712 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.712 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.712 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.712 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.712 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.712 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.712 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.713 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.713 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.713 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.713 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.713 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.713 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.713 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.714 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.714 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.714 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.714 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.714 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.714 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.714 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.714 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.715 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.715 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.715 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.715 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.715 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.715 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.715 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.716 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.716 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.716 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.716 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.716 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.716 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.716 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.717 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.717 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.717 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.717 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.717 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.717 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.717 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.717 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.718 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.718 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.718 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.718 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.718 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.718 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.718 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.719 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.719 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.719 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.719 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.719 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.719 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.719 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.719 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.720 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.720 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.720 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.720 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.720 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.720 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.720 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.720 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.721 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.721 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.721 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.721 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.721 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.721 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.722 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.722 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.722 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.722 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.722 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.722 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.722 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.722 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.723 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.disable_compute_service_check_for_ffu = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.723 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.723 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.723 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.723 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.723 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.723 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.724 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.724 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.724 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.724 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.724 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.724 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.724 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.724 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.725 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.725 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.725 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.725 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.725 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.725 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.725 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.726 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.726 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.726 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.726 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.726 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.726 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.726 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.726 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.727 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.727 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.727 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.727 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.727 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.727 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.727 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.728 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.728 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.728 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.728 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.728 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.728 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.728 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.728 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.729 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.729 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.729 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.729 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.729 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.729 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.729 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.730 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.730 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.730 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.730 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.730 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.730 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.730 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.730 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.731 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.731 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.731 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.731 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.731 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.731 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.731 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.732 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.732 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.732 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.732 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.732 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.732 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.732 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.732 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.733 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.733 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.733 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.733 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.733 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.733 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.733 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.734 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.734 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.734 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.734 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.734 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.734 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.734 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.734 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.735 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.735 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.735 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.735 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.735 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.735 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.735 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.735 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.736 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.736 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.736 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.736 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.736 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.736 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.736 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.736 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.737 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.737 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.737 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.737 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.737 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.737 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.737 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.738 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.738 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.738 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.738 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.738 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.738 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.738 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.738 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.739 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.739 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.739 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.739 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.739 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.739 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.739 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.739 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.740 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.740 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.740 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.740 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.740 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.740 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.740 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.741 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.741 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.741 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.741 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.741 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.741 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.741 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.741 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.742 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.742 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.742 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.742 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.742 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.742 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.742 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.743 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.743 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.743 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.743 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.743 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.743 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.743 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.743 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.744 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.744 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.744 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.744 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.744 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.744 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.744 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.744 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.745 225589 DEBUG oslo_service.service [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.745 225589 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.759 225589 INFO nova.virt.node [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Determined node identity d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from /var/lib/nova/compute_id
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.760 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.760 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.760 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.761 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.773 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f5a67fd0430> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.775 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f5a67fd0430> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.776 225589 INFO nova.virt.libvirt.driver [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Connection event '1' reason 'None'
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.783 225589 INFO nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Libvirt host capabilities <capabilities>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <host>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <uuid>b72fb799-3472-4728-b6e2-ec98d2bbb61b</uuid>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <cpu>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <arch>x86_64</arch>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model>EPYC-Rome-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <vendor>AMD</vendor>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <microcode version='16777317'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <signature family='23' model='49' stepping='0'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='x2apic'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='tsc-deadline'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='osxsave'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='hypervisor'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='tsc_adjust'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='spec-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='stibp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='arch-capabilities'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='ssbd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='cmp_legacy'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='topoext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='virt-ssbd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='lbrv'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='tsc-scale'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='vmcb-clean'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='pause-filter'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='pfthreshold'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='svme-addr-chk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='rdctl-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='skip-l1dfl-vmentry'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='mds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature name='pschange-mc-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <pages unit='KiB' size='4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <pages unit='KiB' size='2048'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <pages unit='KiB' size='1048576'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </cpu>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <power_management>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <suspend_mem/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <suspend_disk/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <suspend_hybrid/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </power_management>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <iommu support='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <migration_features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <live/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <uri_transports>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <uri_transport>tcp</uri_transport>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <uri_transport>rdma</uri_transport>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </uri_transports>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </migration_features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <topology>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <cells num='1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <cell id='0'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:           <memory unit='KiB'>16116604</memory>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:           <pages unit='KiB' size='4'>4029151</pages>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:           <pages unit='KiB' size='2048'>0</pages>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:           <distances>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:             <sibling id='0' value='10'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:           </distances>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:           <cpus num='8'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:           </cpus>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         </cell>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </cells>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </topology>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <cache>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </cache>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <secmodel>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model>selinux</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <doi>0</doi>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </secmodel>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <secmodel>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model>dac</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <doi>0</doi>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </secmodel>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </host>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <guest>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <os_type>hvm</os_type>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <arch name='i686'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <wordsize>32</wordsize>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <domain type='qemu'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <domain type='kvm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </arch>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <pae/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <nonpae/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <acpi default='on' toggle='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <apic default='on' toggle='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <cpuselection/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <deviceboot/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <disksnapshot default='on' toggle='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <externalSnapshot/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </guest>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <guest>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <os_type>hvm</os_type>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <arch name='x86_64'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <wordsize>64</wordsize>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <domain type='qemu'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <domain type='kvm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </arch>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <acpi default='on' toggle='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <apic default='on' toggle='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <cpuselection/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <deviceboot/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <disksnapshot default='on' toggle='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <externalSnapshot/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </guest>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: </capabilities>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.789 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.792 225589 DEBUG nova.virt.libvirt.volume.mount [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.794 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: <domainCapabilities>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <domain>kvm</domain>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <arch>i686</arch>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <vcpu max='1024'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <iothreads supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <os supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <enum name='firmware'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <loader supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>rom</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pflash</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='readonly'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>yes</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>no</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='secure'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>no</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </loader>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </os>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <cpu>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>on</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>off</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='maximum' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='maximumMigratable'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>on</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>off</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='host-model' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <vendor>AMD</vendor>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='x2apic'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='stibp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='ssbd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='succor'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='lbrv'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='custom' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='ClearwaterForest'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ddpd-u'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sha512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm3'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ddpd-u'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sha512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm3'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Dhyana-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Turin'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbpb'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbpb'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-128'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-256'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-128'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-256'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='KnightsMill'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512er'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512pf'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='KnightsMill-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512er'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512pf'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tbm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tbm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='athlon'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='athlon-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='core2duo'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='core2duo-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='coreduo'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='coreduo-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='n270'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='n270-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='phenom'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='phenom-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </cpu>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <memoryBacking supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <enum name='sourceType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>file</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>anonymous</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>memfd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </memoryBacking>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <devices>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <disk supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='diskDevice'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>disk</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>cdrom</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>floppy</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>lun</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='bus'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>fdc</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>scsi</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>sata</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-non-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </disk>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <graphics supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vnc</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>egl-headless</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>dbus</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </graphics>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <video supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='modelType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vga</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>cirrus</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>none</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>bochs</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>ramfb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </video>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <hostdev supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='mode'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>subsystem</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='startupPolicy'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>default</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>mandatory</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>requisite</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>optional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='subsysType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pci</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>scsi</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='capsType'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='pciBackend'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </hostdev>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <rng supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-non-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>random</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>egd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>builtin</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </rng>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <filesystem supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='driverType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>path</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>handle</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtiofs</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </filesystem>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <tpm supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tpm-tis</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tpm-crb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>emulator</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>external</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendVersion'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>2.0</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </tpm>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <redirdev supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='bus'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </redirdev>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <channel supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pty</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>unix</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </channel>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <crypto supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>qemu</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>builtin</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </crypto>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <interface supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>default</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>passt</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </interface>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <panic supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>isa</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>hyperv</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </panic>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <console supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>null</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vc</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pty</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>dev</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>file</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pipe</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>stdio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>udp</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tcp</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>unix</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>qemu-vdagent</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>dbus</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </console>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </devices>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <gic supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <vmcoreinfo supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <genid supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <backingStoreInput supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <backup supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <async-teardown supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <s390-pv supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <ps2 supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <tdx supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <sev supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <sgx supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <hyperv supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='features'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>relaxed</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vapic</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>spinlocks</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vpindex</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>runtime</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>synic</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>stimer</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>reset</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vendor_id</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>frequencies</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>reenlightenment</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tlbflush</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>ipi</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>avic</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>emsr_bitmap</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>xmm_input</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <defaults>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <spinlocks>4095</spinlocks>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <stimer_direct>on</stimer_direct>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </defaults>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </hyperv>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <launchSecurity supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: </domainCapabilities>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
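The `domainCapabilities` XML dumped above pairs each `<model>` with a `usable` flag and, for unusable models, a `<blockers>` element listing the host CPU features that prevent it. A minimal sketch of extracting that mapping with Python's standard library — the embedded sample is a hypothetical excerpt mirroring the `<mode name='custom'>` section of the log, not the full dump:

```python
import xml.etree.ElementTree as ET

# Hypothetical excerpt shaped like the <mode name='custom'> section logged above.
xml_text = """
<mode name='custom' supported='yes'>
  <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
  <blockers model='Skylake-Server-v3'>
    <feature name='avx512f'/>
    <feature name='pku'/>
  </blockers>
  <model usable='yes' vendor='Intel'>Westmere-v1</model>
</mode>
"""

root = ET.fromstring(xml_text)

# Map each blocked model name to the list of features blocking it.
blockers = {b.get('model'): [f.get('name') for f in b.findall('feature')]
            for b in root.findall('blockers')}

for model in root.findall('model'):
    name = model.text
    if model.get('usable') == 'no':
        print(f"{name}: blocked by {', '.join(blockers.get(name, []))}")
    else:
        print(f"{name}: usable")
# Prints:
#   Skylake-Server-v3: blocked by avx512f, pku
#   Westmere-v1: usable
```

In a live deployment the same XML would come from `virConnectGetDomainCapabilities` (the call nova's `_get_domain_capabilities` wraps) rather than a string literal; the parsing logic is identical.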
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.802 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: <domainCapabilities>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <domain>kvm</domain>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <arch>i686</arch>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <vcpu max='240'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <iothreads supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <os supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <enum name='firmware'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <loader supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>rom</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pflash</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='readonly'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>yes</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>no</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='secure'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>no</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </loader>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </os>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <cpu>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>on</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>off</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='maximum' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='maximumMigratable'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>on</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>off</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='host-model' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <vendor>AMD</vendor>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='x2apic'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='stibp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='ssbd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='succor'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='lbrv'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='custom' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='ClearwaterForest'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ddpd-u'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sha512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm3'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ddpd-u'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sha512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm3'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Dhyana-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Turin'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbpb'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbpb'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-128'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-256'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-128'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-256'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='KnightsMill'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512er'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512pf'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='KnightsMill-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512er'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512pf'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tbm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tbm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='athlon'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='athlon-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='core2duo'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='core2duo-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='coreduo'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='coreduo-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='n270'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='n270-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='phenom'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='phenom-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </cpu>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <memoryBacking supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <enum name='sourceType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>file</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>anonymous</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>memfd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </memoryBacking>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <devices>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <disk supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='diskDevice'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>disk</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>cdrom</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>floppy</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>lun</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='bus'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>ide</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>fdc</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>scsi</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>sata</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-non-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </disk>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <graphics supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vnc</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>egl-headless</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>dbus</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </graphics>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <video supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='modelType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vga</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>cirrus</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>none</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>bochs</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>ramfb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </video>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <hostdev supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='mode'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>subsystem</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='startupPolicy'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>default</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>mandatory</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>requisite</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>optional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='subsysType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pci</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>scsi</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='capsType'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='pciBackend'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </hostdev>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <rng supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-non-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>random</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>egd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>builtin</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </rng>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <filesystem supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='driverType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>path</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>handle</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtiofs</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </filesystem>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <tpm supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tpm-tis</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tpm-crb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>emulator</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>external</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendVersion'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>2.0</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </tpm>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <redirdev supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='bus'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </redirdev>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <channel supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pty</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>unix</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </channel>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <crypto supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>qemu</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>builtin</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </crypto>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <interface supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>default</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>passt</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </interface>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <panic supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>isa</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>hyperv</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </panic>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <console supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>null</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vc</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pty</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>dev</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>file</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pipe</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>stdio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>udp</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tcp</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>unix</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>qemu-vdagent</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>dbus</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </console>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </devices>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <gic supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <vmcoreinfo supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <genid supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <backingStoreInput supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <backup supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <async-teardown supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <s390-pv supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <ps2 supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <tdx supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <sev supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <sgx supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <hyperv supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='features'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>relaxed</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vapic</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>spinlocks</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vpindex</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>runtime</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>synic</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>stimer</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>reset</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vendor_id</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>frequencies</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>reenlightenment</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tlbflush</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>ipi</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>avic</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>emsr_bitmap</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>xmm_input</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <defaults>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <spinlocks>4095</spinlocks>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <stimer_direct>on</stimer_direct>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </defaults>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </hyperv>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <launchSecurity supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: </domainCapabilities>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.873 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.879 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: <domainCapabilities>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <domain>kvm</domain>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <arch>x86_64</arch>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <vcpu max='1024'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <iothreads supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <os supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <enum name='firmware'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>efi</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <loader supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>rom</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pflash</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='readonly'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>yes</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>no</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='secure'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>yes</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>no</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </loader>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </os>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <cpu>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>on</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>off</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='maximum' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='maximumMigratable'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>on</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>off</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='host-model' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <vendor>AMD</vendor>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='x2apic'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='stibp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='ssbd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='succor'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='lbrv'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <mode name='custom' supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='ClearwaterForest'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ddpd-u'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sha512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm3'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ddpd-u'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sha512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm3'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sm4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Dhyana-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Turin'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbpb'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbpb'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-128'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-256'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-128'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-256'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-512'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='KnightsMill'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512er'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512pf'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='KnightsMill-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512er'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512pf'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tbm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tbm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v2'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v3'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v4'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='athlon'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='athlon-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='core2duo'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='core2duo-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='coreduo'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='coreduo-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='n270'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='n270-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='phenom'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <blockers model='phenom-v1'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </cpu>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <memoryBacking supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <enum name='sourceType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>file</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>anonymous</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <value>memfd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </memoryBacking>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <devices>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <disk supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='diskDevice'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>disk</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>cdrom</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>floppy</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>lun</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='bus'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>fdc</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>scsi</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>sata</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-non-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </disk>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <graphics supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vnc</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>egl-headless</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>dbus</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </graphics>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <video supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='modelType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vga</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>cirrus</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>none</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>bochs</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>ramfb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </video>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <hostdev supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='mode'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>subsystem</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='startupPolicy'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>default</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>mandatory</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>requisite</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>optional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='subsysType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pci</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>scsi</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='capsType'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='pciBackend'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </hostdev>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <rng supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtio-non-transitional</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>random</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>egd</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>builtin</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </rng>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <filesystem supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='driverType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>path</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>handle</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>virtiofs</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </filesystem>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <tpm supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tpm-tis</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tpm-crb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>emulator</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>external</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendVersion'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>2.0</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </tpm>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <redirdev supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='bus'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </redirdev>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <channel supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pty</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>unix</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </channel>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <crypto supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>qemu</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>builtin</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </crypto>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <interface supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='backendType'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>default</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>passt</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </interface>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <panic supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>isa</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>hyperv</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </panic>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <console supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>null</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vc</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pty</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>dev</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>file</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>pipe</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>stdio</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>udp</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tcp</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>unix</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>qemu-vdagent</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>dbus</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </console>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </devices>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   <features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <gic supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <vmcoreinfo supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <genid supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <backingStoreInput supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <backup supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <async-teardown supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <s390-pv supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <ps2 supported='yes'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <tdx supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <sev supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <sgx supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <hyperv supported='yes'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <enum name='features'>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>relaxed</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vapic</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>spinlocks</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vpindex</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>runtime</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>synic</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>stimer</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>reset</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>vendor_id</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>frequencies</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>reenlightenment</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>tlbflush</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>ipi</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>avic</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>emsr_bitmap</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <value>xmm_input</value>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       <defaults>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <spinlocks>4095</spinlocks>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <stimer_direct>on</stimer_direct>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:       </defaults>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     </hyperv>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:     <launchSecurity supported='no'/>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:   </features>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]: </domainCapabilities>
Feb 01 09:23:58 np0005604215.localdomain nova_compute[225585]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:58.944 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: <domainCapabilities>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   <domain>kvm</domain>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   <arch>x86_64</arch>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   <vcpu max='240'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   <iothreads supported='yes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   <os supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <enum name='firmware'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <loader supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>rom</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>pflash</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='readonly'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>yes</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>no</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='secure'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>no</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </loader>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   </os>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   <cpu>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>on</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>off</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <mode name='maximum' supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='maximumMigratable'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>on</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>off</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <mode name='host-model' supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <vendor>AMD</vendor>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='x2apic'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='stibp'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='ssbd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='succor'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='ibrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='lbrv'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <mode name='custom' supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Broadwell-v4'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='ClearwaterForest'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ddpd-u'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sha512'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sm3'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sm4'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ddpd-u'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sha512'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sm3'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sm4'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Cooperlake-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Denverton-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Dhyana-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Turin'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbpb'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amd-psfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='auto-ibrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibpb-brtype'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='no-nested-data-bp'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='null-sel-clr-base'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='perfmon-v2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbpb'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='stibp-always-on'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v4'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='EPYC-v5'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-128'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-256'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-512'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-128'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-256'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx10-512'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='prefetchiti'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-IBRS'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-noTSX'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Haswell-v4'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='IvyBridge-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='KnightsMill'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512er'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512pf'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='KnightsMill-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4fmaps'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-4vnniw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512er'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512pf'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G4'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G5'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='tbm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fma4'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='tbm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xop'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='amx-tile'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-bf16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-fp16'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bitalg'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vbmi2'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrc'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fzrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='la57'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='taa-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='tsx-ldtrk'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='SierraForest-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ifma'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-ne-convert'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx-vnni-int8'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bhi-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='bus-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cmpccxadd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fbsdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='fsrs'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gds-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ibrs-all'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='intel-psfd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ipred-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='lam'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mcdt-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pbrsb-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='psdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rfds-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rrsba-ctrl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='serialize'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vaes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='vpclmulqdq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='hle'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='rtm'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512bw'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512cd'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512dq'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512f'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='avx512vl'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='invpcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pcid'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='pku'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='mpx'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v2'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v3'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='core-capability'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='split-lock-detect'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='Snowridge-v4'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='cldemote'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='erms'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='gfni'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdir64b'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='movdiri'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='xsaves'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='athlon'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='athlon-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='core2duo'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='core2duo-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='coreduo'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='coreduo-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='n270'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='n270-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='ss'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='phenom'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <blockers model='phenom-v1'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnow'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <feature name='3dnowext'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </blockers>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </mode>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   </cpu>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   <memoryBacking supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <enum name='sourceType'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <value>file</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <value>anonymous</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <value>memfd</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   </memoryBacking>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   <devices>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <disk supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='diskDevice'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>disk</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>cdrom</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>floppy</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>lun</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='bus'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>ide</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>fdc</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>scsi</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>sata</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>virtio-transitional</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>virtio-non-transitional</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </disk>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <graphics supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>vnc</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>egl-headless</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>dbus</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </graphics>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <video supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='modelType'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>vga</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>cirrus</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>none</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>bochs</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>ramfb</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </video>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <hostdev supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='mode'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>subsystem</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='startupPolicy'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>default</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>mandatory</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>requisite</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>optional</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='subsysType'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>pci</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>scsi</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='capsType'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='pciBackend'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </hostdev>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <rng supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>virtio</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>virtio-transitional</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>virtio-non-transitional</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>random</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>egd</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>builtin</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </rng>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <filesystem supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='driverType'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>path</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>handle</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>virtiofs</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </filesystem>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <tpm supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>tpm-tis</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>tpm-crb</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>emulator</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>external</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='backendVersion'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>2.0</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </tpm>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <redirdev supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='bus'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>usb</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </redirdev>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <channel supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>pty</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>unix</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </channel>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <crypto supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='model'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>qemu</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='backendModel'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>builtin</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </crypto>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <interface supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='backendType'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>default</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>passt</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </interface>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <panic supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='model'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>isa</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>hyperv</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </panic>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <console supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='type'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>null</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>vc</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>pty</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>dev</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>file</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>pipe</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>stdio</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>udp</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>tcp</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>unix</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>qemu-vdagent</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>dbus</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </console>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   </devices>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   <features>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <gic supported='no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <vmcoreinfo supported='yes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <genid supported='yes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <backingStoreInput supported='yes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <backup supported='yes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <async-teardown supported='yes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <s390-pv supported='no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <ps2 supported='yes'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <tdx supported='no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <sev supported='no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <sgx supported='no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <hyperv supported='yes'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <enum name='features'>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>relaxed</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>vapic</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>spinlocks</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>vpindex</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>runtime</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>synic</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>stimer</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>reset</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>vendor_id</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>frequencies</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>reenlightenment</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>tlbflush</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>ipi</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>avic</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>emsr_bitmap</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <value>xmm_input</value>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </enum>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       <defaults>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <spinlocks>4095</spinlocks>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <stimer_direct>on</stimer_direct>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:       </defaults>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     </hyperv>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:     <launchSecurity supported='no'/>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:   </features>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: </domainCapabilities>
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.004 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.005 225589 INFO nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Secure Boot support detected
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.008 225589 INFO nova.virt.libvirt.driver [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.020 225589 DEBUG nova.virt.libvirt.driver [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.057 225589 INFO nova.virt.node [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Determined node identity d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from /var/lib/nova/compute_id
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.081 225589 DEBUG nova.compute.manager [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Verified node d5eeed9a-e4d0-4244-8d4e-39e5c8263590 matches my host np0005604215.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.106 225589 INFO nova.compute.manager [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.187 225589 DEBUG oslo_concurrency.lockutils [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.187 225589 DEBUG oslo_concurrency.lockutils [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.188 225589 DEBUG oslo_concurrency.lockutils [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.188 225589 DEBUG nova.compute.resource_tracker [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.189 225589 DEBUG oslo_concurrency.processutils [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:23:59 np0005604215.localdomain rsyslogd[760]: imjournal from <localhost:nova_compute>: begin to drop messages due to rate-limiting
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.643 225589 DEBUG oslo_concurrency.processutils [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.876 225589 WARNING nova.virt.libvirt.driver [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.878 225589 DEBUG nova.compute.resource_tracker [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=13587MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.878 225589 DEBUG oslo_concurrency.lockutils [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:23:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:23:59.879 225589 DEBUG oslo_concurrency.lockutils [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:24:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64880 DF PROTO=TCP SPT=43260 DPT=9882 SEQ=3365508901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D6AEC0000000001030307) 
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.076 225589 DEBUG nova.compute.resource_tracker [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.076 225589 DEBUG nova.compute.resource_tracker [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.163 225589 DEBUG nova.scheduler.client.report [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.194 225589 DEBUG nova.scheduler.client.report [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.195 225589 DEBUG nova.compute.provider_tree [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.221 225589 DEBUG nova.scheduler.client.report [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.243 225589 DEBUG nova.scheduler.client.report [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: HW_CPU_X86_BMI,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,HW_CPU_X86_AESNI,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,HW_CPU_X86_ABM,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.263 225589 DEBUG oslo_concurrency.processutils [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.698 225589 DEBUG oslo_concurrency.processutils [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.434s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.704 225589 DEBUG nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.704 225589 INFO nova.virt.libvirt.host [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] kernel doesn't support AMD SEV
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.706 225589 DEBUG nova.compute.provider_tree [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.706 225589 DEBUG nova.virt.libvirt.driver [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.734 225589 DEBUG nova.scheduler.client.report [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.782 225589 DEBUG nova.compute.resource_tracker [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.782 225589 DEBUG oslo_concurrency.lockutils [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.904s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.783 225589 DEBUG nova.service [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.813 225589 DEBUG nova.service [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 01 09:24:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:00.814 225589 DEBUG nova.servicegroup.drivers.db [None req-c8935b5f-8258-4f7b-b734-d073ac8ab747 - - - - - -] DB_Driver: join new ServiceGroup member np0005604215.localdomain to the compute group, service = <Service: host=np0005604215.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 01 09:24:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64881 DF PROTO=TCP SPT=43260 DPT=9882 SEQ=3365508901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D6F0E0000000001030307) 
Feb 01 09:24:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64882 DF PROTO=TCP SPT=43260 DPT=9882 SEQ=3365508901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D770E0000000001030307) 
Feb 01 09:24:04 np0005604215.localdomain sshd[225880]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:24:04 np0005604215.localdomain sshd[225880]: Accepted publickey for zuul from 192.168.122.30 port 46188 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:24:04 np0005604215.localdomain systemd-logind[761]: New session 55 of user zuul.
Feb 01 09:24:04 np0005604215.localdomain systemd[1]: Started Session 55 of User zuul.
Feb 01 09:24:04 np0005604215.localdomain sshd[225880]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:24:05 np0005604215.localdomain python3.9[225991]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:24:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44571 DF PROTO=TCP SPT=41572 DPT=9102 SEQ=2997852286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D848D0000000001030307) 
Feb 01 09:24:07 np0005604215.localdomain sudo[226103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qfpeujcblsoafovuasgrvpuwhkhqpket ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937846.4973705-65-251715988459981/AnsiballZ_systemd_service.py
Feb 01 09:24:07 np0005604215.localdomain sudo[226103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:07 np0005604215.localdomain python3.9[226105]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:24:07 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:24:07 np0005604215.localdomain systemd-rc-local-generator[226130]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:24:07 np0005604215.localdomain systemd-sysv-generator[226135]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:24:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:24:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:07 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:07 np0005604215.localdomain sudo[226103]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:09 np0005604215.localdomain python3.9[226249]: ansible-ansible.builtin.service_facts Invoked
Feb 01 09:24:09 np0005604215.localdomain network[226266]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 09:24:09 np0005604215.localdomain network[226267]: 'network-scripts' will be removed from distribution in near future.
Feb 01 09:24:09 np0005604215.localdomain network[226268]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 09:24:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10254 DF PROTO=TCP SPT=46976 DPT=9100 SEQ=1074009599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D910E0000000001030307) 
Feb 01 09:24:10 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:24:13 np0005604215.localdomain sudo[226499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-krhjgnxxhsetgfgvgywrzunfcooisthd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937852.8140898-123-52607617778425/AnsiballZ_systemd_service.py
Feb 01 09:24:13 np0005604215.localdomain sudo[226499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35823 DF PROTO=TCP SPT=43866 DPT=9101 SEQ=1363542700 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65D9E0D0000000001030307) 
Feb 01 09:24:13 np0005604215.localdomain python3.9[226501]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:24:14 np0005604215.localdomain sudo[226499]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:15 np0005604215.localdomain sudo[226610]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rchmnwnvuthuzwehwdglrezkwpnjwlnl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937854.8297734-153-174295520009524/AnsiballZ_file.py
Feb 01 09:24:15 np0005604215.localdomain sudo[226610]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64884 DF PROTO=TCP SPT=43260 DPT=9882 SEQ=3365508901 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65DA70D0000000001030307) 
Feb 01 09:24:15 np0005604215.localdomain python3.9[226612]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:24:15 np0005604215.localdomain sudo[226610]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:15 np0005604215.localdomain systemd-journald[47940]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 76.3 (254 of 333 items), suggesting rotation.
Feb 01 09:24:15 np0005604215.localdomain systemd-journald[47940]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 01 09:24:15 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 09:24:15 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 09:24:15 np0005604215.localdomain sudo[226721]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lfbxnyagwwcmvmpddifjgmxhahaifsry ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937855.6570725-176-79523333320383/AnsiballZ_file.py
Feb 01 09:24:15 np0005604215.localdomain sudo[226721]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:16 np0005604215.localdomain python3.9[226723]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:24:16 np0005604215.localdomain sudo[226721]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:24:16 np0005604215.localdomain sudo[226842]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-paqfmoxlcfjuileljsgchbqtshsmotcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937856.4782162-204-179550261492196/AnsiballZ_command.py
Feb 01 09:24:16 np0005604215.localdomain sudo[226842]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:16 np0005604215.localdomain podman[226811]: 2026-02-01 09:24:16.885217472 +0000 UTC m=+0.090557261 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Feb 01 09:24:16 np0005604215.localdomain podman[226811]: 2026-02-01 09:24:16.959587082 +0000 UTC m=+0.164926891 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:24:16 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:24:17 np0005604215.localdomain python3.9[226849]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:24:17 np0005604215.localdomain sudo[226842]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:17 np0005604215.localdomain sshd[226914]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:24:17 np0005604215.localdomain python3.9[226968]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 01 09:24:17 np0005604215.localdomain sudo[226986]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:24:17 np0005604215.localdomain sudo[226986]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:24:17 np0005604215.localdomain sudo[226986]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:18 np0005604215.localdomain sudo[227004]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 09:24:18 np0005604215.localdomain sudo[227004]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:24:18 np0005604215.localdomain sshd[226914]: Invalid user celery from 85.206.171.113 port 58130
Feb 01 09:24:18 np0005604215.localdomain sshd[226914]: Received disconnect from 85.206.171.113 port 58130:11: Bye Bye [preauth]
Feb 01 09:24:18 np0005604215.localdomain sshd[226914]: Disconnected from invalid user celery 85.206.171.113 port 58130 [preauth]
Feb 01 09:24:18 np0005604215.localdomain sudo[227004]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:18 np0005604215.localdomain sudo[227140]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-epmatyxrfzlkfskusxzxruewvgvvttdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937858.2628605-258-93446835044749/AnsiballZ_systemd_service.py
Feb 01 09:24:18 np0005604215.localdomain sudo[227129]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:24:18 np0005604215.localdomain sudo[227129]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:24:18 np0005604215.localdomain sudo[227140]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:18 np0005604215.localdomain sudo[227129]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:18 np0005604215.localdomain sudo[227155]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:24:18 np0005604215.localdomain sudo[227155]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:24:18 np0005604215.localdomain python3.9[227154]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:24:18 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:24:18 np0005604215.localdomain systemd-sysv-generator[227212]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:24:18 np0005604215.localdomain systemd-rc-local-generator[227206]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:24:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44573 DF PROTO=TCP SPT=41572 DPT=9102 SEQ=2997852286 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65DB50E0000000001030307) 
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:19 np0005604215.localdomain sudo[227140]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:19 np0005604215.localdomain sudo[227155]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: tmp-crun.HHFcVX.mount: Deactivated successfully.
Feb 01 09:24:19 np0005604215.localdomain podman[227258]: 2026-02-01 09:24:19.873000786 +0000 UTC m=+0.081923802 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 01 09:24:19 np0005604215.localdomain podman[227258]: 2026-02-01 09:24:19.882553818 +0000 UTC m=+0.091476824 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 01 09:24:19 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:24:20 np0005604215.localdomain sudo[227278]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:24:20 np0005604215.localdomain sudo[227278]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:24:20 np0005604215.localdomain sudo[227278]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:20 np0005604215.localdomain sudo[227386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azromlfggopodjwyetbhloetcxgtczyq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937860.3258963-283-112859049626223/AnsiballZ_command.py
Feb 01 09:24:20 np0005604215.localdomain sudo[227386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:20 np0005604215.localdomain python3.9[227388]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:24:20 np0005604215.localdomain sudo[227386]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:21 np0005604215.localdomain sudo[227497]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bsdkioefqlozxqqakuqxernkjpnlvxqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937861.1194322-309-216683786809283/AnsiballZ_file.py
Feb 01 09:24:21 np0005604215.localdomain sudo[227497]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:21 np0005604215.localdomain python3.9[227499]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:24:21 np0005604215.localdomain sudo[227497]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10256 DF PROTO=TCP SPT=46976 DPT=9100 SEQ=1074009599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65DC10D0000000001030307) 
Feb 01 09:24:22 np0005604215.localdomain python3.9[227607]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:24:23 np0005604215.localdomain sudo[227717]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fnutkzmtyfhrbrxavoltdghcneptjsvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937863.1665692-357-276673801646490/AnsiballZ_group.py
Feb 01 09:24:23 np0005604215.localdomain sudo[227717]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:23 np0005604215.localdomain python3.9[227719]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Feb 01 09:24:23 np0005604215.localdomain sudo[227717]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:24 np0005604215.localdomain sudo[227827]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mkidzlfkuwgkkxfayalkjetpqeawpxdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937864.306285-390-160298511479207/AnsiballZ_getent.py
Feb 01 09:24:24 np0005604215.localdomain sudo[227827]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:24 np0005604215.localdomain python3.9[227829]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Feb 01 09:24:24 np0005604215.localdomain sudo[227827]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:25 np0005604215.localdomain sudo[227938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwiygalwmxqngjgaddkqbtmpaurnwlct ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937865.100529-413-4262211533996/AnsiballZ_group.py
Feb 01 09:24:25 np0005604215.localdomain sudo[227938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35825 DF PROTO=TCP SPT=43866 DPT=9101 SEQ=1363542700 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65DCF0E0000000001030307) 
Feb 01 09:24:25 np0005604215.localdomain python3.9[227940]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Feb 01 09:24:25 np0005604215.localdomain groupadd[227941]: group added to /etc/group: name=ceilometer, GID=42405
Feb 01 09:24:25 np0005604215.localdomain groupadd[227941]: group added to /etc/gshadow: name=ceilometer
Feb 01 09:24:25 np0005604215.localdomain groupadd[227941]: new group: name=ceilometer, GID=42405
Feb 01 09:24:25 np0005604215.localdomain sudo[227938]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:26 np0005604215.localdomain sudo[228054]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xptmohcgbfqttkntnspnjfpeexdslwzk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937865.969808-437-88217026291516/AnsiballZ_user.py
Feb 01 09:24:26 np0005604215.localdomain sudo[228054]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:26 np0005604215.localdomain python3.9[228056]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005604215.localdomain update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Feb 01 09:24:26 np0005604215.localdomain useradd[228058]: new user: name=ceilometer, UID=42405, GID=42405, home=/home/ceilometer, shell=/sbin/nologin, from=/dev/pts/0
Feb 01 09:24:26 np0005604215.localdomain useradd[228058]: add 'ceilometer' to group 'libvirt'
Feb 01 09:24:26 np0005604215.localdomain useradd[228058]: add 'ceilometer' to shadow group 'libvirt'
Feb 01 09:24:26 np0005604215.localdomain sudo[228054]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:28 np0005604215.localdomain python3.9[228172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:24:28 np0005604215.localdomain python3.9[228258]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769937867.8488643-516-70654262693965/.source.conf _original_basename=ceilometer.conf follow=False checksum=250a83fc93bccf58e8e11a3be34df4cfb645c6cf backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:24:29 np0005604215.localdomain python3.9[228366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:24:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50821 DF PROTO=TCP SPT=41970 DPT=9882 SEQ=875914429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65DE01D0000000001030307) 
Feb 01 09:24:30 np0005604215.localdomain python3.9[228452]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769937869.1138768-516-183968801404259/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:24:30 np0005604215.localdomain python3.9[228560]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:24:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50822 DF PROTO=TCP SPT=41970 DPT=9882 SEQ=875914429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65DE40D0000000001030307) 
Feb 01 09:24:31 np0005604215.localdomain python3.9[228646]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1769937870.3247418-516-201600494908888/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:24:32 np0005604215.localdomain python3.9[228754]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:24:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50823 DF PROTO=TCP SPT=41970 DPT=9882 SEQ=875914429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65DEC0D0000000001030307) 
Feb 01 09:24:33 np0005604215.localdomain python3.9[228862]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:24:34 np0005604215.localdomain python3.9[228970]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:24:35 np0005604215.localdomain python3.9[229056]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937874.1685665-693-103109515580361/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=79e045f9054a8a551a608acb835c77465f56f6ed backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:24:35 np0005604215.localdomain python3.9[229164]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:24:36 np0005604215.localdomain python3.9[229250]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/openstack_network_exporter.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937875.3028827-693-92539809831754/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=e2858327749c09c7b8ca5fc97985d7885b95bd4b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:24:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45003 DF PROTO=TCP SPT=50424 DPT=9102 SEQ=3800240860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65DF9CE0000000001030307) 
Feb 01 09:24:36 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:36.816 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:24:36 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:36.836 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:24:37 np0005604215.localdomain python3.9[229358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:24:37 np0005604215.localdomain python3.9[229444]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/telemetry/firewall.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937876.6654112-780-160224977856085/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:24:38 np0005604215.localdomain python3.9[229552]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:24:39 np0005604215.localdomain python3.9[229662]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:24:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43300 DF PROTO=TCP SPT=57556 DPT=9100 SEQ=3258856151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E064D0000000001030307) 
Feb 01 09:24:39 np0005604215.localdomain python3.9[229770]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:24:40 np0005604215.localdomain sudo[229878]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bdaiksepkerfujelrofwcscmzsrmzacb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937880.1662278-906-77371326509850/AnsiballZ_file.py
Feb 01 09:24:40 np0005604215.localdomain sudo[229878]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:40 np0005604215.localdomain python3.9[229880]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:24:40 np0005604215.localdomain sudo[229878]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:41 np0005604215.localdomain sudo[229988]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uywyogcqhvwayrynjgpudzdmsygiotmx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937880.866788-930-10760080120083/AnsiballZ_systemd_service.py
Feb 01 09:24:41 np0005604215.localdomain sudo[229988]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:41 np0005604215.localdomain python3.9[229990]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:24:41 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:24:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:24:41.740 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:24:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:24:41.741 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:24:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:24:41.741 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:24:41 np0005604215.localdomain systemd-sysv-generator[230020]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:24:41 np0005604215.localdomain systemd-rc-local-generator[230015]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:24:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:24:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:41 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:24:42 np0005604215.localdomain systemd[1]: Listening on Podman API Socket.
Feb 01 09:24:42 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=35826 DF PROTO=TCP SPT=43866 DPT=9101 SEQ=1363542700 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E0F0E0000000001030307) 
Feb 01 09:24:42 np0005604215.localdomain sudo[229988]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:44 np0005604215.localdomain sudo[230138]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dijqtpgbabxfvafbhrzeeikqaxzhjafy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937883.498662-956-101402278803149/AnsiballZ_stat.py
Feb 01 09:24:44 np0005604215.localdomain sudo[230138]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:44 np0005604215.localdomain python3.9[230140]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:24:44 np0005604215.localdomain sudo[230138]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:45 np0005604215.localdomain sudo[230226]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-forlautbduvgozenadpjcefmkaxrnitl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937883.498662-956-101402278803149/AnsiballZ_copy.py
Feb 01 09:24:45 np0005604215.localdomain sudo[230226]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:45 np0005604215.localdomain python3.9[230228]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937883.498662-956-101402278803149/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:24:45 np0005604215.localdomain sudo[230226]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:45 np0005604215.localdomain sudo[230281]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qhclwrltsvjrkhlmeqkhpubvlwjqsvdt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937883.498662-956-101402278803149/AnsiballZ_stat.py
Feb 01 09:24:45 np0005604215.localdomain sudo[230281]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50825 DF PROTO=TCP SPT=41970 DPT=9882 SEQ=875914429 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E1D0D0000000001030307) 
Feb 01 09:24:45 np0005604215.localdomain python3.9[230283]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:24:45 np0005604215.localdomain sudo[230281]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:46 np0005604215.localdomain sudo[230369]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lqkprdodpddwkhcgtsjfsvjfleamadgx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937883.498662-956-101402278803149/AnsiballZ_copy.py
Feb 01 09:24:46 np0005604215.localdomain sudo[230369]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:46 np0005604215.localdomain python3.9[230371]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937883.498662-956-101402278803149/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:24:46 np0005604215.localdomain sudo[230369]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:24:47 np0005604215.localdomain podman[230389]: 2026-02-01 09:24:47.866714821 +0000 UTC m=+0.083016059 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Feb 01 09:24:47 np0005604215.localdomain podman[230389]: 2026-02-01 09:24:47.93010847 +0000 UTC m=+0.146409708 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Feb 01 09:24:47 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:24:48 np0005604215.localdomain sudo[230505]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puntwgxqfydevbwyxvctyenmphtrigas ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937887.9422305-1052-166273673019289/AnsiballZ_file.py
Feb 01 09:24:48 np0005604215.localdomain sudo[230505]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:48 np0005604215.localdomain python3.9[230507]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:24:48 np0005604215.localdomain sudo[230505]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45005 DF PROTO=TCP SPT=50424 DPT=9102 SEQ=3800240860 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E290D0000000001030307) 
Feb 01 09:24:48 np0005604215.localdomain sudo[230615]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-txemisjereuyqbzdgjvjbwehlxvrfypv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937888.6043649-1076-184054065036995/AnsiballZ_file.py
Feb 01 09:24:48 np0005604215.localdomain sudo[230615]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:49 np0005604215.localdomain python3.9[230617]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:24:49 np0005604215.localdomain sudo[230615]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:49 np0005604215.localdomain sudo[230725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-exudkesnaynqhklgkeoguwbmyjjollzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937889.3157613-1100-222188553158619/AnsiballZ_stat.py
Feb 01 09:24:49 np0005604215.localdomain sudo[230725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:49 np0005604215.localdomain python3.9[230727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:24:49 np0005604215.localdomain sudo[230725]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:50 np0005604215.localdomain sudo[230815]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkhwhrzohnpyqpgggbtpomhkdocpvepp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937889.3157613-1100-222188553158619/AnsiballZ_copy.py
Feb 01 09:24:50 np0005604215.localdomain sudo[230815]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:24:50 np0005604215.localdomain systemd[1]: tmp-crun.vaf7mM.mount: Deactivated successfully.
Feb 01 09:24:50 np0005604215.localdomain podman[230818]: 2026-02-01 09:24:50.265376385 +0000 UTC m=+0.081454446 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:24:50 np0005604215.localdomain podman[230818]: 2026-02-01 09:24:50.295346434 +0000 UTC m=+0.111424495 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:24:50 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:24:50 np0005604215.localdomain python3.9[230817]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937889.3157613-1100-222188553158619/.source.json _original_basename=.h0rzsias follow=False checksum=ce2b0c83293a970bafffa087afa083dd7c93a79c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:24:50 np0005604215.localdomain sudo[230815]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:51 np0005604215.localdomain python3.9[230943]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:24:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43302 DF PROTO=TCP SPT=57556 DPT=9100 SEQ=3258856151 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E370D0000000001030307) 
Feb 01 09:24:53 np0005604215.localdomain sudo[231245]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lwrnzbgoenyacatvibyxzxsfxdugklyg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937893.0205286-1222-278275444699043/AnsiballZ_container_config_data.py
Feb 01 09:24:53 np0005604215.localdomain sudo[231245]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:53 np0005604215.localdomain python3.9[231247]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_pattern=*.json debug=False
Feb 01 09:24:53 np0005604215.localdomain sudo[231245]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:54 np0005604215.localdomain sudo[231355]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vyuvxhdcuohwyqjloinkqmxjwddgxyjc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937894.1331913-1255-146581093128223/AnsiballZ_container_config_hash.py
Feb 01 09:24:54 np0005604215.localdomain sudo[231355]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:54 np0005604215.localdomain python3.9[231357]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:24:54 np0005604215.localdomain sudo[231355]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51057 DF PROTO=TCP SPT=37266 DPT=9101 SEQ=3722363124 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E430D0000000001030307) 
Feb 01 09:24:55 np0005604215.localdomain sudo[231465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgvzjeyxxdqksasrwvmcbruwlvgxlwbc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937895.1460783-1283-122888560976253/AnsiballZ_edpm_container_manage.py
Feb 01 09:24:55 np0005604215.localdomain sudo[231465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:55 np0005604215.localdomain python3[231467]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ceilometer_agent_compute config_id=ceilometer_agent_compute config_overrides={} config_patterns=*.json containers=['ceilometer_agent_compute'] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:24:56 np0005604215.localdomain python3[231467]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "be811c7ef606e5fdf21f4bb60e867487043c4ca0ef316c864692549ee6c1c369",
                                                                    "Digest": "sha256:ac1f7272c172d96937d32067aeabcc7fe133ed3e13c60a2317e815e24d8d2689",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute@sha256:ac1f7272c172d96937d32067aeabcc7fe133ed3e13c60a2317e815e24d8d2689"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:22:47.562315026Z",
                                                                    "Config": {
                                                                         "User": "root",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 506512639,
                                                                    "VirtualSize": 506512639,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/c649efc911c887686c8351fe543502de582148a048396cbc7ad85b29ea075fe6/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",
                                                                              "sha256:439ba2a9156018a21d5d8f457e8fb5fa9d39d0de094f0cf38abf8f5215170cd7",
                                                                              "sha256:dd5ae5ce1d5c4d01e233915d61f7cac1450768a920fde6603b0c84bf26180c44"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "root",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:12:56.089921987Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:34.524252589Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:37.262239859Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:15:04.692187463Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:15:07.73027664Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage ceilometer",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:15:50.46772776Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:15:52.957817153Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:22:08.791988588Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-ceilometer-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:22:47.559747806Z",
                                                                              "created_by": "/bin/sh -c dnf -y install openstack-ceilometer-compute && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:22:51.022505453Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 01 09:24:56 np0005604215.localdomain podman[231520]: 2026-02-01 09:24:56.2791754 +0000 UTC m=+0.086987738 container remove 35d058b219b984b777df5c2dfe4ed6791ef0c95fff9166d87128d4b7e0ad7ea9 (image=registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1, name=ceilometer_agent_compute, io.openshift.expose-services=, managed_by=tripleo_ansible, release=1766032510, cpe=cpe:/a:redhat:openstack:17.1::el9, konflux.additional-tags=17.1.13 17.1_20260112.1, io.k8s.display-name=Red Hat OpenStack Platform 17.1 ceilometer-compute, org.opencontainers.image.created=2026-01-12T23:07:47Z, distribution-scope=public, architecture=x86_64, io.buildah.version=1.41.5, vendor=Red Hat, Inc., config_data={'depends_on': ['tripleo_nova_libvirt.target'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'TRIPLEO_CONFIG_HASH': '63e53a2f3cd2422147592f2c2c6c2f61'}, 'healthcheck': {'test': '/openstack/healthcheck'}, 'image': 'registry.redhat.io/rhosp-rhel9/openstack-ceilometer-compute:17.1', 'net': 'host', 'privileged': False, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/etc/puppet:/etc/puppet:ro', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/config-data/puppet-generated/ceilometer:/var/lib/kolla/config_files/src:ro', '/run/libvirt:/run/libvirt:shared,z', '/var/log/containers/ceilometer:/var/log/ceilometer:z']}, url=https://www.redhat.com, baseimage=registry.redhat.io/rhel9-2-els/rhel:9.2@sha256:746ff7bb175b780f02d72aa0177aa6d6ba4ebbee4456d01e05d465c1515e02b0, io.openshift.tags=rhosp osp openstack osp-17.1 openstack-ceilometer-compute, config_id=tripleo_step4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, batch=17.1_20260112.1, org.opencontainers.image.revision=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, build-date=2026-01-12T23:07:47Z, version=17.1.13, name=rhosp-rhel9/openstack-ceilometer-compute, container_name=ceilometer_agent_compute, io.k8s.description=Red Hat OpenStack Platform 17.1 ceilometer-compute, maintainer=OpenStack TripleO Team, com.redhat.component=openstack-ceilometer-compute-container, vcs-type=git, vcs-ref=4f6dafcdcf2b41e56d843f4a43eac1db1e9fee06, summary=Red Hat OpenStack Platform 17.1 ceilometer-compute, description=Red Hat OpenStack Platform 17.1 ceilometer-compute, tcib_managed=true)
Feb 01 09:24:56 np0005604215.localdomain python3[231467]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman rm --force ceilometer_agent_compute
Feb 01 09:24:56 np0005604215.localdomain podman[231534]: 
Feb 01 09:24:56 np0005604215.localdomain podman[231534]: 2026-02-01 09:24:56.381118056 +0000 UTC m=+0.084351498 container create 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 01 09:24:56 np0005604215.localdomain podman[231534]: 2026-02-01 09:24:56.341325505 +0000 UTC m=+0.044559017 image pull  quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Feb 01 09:24:56 np0005604215.localdomain python3[231467]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8 --healthcheck-command /openstack/healthcheck compute --label config_id=ceilometer_agent_compute --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z --volume /var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Feb 01 09:24:56 np0005604215.localdomain sudo[231465]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:57 np0005604215.localdomain sudo[231677]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nxllslnngogcwlmrybzdlgyxiqrzhwpn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937897.0920825-1307-176029071987062/AnsiballZ_stat.py
Feb 01 09:24:57 np0005604215.localdomain sudo[231677]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:57 np0005604215.localdomain python3.9[231679]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:24:57 np0005604215.localdomain sudo[231677]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:57.997 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:24:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:57.997 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:24:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:57.998 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:24:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:57.998 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.008 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.008 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.009 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.009 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.009 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.010 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.010 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.010 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.011 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.035 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.035 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.036 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.036 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.037 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.500 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.675 225589 WARNING nova.virt.libvirt.driver [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.676 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=13578MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.676 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.676 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:24:58 np0005604215.localdomain sudo[231811]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqcplojnrijluxiyttjaztofcqklorre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937898.0082505-1334-67457765630233/AnsiballZ_file.py
Feb 01 09:24:58 np0005604215.localdomain sudo[231811]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.760 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.760 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:24:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:58.795 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:24:58 np0005604215.localdomain python3.9[231813]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:24:58 np0005604215.localdomain sudo[231811]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:59 np0005604215.localdomain sudo[231886]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lcdnlevjramqhjizxqjhopzutlfzpqsr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937898.0082505-1334-67457765630233/AnsiballZ_stat.py
Feb 01 09:24:59 np0005604215.localdomain sudo[231886]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:24:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:59.228 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.432s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:24:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:59.233 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:24:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:59.260 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:24:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:59.263 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:24:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:24:59.263 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.587s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:24:59 np0005604215.localdomain python3.9[231889]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:24:59 np0005604215.localdomain sudo[231886]: pam_unix(sudo:session): session closed for user root
Feb 01 09:24:59 np0005604215.localdomain sudo[231997]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-golscpezzacyrnuakpcsozrjvizowcic ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937899.4842644-1334-4889680573573/AnsiballZ_copy.py
Feb 01 09:24:59 np0005604215.localdomain sudo[231997]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50607 DF PROTO=TCP SPT=60164 DPT=9882 SEQ=2683988532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E554D0000000001030307) 
Feb 01 09:25:00 np0005604215.localdomain python3.9[231999]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769937899.4842644-1334-4889680573573/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:00 np0005604215.localdomain sudo[231997]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:00 np0005604215.localdomain sudo[232052]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ahpuglsvzrnxwywphoodfasrkyutncms ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937899.4842644-1334-4889680573573/AnsiballZ_systemd.py
Feb 01 09:25:00 np0005604215.localdomain sudo[232052]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:00 np0005604215.localdomain python3.9[232054]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:25:00 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:25:00 np0005604215.localdomain systemd-sysv-generator[232084]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:25:00 np0005604215.localdomain systemd-rc-local-generator[232081]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:25:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:25:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:00 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50608 DF PROTO=TCP SPT=60164 DPT=9882 SEQ=2683988532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E594D0000000001030307) 
Feb 01 09:25:01 np0005604215.localdomain sudo[232052]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:01 np0005604215.localdomain sudo[232142]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ljyqgjjiutxvkmwdbyhmxxqslamyzaog ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937899.4842644-1334-4889680573573/AnsiballZ_systemd.py
Feb 01 09:25:01 np0005604215.localdomain sudo[232142]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:01 np0005604215.localdomain python3.9[232144]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:25:01 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:25:01 np0005604215.localdomain systemd-sysv-generator[232178]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:25:01 np0005604215.localdomain systemd-rc-local-generator[232175]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:25:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:25:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:01 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:02 np0005604215.localdomain systemd[1]: Starting ceilometer_agent_compute container...
Feb 01 09:25:02 np0005604215.localdomain systemd[1]: tmp-crun.heGUP1.mount: Deactivated successfully.
Feb 01 09:25:02 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:25:02 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88d132f953dcb9b2677b5d3c46187de258d570d594cb67366b643070acccb340/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Feb 01 09:25:02 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/88d132f953dcb9b2677b5d3c46187de258d570d594cb67366b643070acccb340/merged/var/lib/kolla/config_files/src supports timestamps until 2038 (0x7fffffff)
Feb 01 09:25:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:25:02 np0005604215.localdomain podman[232186]: 2026-02-01 09:25:02.326311351 +0000 UTC m=+0.148554493 container init 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute)
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: + sudo -E kolla_set_configs
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: sudo: unable to send audit message: Operation not permitted
Feb 01 09:25:02 np0005604215.localdomain sudo[232206]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_set_configs
Feb 01 09:25:02 np0005604215.localdomain sudo[232206]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Feb 01 09:25:02 np0005604215.localdomain sudo[232206]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 01 09:25:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:25:02 np0005604215.localdomain podman[232186]: 2026-02-01 09:25:02.388489727 +0000 UTC m=+0.210732889 container start 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3)
Feb 01 09:25:02 np0005604215.localdomain podman[232186]: ceilometer_agent_compute
Feb 01 09:25:02 np0005604215.localdomain systemd[1]: Started ceilometer_agent_compute container.
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Validating config file
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Copying service configuration files
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Copying /var/lib/kolla/config_files/src/polling.yaml to /etc/ceilometer/polling.yaml
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Copying /var/lib/kolla/config_files/src/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Copying /var/lib/kolla/config_files/src/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: INFO:__main__:Writing out command to execute
Feb 01 09:25:02 np0005604215.localdomain sudo[232206]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:02 np0005604215.localdomain sudo[232142]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: ++ cat /run_command
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: + ARGS=
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: + sudo kolla_copy_cacerts
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: sudo: unable to send audit message: Operation not permitted
Feb 01 09:25:02 np0005604215.localdomain sudo[232221]: ceilometer : PWD=/ ; USER=root ; COMMAND=/usr/local/bin/kolla_copy_cacerts
Feb 01 09:25:02 np0005604215.localdomain sudo[232221]: pam_systemd(sudo:session): Failed to connect to system bus: No such file or directory
Feb 01 09:25:02 np0005604215.localdomain sudo[232221]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=42405)
Feb 01 09:25:02 np0005604215.localdomain sudo[232221]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: + [[ ! -n '' ]]
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: + . kolla_extend_start
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: + umask 0022
Feb 01 09:25:02 np0005604215.localdomain ceilometer_agent_compute[232200]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Feb 01 09:25:02 np0005604215.localdomain podman[232209]: 2026-02-01 09:25:02.475501035 +0000 UTC m=+0.082550783 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:25:02 np0005604215.localdomain podman[232209]: 2026-02-01 09:25:02.508704976 +0000 UTC m=+0.115754744 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:25:02 np0005604215.localdomain podman[232209]: unhealthy
Feb 01 09:25:02 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:25:02 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Failed with result 'exit-code'.
Feb 01 09:25:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50609 DF PROTO=TCP SPT=60164 DPT=9882 SEQ=2683988532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E614D0000000001030307) 
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.172 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.172 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.172 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.172 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.173 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.173 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.173 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.173 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.173 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.173 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.173 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.173 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.173 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.173 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005604215.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.174 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.175 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.176 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.177 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.178 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.178 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.178 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.178 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.178 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.178 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.178 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.178 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.178 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.178 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.178 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.179 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.180 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.180 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.180 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.180 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.180 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.180 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.180 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.180 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.180 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.180 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.181 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.181 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.181 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.181 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.181 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.181 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.181 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.181 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.181 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.181 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.182 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.183 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.184 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.185 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.186 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.186 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.186 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.186 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.186 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.186 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.186 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.186 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.186 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.186 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.203 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.204 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.205 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Feb 01 09:25:03 np0005604215.localdomain systemd[1]: tmp-crun.ZG3REt.mount: Deactivated successfully.
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.296 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.364 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.364 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.364 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.364 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.364 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.364 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.365 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.365 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.365 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.365 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.365 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.365 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.365 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.365 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.365 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.366 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.366 12 DEBUG cotyledon.oslo_config_glue [-] host                           = np0005604215.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.366 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.366 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.366 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.366 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.366 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.366 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.366 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.367 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.368 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.368 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.368 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.368 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.368 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.368 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.368 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.368 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.368 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.368 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.369 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.369 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.369 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.369 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.369 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.369 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.369 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.369 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.369 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.369 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.369 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.370 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.370 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.370 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.370 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.370 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.370 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.370 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.370 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.370 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.370 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.371 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.371 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.371 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.371 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.371 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.371 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.371 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.371 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.371 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.372 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.372 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.372 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.372 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.372 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.372 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.372 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.372 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.372 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.372 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.372 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.373 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.374 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.374 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.374 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.374 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.374 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.374 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.374 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.374 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.374 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.374 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.375 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.375 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.375 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.375 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.375 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.375 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.375 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.375 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.375 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.375 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.376 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.376 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.376 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.376 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.376 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.376 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.376 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.376 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.376 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.376 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.376 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.377 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.377 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.377 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.377 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.377 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.377 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.377 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.377 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.377 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.378 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.378 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.378 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.378 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.378 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.378 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.378 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.378 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.378 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.378 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.379 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.379 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.379 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.379 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.379 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.379 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.379 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.379 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.379 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.379 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.379 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.380 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.380 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.380 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.380 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.380 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.380 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.380 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.380 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.381 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.381 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.381 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.381 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.381 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.381 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.381 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.381 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.382 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.382 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.382 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.382 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.382 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.382 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.383 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.383 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.383 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.383 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.383 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.383 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.383 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.383 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.383 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.384 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.384 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.384 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.384 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.384 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.384 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.384 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.384 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.385 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.386 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.390 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.397 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:25:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:25:03 np0005604215.localdomain python3.9[232342]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 01 09:25:04 np0005604215.localdomain sudo[232453]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gklhviuuqwtznileztsmpbhuaovqwuva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937904.3582253-1469-29673409492341/AnsiballZ_stat.py
Feb 01 09:25:04 np0005604215.localdomain sudo[232453]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:04 np0005604215.localdomain python3.9[232455]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:25:04 np0005604215.localdomain sudo[232453]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:05 np0005604215.localdomain sudo[232543]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwpvrkvvofomutiaeyookdtbemvtnpht ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937904.3582253-1469-29673409492341/AnsiballZ_copy.py
Feb 01 09:25:05 np0005604215.localdomain sudo[232543]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:05 np0005604215.localdomain python3.9[232545]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937904.3582253-1469-29673409492341/.source.yaml _original_basename=.sbih9u14 follow=False checksum=2ea3c07ebb49f1dff5e53b919d7596cfd075b274 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:05 np0005604215.localdomain sudo[232543]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:05 np0005604215.localdomain sudo[232653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yxqxtthnjzcpotbgqdtdcbxdgmtmmptb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937905.6723943-1514-223925188779916/AnsiballZ_stat.py
Feb 01 09:25:05 np0005604215.localdomain sudo[232653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:06 np0005604215.localdomain python3.9[232655]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:25:06 np0005604215.localdomain sudo[232653]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:06 np0005604215.localdomain sudo[232741]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yirtcfblxwdkizkhdwnjxupxoqrnyqlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937905.6723943-1514-223925188779916/AnsiballZ_copy.py
Feb 01 09:25:06 np0005604215.localdomain sudo[232741]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42942 DF PROTO=TCP SPT=59920 DPT=9102 SEQ=3185429477 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E6ECD0000000001030307) 
Feb 01 09:25:06 np0005604215.localdomain python3.9[232743]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937905.6723943-1514-223925188779916/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:25:06 np0005604215.localdomain sudo[232741]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:07 np0005604215.localdomain sudo[232851]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vuiefjlivghtqldsnaufqdnjtqlclhzn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937907.4836454-1578-260814488417354/AnsiballZ_file.py
Feb 01 09:25:07 np0005604215.localdomain sudo[232851]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:07 np0005604215.localdomain python3.9[232853]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:07 np0005604215.localdomain sudo[232851]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:09 np0005604215.localdomain sudo[232961]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-blgiegwsdmdgstbgprxiwcanjmlksjcl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937908.7897346-1601-219261598305556/AnsiballZ_file.py
Feb 01 09:25:09 np0005604215.localdomain sudo[232961]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:09 np0005604215.localdomain python3.9[232963]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:25:09 np0005604215.localdomain sudo[232961]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31764 DF PROTO=TCP SPT=42036 DPT=9100 SEQ=3105164616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E7B4D0000000001030307) 
Feb 01 09:25:09 np0005604215.localdomain sudo[233071]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lnfpgbggnuokckeenjdreqxtqadhysth ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937909.568716-1626-161828792714118/AnsiballZ_stat.py
Feb 01 09:25:09 np0005604215.localdomain sudo[233071]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:10 np0005604215.localdomain python3.9[233073]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:25:10 np0005604215.localdomain sudo[233071]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:10 np0005604215.localdomain sudo[233128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qaubzdcrmbnmeptizmudajqvnkhccpmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937909.568716-1626-161828792714118/AnsiballZ_file.py
Feb 01 09:25:10 np0005604215.localdomain sudo[233128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:11 np0005604215.localdomain python3.9[233130]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.3wyxtmlc recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:11 np0005604215.localdomain sudo[233128]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:11 np0005604215.localdomain python3.9[233238]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/node_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36087 DF PROTO=TCP SPT=50782 DPT=9101 SEQ=1762398148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E888D0000000001030307) 
Feb 01 09:25:13 np0005604215.localdomain sudo[233540]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjiptrtirlqejojvfvuikqwmnuertwgr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937913.5714555-1737-107169771812031/AnsiballZ_container_config_data.py
Feb 01 09:25:13 np0005604215.localdomain sudo[233540]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:14 np0005604215.localdomain python3.9[233542]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/node_exporter config_pattern=*.json debug=False
Feb 01 09:25:14 np0005604215.localdomain sudo[233540]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:14 np0005604215.localdomain sudo[233650]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gttldgugvhjqpbytrkwpxwkdwmcoosbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937914.5895727-1770-141616844161344/AnsiballZ_container_config_hash.py
Feb 01 09:25:14 np0005604215.localdomain sudo[233650]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:15 np0005604215.localdomain python3.9[233652]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:25:15 np0005604215.localdomain sudo[233650]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50611 DF PROTO=TCP SPT=60164 DPT=9882 SEQ=2683988532 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E910D0000000001030307) 
Feb 01 09:25:15 np0005604215.localdomain sudo[233760]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bonjybopfhlpxbyaabebsjpzmkhmwgqh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937915.5740786-1799-114823264991714/AnsiballZ_edpm_container_manage.py
Feb 01 09:25:15 np0005604215.localdomain sudo[233760]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:16 np0005604215.localdomain python3[233762]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/node_exporter config_id=node_exporter config_overrides={} config_patterns=*.json containers=['node_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:25:16 np0005604215.localdomain podman[233801]: 
Feb 01 09:25:16 np0005604215.localdomain podman[233801]: 2026-02-01 09:25:16.441508984 +0000 UTC m=+0.074803726 container create c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:25:16 np0005604215.localdomain podman[233801]: 2026-02-01 09:25:16.401576458 +0000 UTC m=+0.034871260 image pull  quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c
Feb 01 09:25:16 np0005604215.localdomain python3[233762]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=node_exporter --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy 
--no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Feb 01 09:25:16 np0005604215.localdomain sudo[233760]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:17 np0005604215.localdomain sudo[233946]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaahocqjdrynowvmxjgzvgtakzgyxmbz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937916.8208055-1824-122772583767933/AnsiballZ_stat.py
Feb 01 09:25:17 np0005604215.localdomain sudo[233946]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:17 np0005604215.localdomain python3.9[233948]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:25:17 np0005604215.localdomain sudo[233946]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:17 np0005604215.localdomain sudo[234058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufustvxtdfvmfqmahsnaihptmmlgwiff ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937917.660335-1850-230688294794770/AnsiballZ_file.py
Feb 01 09:25:17 np0005604215.localdomain sudo[234058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:18 np0005604215.localdomain python3.9[234060]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:18 np0005604215.localdomain sudo[234058]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:18 np0005604215.localdomain sudo[234113]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slgkrmuwtdgrgzdldvdeqhrzslairuig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937917.660335-1850-230688294794770/AnsiballZ_stat.py
Feb 01 09:25:18 np0005604215.localdomain sudo[234113]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:25:18 np0005604215.localdomain podman[234116]: 2026-02-01 09:25:18.434491316 +0000 UTC m=+0.080304617 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 01 09:25:18 np0005604215.localdomain python3.9[234115]: ansible-stat Invoked with path=/etc/systemd/system/edpm_node_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:25:18 np0005604215.localdomain podman[234116]: 2026-02-01 09:25:18.513603784 +0000 UTC m=+0.159417075 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible)
Feb 01 09:25:18 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:25:18 np0005604215.localdomain sudo[234113]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=42944 DF PROTO=TCP SPT=59920 DPT=9102 SEQ=3185429477 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65E9F0D0000000001030307) 
Feb 01 09:25:19 np0005604215.localdomain sudo[234249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwxxcieolhaloczbwbdabvozunmygyyj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937918.5715141-1850-19191274741336/AnsiballZ_copy.py
Feb 01 09:25:19 np0005604215.localdomain sudo[234249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:19 np0005604215.localdomain python3.9[234251]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769937918.5715141-1850-19191274741336/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:19 np0005604215.localdomain sudo[234249]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:19 np0005604215.localdomain sudo[234304]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pbaqrqhawglldvqgbjqqhncuikwbtrrd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937918.5715141-1850-19191274741336/AnsiballZ_systemd.py
Feb 01 09:25:19 np0005604215.localdomain sudo[234304]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:19 np0005604215.localdomain python3.9[234306]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:25:19 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:25:19 np0005604215.localdomain systemd-sysv-generator[234332]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:25:19 np0005604215.localdomain systemd-rc-local-generator[234328]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:25:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:19 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:25:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:20 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:20 np0005604215.localdomain sudo[234304]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:20 np0005604215.localdomain sudo[234343]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:25:20 np0005604215.localdomain sudo[234343]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:25:20 np0005604215.localdomain sudo[234343]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:20 np0005604215.localdomain sudo[234361]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:25:20 np0005604215.localdomain sudo[234361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:25:20 np0005604215.localdomain sudo[234431]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-shjzneqxfgvbkoalgzekjmwdyrdyvipw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937918.5715141-1850-19191274741336/AnsiballZ_systemd.py
Feb 01 09:25:20 np0005604215.localdomain sudo[234431]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:25:20 np0005604215.localdomain podman[234435]: 2026-02-01 09:25:20.639267593 +0000 UTC m=+0.097019435 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:25:20 np0005604215.localdomain podman[234435]: 2026-02-01 09:25:20.649911493 +0000 UTC m=+0.107663325 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Feb 01 09:25:20 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:25:20 np0005604215.localdomain python3.9[234434]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:25:20 np0005604215.localdomain sudo[234361]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:20 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:25:20 np0005604215.localdomain systemd-sysv-generator[234514]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:25:20 np0005604215.localdomain systemd-rc-local-generator[234511]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: Starting node_exporter container...
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:25:21 np0005604215.localdomain podman[234522]: 2026-02-01 09:25:21.393010384 +0000 UTC m=+0.146023281 container init c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.408Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.408Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.409Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.409Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.409Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.409Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=arp
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=bcache
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=bonding
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=btrfs
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=conntrack
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=cpu
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=cpufreq
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=diskstats
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=edac
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=fibrechannel
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=filefd
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=filesystem
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=infiniband
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=ipvs
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=loadavg
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=mdadm
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=meminfo
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=netclass
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=netdev
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=netstat
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=nfs
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.410Z caller=node_exporter.go:117 level=info collector=nfsd
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=node_exporter.go:117 level=info collector=nvme
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=node_exporter.go:117 level=info collector=schedstat
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=node_exporter.go:117 level=info collector=sockstat
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=node_exporter.go:117 level=info collector=softnet
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=node_exporter.go:117 level=info collector=systemd
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=node_exporter.go:117 level=info collector=tapestats
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=node_exporter.go:117 level=info collector=udp_queues
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=node_exporter.go:117 level=info collector=vmstat
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=node_exporter.go:117 level=info collector=xfs
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=node_exporter.go:117 level=info collector=zfs
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Feb 01 09:25:21 np0005604215.localdomain node_exporter[234537]: ts=2026-02-01T09:25:21.411Z caller=tls_config.go:235 level=info msg="TLS is disabled." http2=false address=[::]:9100
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:25:21 np0005604215.localdomain podman[234522]: 2026-02-01 09:25:21.430277588 +0000 UTC m=+0.183290455 container start c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:25:21 np0005604215.localdomain podman[234522]: node_exporter
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: Started node_exporter container.
Feb 01 09:25:21 np0005604215.localdomain sudo[234431]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:21 np0005604215.localdomain podman[234546]: 2026-02-01 09:25:21.515246188 +0000 UTC m=+0.076536900 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=starting, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:25:21 np0005604215.localdomain podman[234546]: 2026-02-01 09:25:21.530521741 +0000 UTC m=+0.091812463 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:25:21 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:25:21 np0005604215.localdomain sudo[234586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:25:21 np0005604215.localdomain sudo[234586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:25:21 np0005604215.localdomain sudo[234586]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:21 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31766 DF PROTO=TCP SPT=42036 DPT=9100 SEQ=3105164616 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65EAB0D0000000001030307) 
Feb 01 09:25:24 np0005604215.localdomain python3.9[234694]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 01 09:25:25 np0005604215.localdomain sudo[234802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmukahqyzalwcoqtuskjsazlhxxkopvs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937924.9825702-1986-235212614626012/AnsiballZ_stat.py
Feb 01 09:25:25 np0005604215.localdomain sudo[234802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:25 np0005604215.localdomain python3.9[234804]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:25:25 np0005604215.localdomain sudo[234802]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36089 DF PROTO=TCP SPT=50782 DPT=9101 SEQ=1762398148 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65EB90D0000000001030307) 
Feb 01 09:25:25 np0005604215.localdomain sudo[234892]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lzdnlfgudzcabjcrogwyysagsmcvvofj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937924.9825702-1986-235212614626012/AnsiballZ_copy.py
Feb 01 09:25:25 np0005604215.localdomain sudo[234892]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:26 np0005604215.localdomain python3.9[234894]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937924.9825702-1986-235212614626012/.source.yaml _original_basename=.yy1a6jmo follow=False checksum=04bc83327d7b52473f6ef5d647c2b6470b901fda backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:26 np0005604215.localdomain sudo[234892]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:26 np0005604215.localdomain sudo[235002]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwwbrkylmtipxrfmimaekiptlxytmdiq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937926.3514736-2032-69131755111773/AnsiballZ_stat.py
Feb 01 09:25:26 np0005604215.localdomain sudo[235002]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:26 np0005604215.localdomain python3.9[235004]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:25:26 np0005604215.localdomain sudo[235002]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:27 np0005604215.localdomain sudo[235090]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zaunsdgjlnjbtzrythbhktyaawhvqokb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937926.3514736-2032-69131755111773/AnsiballZ_copy.py
Feb 01 09:25:27 np0005604215.localdomain sudo[235090]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:27 np0005604215.localdomain python3.9[235092]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937926.3514736-2032-69131755111773/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:25:27 np0005604215.localdomain sudo[235090]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:28 np0005604215.localdomain sudo[235200]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-loouksxhaotucwhiibdmjpeuwqodjygu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937928.1881275-2094-178899939074967/AnsiballZ_file.py
Feb 01 09:25:28 np0005604215.localdomain sudo[235200]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:28 np0005604215.localdomain python3.9[235202]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:28 np0005604215.localdomain sudo[235200]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:29 np0005604215.localdomain sudo[235310]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pklrcvmzcwecvymsmxenbllbwppukxgg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937928.9666722-2117-217222241599432/AnsiballZ_file.py
Feb 01 09:25:29 np0005604215.localdomain sudo[235310]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:29 np0005604215.localdomain python3.9[235312]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:25:29 np0005604215.localdomain sudo[235310]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:29 np0005604215.localdomain sudo[235420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdjqmygqrulopqzdcgnygzhwlppuznjk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937929.6713774-2142-250734104259888/AnsiballZ_stat.py
Feb 01 09:25:29 np0005604215.localdomain sudo[235420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57898 DF PROTO=TCP SPT=58514 DPT=9882 SEQ=632004469 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65ECA7D0000000001030307) 
Feb 01 09:25:30 np0005604215.localdomain python3.9[235422]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:25:30 np0005604215.localdomain sudo[235420]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:30 np0005604215.localdomain sudo[235477]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zllprbdtyfwxhoshscnxhmczrnuhlqig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937929.6713774-2142-250734104259888/AnsiballZ_file.py
Feb 01 09:25:30 np0005604215.localdomain sudo[235477]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:30 np0005604215.localdomain python3.9[235479]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.ij_k1s7j recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:30 np0005604215.localdomain sudo[235477]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57899 DF PROTO=TCP SPT=58514 DPT=9882 SEQ=632004469 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65ECE8D0000000001030307) 
Feb 01 09:25:31 np0005604215.localdomain python3.9[235587]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/podman_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:25:32 np0005604215.localdomain podman[235799]: 2026-02-01 09:25:32.867994463 +0000 UTC m=+0.082812553 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:25:32 np0005604215.localdomain podman[235799]: 2026-02-01 09:25:32.897770526 +0000 UTC m=+0.112588596 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 01 09:25:32 np0005604215.localdomain podman[235799]: unhealthy
Feb 01 09:25:32 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:25:32 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Failed with result 'exit-code'.
Feb 01 09:25:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57900 DF PROTO=TCP SPT=58514 DPT=9882 SEQ=632004469 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65ED68E0000000001030307) 
Feb 01 09:25:33 np0005604215.localdomain sudo[235908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gdyqeurmuhdyacnbxhluoetaxwhiqlwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937933.0963998-2253-200213778168196/AnsiballZ_container_config_data.py
Feb 01 09:25:33 np0005604215.localdomain sudo[235908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:33 np0005604215.localdomain python3.9[235910]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/podman_exporter config_pattern=*.json debug=False
Feb 01 09:25:33 np0005604215.localdomain sudo[235908]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:34 np0005604215.localdomain sudo[236018]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qgqgrsovxnuhgaxggrlcuxrcfwiedxvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937934.2162423-2286-169277851499683/AnsiballZ_container_config_hash.py
Feb 01 09:25:34 np0005604215.localdomain sudo[236018]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:34 np0005604215.localdomain python3.9[236020]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:25:34 np0005604215.localdomain sudo[236018]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:36 np0005604215.localdomain sudo[236128]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cxrafdqcgtxoihtxheiohezhgcvlszmu ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937935.823184-2315-137583925820412/AnsiballZ_edpm_container_manage.py
Feb 01 09:25:36 np0005604215.localdomain sudo[236128]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:36 np0005604215.localdomain python3[236130]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/podman_exporter config_id=podman_exporter config_overrides={} config_patterns=*.json containers=['podman_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:25:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41375 DF PROTO=TCP SPT=55504 DPT=9102 SEQ=2679331588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65EE40D0000000001030307) 
Feb 01 09:25:38 np0005604215.localdomain podman[236144]: 2026-02-01 09:25:36.584367882 +0000 UTC m=+0.043796207 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Feb 01 09:25:38 np0005604215.localdomain podman[236215]: 
Feb 01 09:25:38 np0005604215.localdomain podman[236215]: 2026-02-01 09:25:38.612348767 +0000 UTC m=+0.075586841 container create a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:25:38 np0005604215.localdomain podman[236215]: 2026-02-01 09:25:38.574236826 +0000 UTC m=+0.037474940 image pull  quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Feb 01 09:25:38 np0005604215.localdomain python3[236130]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env CONTAINER_HOST=unix:///run/podman/podman.sock --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=podman_exporter --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd
Feb 01 09:25:38 np0005604215.localdomain sudo[236128]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:39 np0005604215.localdomain sudo[236360]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hohnukoywruzgwharttsdxdpvbvmrthe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937939.1092107-2340-167378571283819/AnsiballZ_stat.py
Feb 01 09:25:39 np0005604215.localdomain sudo[236360]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:39 np0005604215.localdomain python3.9[236362]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:25:39 np0005604215.localdomain sudo[236360]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25130 DF PROTO=TCP SPT=40726 DPT=9100 SEQ=1859316989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65EF08E0000000001030307) 
Feb 01 09:25:40 np0005604215.localdomain sudo[236472]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlhtgmkmwabjjkvfrzcthotykpqwedhw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937939.9201643-2366-253029503816251/AnsiballZ_file.py
Feb 01 09:25:40 np0005604215.localdomain sudo[236472]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:40 np0005604215.localdomain python3.9[236474]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:40 np0005604215.localdomain sudo[236472]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:40 np0005604215.localdomain sudo[236527]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qmyulmalklpbbfanogwcowfpfzsukqoh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937939.9201643-2366-253029503816251/AnsiballZ_stat.py
Feb 01 09:25:40 np0005604215.localdomain sudo[236527]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:40 np0005604215.localdomain python3.9[236529]: ansible-stat Invoked with path=/etc/systemd/system/edpm_podman_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:25:40 np0005604215.localdomain sudo[236527]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:41 np0005604215.localdomain sudo[236636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-imruplyefitpnhrpgjiudounteijadyt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937940.9208505-2366-45396584602564/AnsiballZ_copy.py
Feb 01 09:25:41 np0005604215.localdomain sudo[236636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:41 np0005604215.localdomain python3.9[236638]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769937940.9208505-2366-45396584602564/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:41 np0005604215.localdomain sudo[236636]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:25:41.742 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:25:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:25:41.744 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:25:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:25:41.744 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:25:41 np0005604215.localdomain sudo[236691]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mcuwsltkrzfsvzugciybohmakcfaildz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937940.9208505-2366-45396584602564/AnsiballZ_systemd.py
Feb 01 09:25:41 np0005604215.localdomain sudo[236691]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:42 np0005604215.localdomain sshd[236694]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:25:42 np0005604215.localdomain python3.9[236693]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:25:42 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:25:42 np0005604215.localdomain systemd-sysv-generator[236722]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:25:42 np0005604215.localdomain systemd-rc-local-generator[236716]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:25:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:25:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:42 np0005604215.localdomain sudo[236691]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:42 np0005604215.localdomain sshd[236694]: Invalid user natalia from 85.206.171.113 port 40552
Feb 01 09:25:42 np0005604215.localdomain sshd[236694]: Received disconnect from 85.206.171.113 port 40552:11: Bye Bye [preauth]
Feb 01 09:25:42 np0005604215.localdomain sshd[236694]: Disconnected from invalid user natalia 85.206.171.113 port 40552 [preauth]
Feb 01 09:25:42 np0005604215.localdomain sudo[236784]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cfpkizgulscylerceyzvbafgdzxqylqw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937940.9208505-2366-45396584602564/AnsiballZ_systemd.py
Feb 01 09:25:42 np0005604215.localdomain sudo[236784]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45969 DF PROTO=TCP SPT=36562 DPT=9101 SEQ=1052668187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65EFDCD0000000001030307) 
Feb 01 09:25:43 np0005604215.localdomain python3.9[236786]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:25:43 np0005604215.localdomain systemd-sysv-generator[236814]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:25:43 np0005604215.localdomain systemd-rc-local-generator[236811]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: Starting podman_exporter container...
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:25:43 np0005604215.localdomain podman[236827]: 2026-02-01 09:25:43.792652869 +0000 UTC m=+0.155313309 container init a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:25:43 np0005604215.localdomain podman_exporter[236841]: ts=2026-02-01T09:25:43.814Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Feb 01 09:25:43 np0005604215.localdomain podman_exporter[236841]: ts=2026-02-01T09:25:43.814Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Feb 01 09:25:43 np0005604215.localdomain podman_exporter[236841]: ts=2026-02-01T09:25:43.814Z caller=handler.go:94 level=info msg="enabled collectors"
Feb 01 09:25:43 np0005604215.localdomain podman_exporter[236841]: ts=2026-02-01T09:25:43.814Z caller=handler.go:105 level=info collector=container
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: Starting Podman API Service...
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: Started Podman API Service.
Feb 01 09:25:43 np0005604215.localdomain podman[236827]: 2026-02-01 09:25:43.850567261 +0000 UTC m=+0.213227711 container start a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:25:43 np0005604215.localdomain podman[236827]: podman_exporter
Feb 01 09:25:43 np0005604215.localdomain systemd[1]: Started podman_exporter container.
Feb 01 09:25:43 np0005604215.localdomain sudo[236784]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:43 np0005604215.localdomain podman[236852]: time="2026-02-01T09:25:43Z" level=info msg="/usr/bin/podman filtering at log level info"
Feb 01 09:25:43 np0005604215.localdomain podman[236852]: time="2026-02-01T09:25:43Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Feb 01 09:25:43 np0005604215.localdomain podman[236852]: time="2026-02-01T09:25:43Z" level=info msg="Setting parallel job count to 25"
Feb 01 09:25:43 np0005604215.localdomain podman[236852]: time="2026-02-01T09:25:43Z" level=info msg="Using systemd socket activation to determine API endpoint"
Feb 01 09:25:43 np0005604215.localdomain podman[236852]: time="2026-02-01T09:25:43Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"/run/podman/podman.sock\""
Feb 01 09:25:43 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:25:43 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Feb 01 09:25:43 np0005604215.localdomain podman[236852]: time="2026-02-01T09:25:43Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:25:44 np0005604215.localdomain podman[236851]: 2026-02-01 09:25:43.948893645 +0000 UTC m=+0.102079691 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:25:44 np0005604215.localdomain podman[236851]: 2026-02-01 09:25:44.029969695 +0000 UTC m=+0.183155721 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:25:44 np0005604215.localdomain podman[236851]: unhealthy
Feb 01 09:25:44 np0005604215.localdomain python3.9[236998]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 01 09:25:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57902 DF PROTO=TCP SPT=58514 DPT=9882 SEQ=632004469 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F070D0000000001030307) 
Feb 01 09:25:45 np0005604215.localdomain sudo[237106]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szjvwxwmlctuolpmprmjcwhmmyfccmpd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937945.667739-2502-145822862782350/AnsiballZ_stat.py
Feb 01 09:25:45 np0005604215.localdomain sudo[237106]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 01 09:25:46 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 01 09:25:46 np0005604215.localdomain python3.9[237108]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:25:46 np0005604215.localdomain sudo[237106]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:46 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 01 09:25:46 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:25:46 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Failed with result 'exit-code'.
Feb 01 09:25:46 np0005604215.localdomain sudo[237196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pnrjlnxogfukucqmofyjfhpqkbzhhdmf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937945.667739-2502-145822862782350/AnsiballZ_copy.py
Feb 01 09:25:46 np0005604215.localdomain sudo[237196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:46 np0005604215.localdomain python3.9[237198]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937945.667739-2502-145822862782350/.source.yaml _original_basename=.m2z58c7c follow=False checksum=46a7fe50889260457a5874849c068d8814e58990 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:46 np0005604215.localdomain sudo[237196]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:47 np0005604215.localdomain sudo[237306]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cmwvejbldhlamyxysgyahzpxzotdinrm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937946.8806725-2547-41141610643947/AnsiballZ_stat.py
Feb 01 09:25:47 np0005604215.localdomain sudo[237306]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:47 np0005604215.localdomain python3.9[237308]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:25:47 np0005604215.localdomain sudo[237306]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:47 np0005604215.localdomain sudo[237394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eemxuesdxeblujkheviywactfymnahlt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937946.8806725-2547-41141610643947/AnsiballZ_copy.py
Feb 01 09:25:47 np0005604215.localdomain sudo[237394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 01 09:25:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 01 09:25:47 np0005604215.localdomain python3.9[237396]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769937946.8806725-2547-41141610643947/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:25:47 np0005604215.localdomain sudo[237394]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 01 09:25:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 01 09:25:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:25:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 01 09:25:48 np0005604215.localdomain podman[237432]: 2026-02-01 09:25:48.867522957 +0000 UTC m=+0.079393208 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:25:48 np0005604215.localdomain podman[237432]: 2026-02-01 09:25:48.957690199 +0000 UTC m=+0.169560460 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller)
Feb 01 09:25:49 np0005604215.localdomain sudo[237528]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jkasiwbxwencpjhollmskocxiefwdiuc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937948.7608948-2610-16157229343880/AnsiballZ_file.py
Feb 01 09:25:49 np0005604215.localdomain sudo[237528]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=41377 DF PROTO=TCP SPT=55504 DPT=9102 SEQ=2679331588 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F150D0000000001030307) 
Feb 01 09:25:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 01 09:25:49 np0005604215.localdomain python3.9[237530]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 01 09:25:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 01 09:25:49 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:25:49 np0005604215.localdomain sudo[237528]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:50 np0005604215.localdomain sudo[237638]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wobntjxbmmcyebmfhtswtdpqdoxgmtzt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937950.42561-2634-71016466091939/AnsiballZ_file.py
Feb 01 09:25:50 np0005604215.localdomain sudo[237638]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:25:50 np0005604215.localdomain podman[237641]: 2026-02-01 09:25:50.834366999 +0000 UTC m=+0.093500235 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 01 09:25:50 np0005604215.localdomain podman[237641]: 2026-02-01 09:25:50.867776463 +0000 UTC m=+0.126909689 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:25:50 np0005604215.localdomain python3.9[237640]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:25:50 np0005604215.localdomain sudo[237638]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:51 np0005604215.localdomain sudo[237765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bnqnhjsotkepomtqyejpizaqchooyrbd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937951.1926882-2657-30290040139551/AnsiballZ_stat.py
Feb 01 09:25:51 np0005604215.localdomain sudo[237765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 01 09:25:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-a02df9e1e28dd9ec9663cf49666c784b8876dc545b7721fca6e88de98c1c0890-merged.mount: Deactivated successfully.
Feb 01 09:25:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:25:51 np0005604215.localdomain python3.9[237767]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:25:51 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:25:51 np0005604215.localdomain sudo[237765]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:51 np0005604215.localdomain podman[237768]: 2026-02-01 09:25:51.764349296 +0000 UTC m=+0.139611962 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:25:51 np0005604215.localdomain podman[237768]: 2026-02-01 09:25:51.773759648 +0000 UTC m=+0.149022374 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:25:51 np0005604215.localdomain sudo[237843]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-olpzjvhmwizjzamgqbkgbdyscaszvpki ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937951.1926882-2657-30290040139551/AnsiballZ_file.py
Feb 01 09:25:51 np0005604215.localdomain sudo[237843]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:52 np0005604215.localdomain python3.9[237845]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/var/lib/kolla/config_files/ceilometer_agent_compute.json _original_basename=.xv4bchj9 recurse=False state=file path=/var/lib/kolla/config_files/ceilometer_agent_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=25132 DF PROTO=TCP SPT=40726 DPT=9100 SEQ=1859316989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F210E0000000001030307) 
Feb 01 09:25:52 np0005604215.localdomain sudo[237843]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:53 np0005604215.localdomain python3.9[237953]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:25:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:25:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 01 09:25:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 01 09:25:54 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:25:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45971 DF PROTO=TCP SPT=36562 DPT=9101 SEQ=1052668187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F2D0D0000000001030307) 
Feb 01 09:25:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:25:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:25:56 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:25:56 np0005604215.localdomain sudo[238255]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgxrtdtgnlhjtahzeakieylrjirbvomc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937956.2760718-2769-222369033829951/AnsiballZ_container_config_data.py
Feb 01 09:25:56 np0005604215.localdomain sudo[238255]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:56 np0005604215.localdomain python3.9[238257]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_pattern=*.json debug=False
Feb 01 09:25:56 np0005604215.localdomain sudo[238255]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:56 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:25:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:25:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:25:57 np0005604215.localdomain sudo[238365]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqdypxcxrmivztkxcvmbwzfzfcieghgu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937957.3262875-2802-142458267743241/AnsiballZ_container_config_hash.py
Feb 01 09:25:57 np0005604215.localdomain sudo[238365]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:25:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:25:57 np0005604215.localdomain python3.9[238367]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:25:57 np0005604215.localdomain sudo[238365]: pam_unix(sudo:session): session closed for user root
Feb 01 09:25:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:25:58 np0005604215.localdomain sudo[238475]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-syknfaafzxpfscgarjkjugzsbxjyxbkz ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769937958.3029234-2831-274809639784573/AnsiballZ_edpm_container_manage.py
Feb 01 09:25:58 np0005604215.localdomain sudo[238475]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:25:58 np0005604215.localdomain python3[238477]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/openstack_network_exporter config_id=openstack_network_exporter config_overrides={} config_patterns=*.json containers=['openstack_network_exporter'] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.256 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.257 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.282 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.283 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.283 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.284 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.284 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.284 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.307 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.307 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.308 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.308 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.309 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.767 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.925 225589 WARNING nova.virt.libvirt.driver [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.927 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=13279MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.927 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:25:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:25:59.928 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:26:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:00.007 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:26:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:00.008 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:26:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44330 DF PROTO=TCP SPT=40362 DPT=9882 SEQ=4143488162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F3FAC0000000001030307) 
Feb 01 09:26:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:00.055 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:26:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:00.511 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:26:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:00.516 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:26:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:00.536 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:26:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:00.538 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:26:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:00.538 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:26:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 01 09:26:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-2e81b03279955a60a1adecef9798de6e2f56144145c95c44327ebc53e7747a37-merged.mount: Deactivated successfully.
Feb 01 09:26:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44331 DF PROTO=TCP SPT=40362 DPT=9882 SEQ=4143488162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F43CD0000000001030307) 
Feb 01 09:26:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:01.250 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:26:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:01.251 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:26:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:01.252 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:26:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:01.265 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:26:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:01.266 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:26:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:01.266 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:26:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:26:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:26:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44332 DF PROTO=TCP SPT=40362 DPT=9882 SEQ=4143488162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F4BCE0000000001030307) 
Feb 01 09:26:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:26:03 np0005604215.localdomain podman[238548]: 2026-02-01 09:26:03.317899148 +0000 UTC m=+0.282911329 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:26:03 np0005604215.localdomain podman[238548]: 2026-02-01 09:26:03.351546049 +0000 UTC m=+0.316558240 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Feb 01 09:26:03 np0005604215.localdomain podman[238548]: unhealthy
Feb 01 09:26:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:05 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:26:05 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Failed with result 'exit-code'.
Feb 01 09:26:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36014 DF PROTO=TCP SPT=42638 DPT=9102 SEQ=244624595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F594D0000000001030307) 
Feb 01 09:26:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:09 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16967 DF PROTO=TCP SPT=41312 DPT=9100 SEQ=1228177203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F65CD0000000001030307) 
Feb 01 09:26:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:10 np0005604215.localdomain podman[238535]: 2026-02-01 09:26:00.878848708 +0000 UTC m=+0.049525304 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d
Feb 01 09:26:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:26:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0bc97c01fd5eabbf2d8e0d9991f11a9043512db93b5f6f0454866fe7414277f1-merged.mount: Deactivated successfully.
Feb 01 09:26:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0bc97c01fd5eabbf2d8e0d9991f11a9043512db93b5f6f0454866fe7414277f1-merged.mount: Deactivated successfully.
Feb 01 09:26:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48342 DF PROTO=TCP SPT=46924 DPT=9101 SEQ=1319681187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F72CE0000000001030307) 
Feb 01 09:26:13 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:13 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 01 09:26:13 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:13 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:14 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:14 np0005604215.localdomain podman[238624]: 2026-02-01 09:26:13.47235719 +0000 UTC m=+0.044553397 image pull  quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d
Feb 01 09:26:14 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:14 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 01 09:26:15 np0005604215.localdomain podman[238624]: 
Feb 01 09:26:15 np0005604215.localdomain podman[238624]: 2026-02-01 09:26:15.060420731 +0000 UTC m=+1.632616908 container create 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, release=1769056855, version=9.7, managed_by=edpm_ansible, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter)
Feb 01 09:26:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44334 DF PROTO=TCP SPT=40362 DPT=9882 SEQ=4143488162 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F7B0E0000000001030307) 
Feb 01 09:26:15 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-d2793eb0d727691e97e5e2f52ec5e9822efebe0b6bf32e0fb26a5897fd53d53c-merged.mount: Deactivated successfully.
Feb 01 09:26:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:26:17 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:17 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 01 09:26:17 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 01 09:26:17 np0005604215.localdomain python3[238477]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --env OS_ENDPOINT_TYPE=internal --env EDPM_CONFIG_HASH=df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8 --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=openstack_network_exporter --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d
Feb 01 09:26:17 np0005604215.localdomain podman[238637]: 2026-02-01 09:26:17.52762142 +0000 UTC m=+0.738457268 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:26:17 np0005604215.localdomain podman[238637]: 2026-02-01 09:26:17.564632718 +0000 UTC m=+0.775468516 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:26:17 np0005604215.localdomain podman[238637]: unhealthy
Feb 01 09:26:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36016 DF PROTO=TCP SPT=42638 DPT=9102 SEQ=244624595 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F890D0000000001030307) 
Feb 01 09:26:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:26:19 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:26:19 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Failed with result 'exit-code'.
Feb 01 09:26:19 np0005604215.localdomain podman[238668]: 2026-02-01 09:26:19.48415616 +0000 UTC m=+0.081945488 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Feb 01 09:26:19 np0005604215.localdomain podman[238668]: 2026-02-01 09:26:19.577350498 +0000 UTC m=+0.175139826 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:26:20 np0005604215.localdomain systemd[1]: tmp-crun.IBqLZg.mount: Deactivated successfully.
Feb 01 09:26:20 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:20 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:20 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:26:20 np0005604215.localdomain sudo[238475]: pam_unix(sudo:session): session closed for user root
Feb 01 09:26:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:21 np0005604215.localdomain sudo[238745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:26:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:26:21 np0005604215.localdomain sudo[238745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:26:21 np0005604215.localdomain sudo[238745]: pam_unix(sudo:session): session closed for user root
Feb 01 09:26:21 np0005604215.localdomain sudo[238796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:26:21 np0005604215.localdomain sudo[238796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:26:21 np0005604215.localdomain podman[238779]: 2026-02-01 09:26:21.858988698 +0000 UTC m=+0.093804168 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 01 09:26:21 np0005604215.localdomain podman[238779]: 2026-02-01 09:26:21.863121725 +0000 UTC m=+0.097937225 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:26:21 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=16969 DF PROTO=TCP SPT=41312 DPT=9100 SEQ=1228177203 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65F950D0000000001030307) 
Feb 01 09:26:21 np0005604215.localdomain sudo[238862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aldaobkzcgkcbaahbumnmkihkkphisbs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937981.643887-2855-73309481194589/AnsiballZ_stat.py
Feb 01 09:26:21 np0005604215.localdomain sudo[238862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:26:22 np0005604215.localdomain python3.9[238864]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:26:22 np0005604215.localdomain sudo[238862]: pam_unix(sudo:session): session closed for user root
Feb 01 09:26:23 np0005604215.localdomain sudo[238984]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sbxmsghnxciqejxfghzszuduqxmlnqdr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937983.3187711-2882-194548726206730/AnsiballZ_file.py
Feb 01 09:26:23 np0005604215.localdomain sudo[238984]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:26:23 np0005604215.localdomain python3.9[238986]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:26:23 np0005604215.localdomain sudo[238984]: pam_unix(sudo:session): session closed for user root
Feb 01 09:26:23 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 01 09:26:23 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-8fb1968646de61e5d6c5b7938dce54da276edc06f0bc75651b588722ba09cba1-merged.mount: Deactivated successfully.
Feb 01 09:26:24 np0005604215.localdomain sudo[239039]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mgadmdgilwjoxbuyefepjcdaaqveleik ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937983.3187711-2882-194548726206730/AnsiballZ_stat.py
Feb 01 09:26:24 np0005604215.localdomain sudo[239039]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:26:24 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:26:24 np0005604215.localdomain python3.9[239041]: ansible-stat Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:26:24 np0005604215.localdomain sudo[239039]: pam_unix(sudo:session): session closed for user root
Feb 01 09:26:24 np0005604215.localdomain sudo[239148]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpiairlgticclaqzzxyiyqszuhhezjch ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937984.3007665-2882-210085300833454/AnsiballZ_copy.py
Feb 01 09:26:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:26:24 np0005604215.localdomain sudo[239148]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:26:24 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4e140ff8bd9a23afd7615f9a56d0521b539df31404142c6e9fb61e24e5e6cdd4-merged.mount: Deactivated successfully.
Feb 01 09:26:24 np0005604215.localdomain python3.9[239151]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769937984.3007665-2882-210085300833454/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:26:24 np0005604215.localdomain sudo[239148]: pam_unix(sudo:session): session closed for user root
Feb 01 09:26:24 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 01 09:26:25 np0005604215.localdomain podman[239150]: 2026-02-01 09:26:25.096918396 +0000 UTC m=+0.328192023 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:26:25 np0005604215.localdomain podman[239150]: 2026-02-01 09:26:25.18059597 +0000 UTC m=+0.411869547 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:26:25 np0005604215.localdomain sudo[239238]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iasmqlvewesmzyffbujnxpvdidzotpux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937984.3007665-2882-210085300833454/AnsiballZ_systemd.py
Feb 01 09:26:25 np0005604215.localdomain sudo[239238]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:26:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=48344 DF PROTO=TCP SPT=46924 DPT=9101 SEQ=1319681187 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65FA30D0000000001030307) 
Feb 01 09:26:25 np0005604215.localdomain python3.9[239240]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:26:25 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:26:25 np0005604215.localdomain systemd-sysv-generator[239268]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:26:25 np0005604215.localdomain systemd-rc-local-generator[239261]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:26:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:26:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:25 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:25 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4e140ff8bd9a23afd7615f9a56d0521b539df31404142c6e9fb61e24e5e6cdd4-merged.mount: Deactivated successfully.
Feb 01 09:26:25 np0005604215.localdomain sudo[239238]: pam_unix(sudo:session): session closed for user root
Feb 01 09:26:26 np0005604215.localdomain sudo[239329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-coqrgcvictcqxeqidqdtkxxfttpakccx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937984.3007665-2882-210085300833454/AnsiballZ_systemd.py
Feb 01 09:26:26 np0005604215.localdomain sudo[239329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:26:26 np0005604215.localdomain python3.9[239331]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:26:26 np0005604215.localdomain systemd-sysv-generator[239365]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:26:26 np0005604215.localdomain systemd-rc-local-generator[239359]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:26:26 np0005604215.localdomain systemd[1]: Starting openstack_network_exporter container...
Feb 01 09:26:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 01 09:26:28 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:26:28 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4af5d24186534ea0cf728f4415b3148bc4b8d7d4884197e22eaaa04c28c8325/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Feb 01 09:26:28 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e4af5d24186534ea0cf728f4415b3148bc4b8d7d4884197e22eaaa04c28c8325/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Feb 01 09:26:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:26:28 np0005604215.localdomain podman[239373]: 2026-02-01 09:26:28.861150547 +0000 UTC m=+1.922153799 container init 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, version=9.7, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git)
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: INFO    09:26:28 main.go:48: registering *bridge.Collector
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: INFO    09:26:28 main.go:48: registering *coverage.Collector
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: INFO    09:26:28 main.go:48: registering *datapath.Collector
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: INFO    09:26:28 main.go:48: registering *iface.Collector
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: INFO    09:26:28 main.go:48: registering *memory.Collector
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: INFO    09:26:28 main.go:55: *ovnnorthd.Collector not registered, metric set not enabled
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: INFO    09:26:28 main.go:48: registering *ovn.Collector
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: INFO    09:26:28 main.go:55: *ovsdbserver.Collector not registered, metric set not enabled
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: INFO    09:26:28 main.go:48: registering *pmd_perf.Collector
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: INFO    09:26:28 main.go:48: registering *pmd_rxq.Collector
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: INFO    09:26:28 main.go:48: registering *vswitch.Collector
Feb 01 09:26:28 np0005604215.localdomain openstack_network_exporter[239388]: NOTICE  09:26:28 main.go:82: listening on http://:9105/metrics
Feb 01 09:26:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:26:28 np0005604215.localdomain podman[239373]: 2026-02-01 09:26:28.897775273 +0000 UTC m=+1.958778495 container start 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, distribution-scope=public, io.openshift.expose-services=, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container)
Feb 01 09:26:28 np0005604215.localdomain podman[239373]: openstack_network_exporter
Feb 01 09:26:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6089 DF PROTO=TCP SPT=58296 DPT=9882 SEQ=718988965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65FB4DD0000000001030307) 
Feb 01 09:26:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:30 np0005604215.localdomain systemd[1]: Started openstack_network_exporter container.
Feb 01 09:26:30 np0005604215.localdomain sudo[239329]: pam_unix(sudo:session): session closed for user root
Feb 01 09:26:30 np0005604215.localdomain podman[239398]: 2026-02-01 09:26:30.81020998 +0000 UTC m=+1.908193969 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=starting, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down 
image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.7, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1769056855, config_id=openstack_network_exporter)
Feb 01 09:26:30 np0005604215.localdomain podman[239398]: 2026-02-01 09:26:30.832738652 +0000 UTC m=+1.930722621 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, version=9.7, container_name=openstack_network_exporter, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc.)
Feb 01 09:26:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6090 DF PROTO=TCP SPT=58296 DPT=9882 SEQ=718988965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65FB8CD0000000001030307) 
Feb 01 09:26:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:31 np0005604215.localdomain python3.9[239526]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 01 09:26:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:31 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:26:32 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:32 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6091 DF PROTO=TCP SPT=58296 DPT=9882 SEQ=718988965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65FC0CD0000000001030307) 
Feb 01 09:26:33 np0005604215.localdomain sudo[239635]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oogzcauleaqyvpesyogtydsatugjlnyo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937992.7768538-3017-81001878009641/AnsiballZ_stat.py
Feb 01 09:26:33 np0005604215.localdomain sudo[239635]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:26:33 np0005604215.localdomain python3.9[239637]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:26:33 np0005604215.localdomain sudo[239635]: pam_unix(sudo:session): session closed for user root
Feb 01 09:26:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:33 np0005604215.localdomain sudo[239725]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-loqfxkwhfblhwyolldnubqkprotfcvjs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937992.7768538-3017-81001878009641/AnsiballZ_copy.py
Feb 01 09:26:33 np0005604215.localdomain sudo[239725]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:26:33 np0005604215.localdomain python3.9[239727]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769937992.7768538-3017-81001878009641/.source.yaml _original_basename=.xsyneil2 follow=False checksum=5cb1253a55f76cdc87f63bf6441ba44eab00e487 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:26:33 np0005604215.localdomain sudo[239725]: pam_unix(sudo:session): session closed for user root
Feb 01 09:26:34 np0005604215.localdomain sudo[239835]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mksrvyyjrxgxraddcqdlouklcmuvjsmy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769937994.1733854-3063-214819823474943/AnsiballZ_find.py
Feb 01 09:26:34 np0005604215.localdomain sudo[239835]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:26:34 np0005604215.localdomain python3.9[239837]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 01 09:26:34 np0005604215.localdomain sudo[239835]: pam_unix(sudo:session): session closed for user root
Feb 01 09:26:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 01 09:26:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f7de42bd2ef28ab6d43ca2881ed0bac026c1f46d7bf355b9a366b5c9ec93a4c0-merged.mount: Deactivated successfully.
Feb 01 09:26:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f7de42bd2ef28ab6d43ca2881ed0bac026c1f46d7bf355b9a366b5c9ec93a4c0-merged.mount: Deactivated successfully.
Feb 01 09:26:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:26:35 np0005604215.localdomain podman[239855]: 2026-02-01 09:26:35.863041466 +0000 UTC m=+0.071340729 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, 
container_name=ceilometer_agent_compute, tcib_managed=true)
Feb 01 09:26:35 np0005604215.localdomain podman[239855]: 2026-02-01 09:26:35.892792035 +0000 UTC m=+0.101091278 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS)
Feb 01 09:26:35 np0005604215.localdomain podman[239855]: unhealthy
Feb 01 09:26:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55719 DF PROTO=TCP SPT=38118 DPT=9102 SEQ=2308156601 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65FCE8E0000000001030307) 
Feb 01 09:26:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 01 09:26:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 01 09:26:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 01 09:26:37 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:26:37 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Failed with result 'exit-code'.
Feb 01 09:26:39 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:39 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 01 09:26:39 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 01 09:26:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51275 DF PROTO=TCP SPT=53180 DPT=9100 SEQ=2190293042 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65FDB0D0000000001030307) 
Feb 01 09:26:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:26:41.743 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:26:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:26:41.744 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:26:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:26:41.744 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:26:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39522 DF PROTO=TCP SPT=33450 DPT=9101 SEQ=438668524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65FE80D0000000001030307) 
Feb 01 09:26:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 01 09:26:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b2933f5278c3e34f217deba3df65be56d0deb9a26e06617aee0cea81e2014367-merged.mount: Deactivated successfully.
Feb 01 09:26:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b2933f5278c3e34f217deba3df65be56d0deb9a26e06617aee0cea81e2014367-merged.mount: Deactivated successfully.
Feb 01 09:26:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 01 09:26:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=6093 DF PROTO=TCP SPT=58296 DPT=9882 SEQ=718988965 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65FF10D0000000001030307) 
Feb 01 09:26:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:46 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 01 09:26:46 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-dccddcf2ee10c6337ca051732a7fa75bba9a539cf44a7a4bc7d14c12cf4b5db0-merged.mount: Deactivated successfully.
Feb 01 09:26:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 01 09:26:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 01 09:26:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55721 DF PROTO=TCP SPT=38118 DPT=9102 SEQ=2308156601 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA65FFF0D0000000001030307) 
Feb 01 09:26:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 01 09:26:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-45fdf082ac490d270b07fd0f17cf89cd8bf1d13e0d604cb75e37ccaf54fab194-merged.mount: Deactivated successfully.
Feb 01 09:26:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-45fdf082ac490d270b07fd0f17cf89cd8bf1d13e0d604cb75e37ccaf54fab194-merged.mount: Deactivated successfully.
Feb 01 09:26:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:26:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:26:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 01 09:26:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 01 09:26:51 np0005604215.localdomain podman[239873]: 2026-02-01 09:26:51.716467196 +0000 UTC m=+2.190691808 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=starting, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:26:51 np0005604215.localdomain podman[239873]: 2026-02-01 09:26:51.75031571 +0000 UTC m=+2.224540372 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:26:51 np0005604215.localdomain podman[239873]: unhealthy
Feb 01 09:26:51 np0005604215.localdomain podman[239884]: 2026-02-01 09:26:51.75821274 +0000 UTC m=+0.972218051 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:26:51 np0005604215.localdomain podman[239884]: 2026-02-01 09:26:51.824159252 +0000 UTC m=+1.038164613 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 01 09:26:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51277 DF PROTO=TCP SPT=53180 DPT=9100 SEQ=2190293042 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6600B0D0000000001030307) 
Feb 01 09:26:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:53 np0005604215.localdomain rsyslogd[760]: imjournal: 2179 messages lost due to rate-limiting (20000 allowed within 600 seconds)
Feb 01 09:26:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:26:53 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:26:53 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:26:53 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Failed with result 'exit-code'.
Feb 01 09:26:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:26:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:26:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:54 np0005604215.localdomain podman[239932]: 2026-02-01 09:26:54.766897152 +0000 UTC m=+0.086776647 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 01 09:26:54 np0005604215.localdomain podman[239932]: 2026-02-01 09:26:54.771628508 +0000 UTC m=+0.091508053 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb 01 09:26:54 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:26:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39524 DF PROTO=TCP SPT=33450 DPT=9101 SEQ=438668524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660190D0000000001030307) 
Feb 01 09:26:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:26:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:26:56 np0005604215.localdomain podman[239949]: 2026-02-01 09:26:56.620239406 +0000 UTC m=+0.081493533 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:26:56 np0005604215.localdomain podman[239949]: 2026-02-01 09:26:56.652184997 +0000 UTC m=+0.113439114 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:26:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:57.996 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:26:58 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0e4bbbcc3a308b062cd809f5d981a575292a522b3c6697e4ac7d70789e33f207-merged.mount: Deactivated successfully.
Feb 01 09:26:58 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1d315715373fb2ed69473b661022a322c730f5613516f294042e6eac2843e9be-merged.mount: Deactivated successfully.
Feb 01 09:26:58 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:26:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:58.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:26:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:58.996 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:26:59 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:26:59 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 01 09:26:59 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 01 09:26:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:26:59.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:27:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27947 DF PROTO=TCP SPT=54428 DPT=9882 SEQ=2863979396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6602A0D0000000001030307) 
Feb 01 09:27:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:00.990 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:27:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:00.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:27:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:00.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:27:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:00.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.019 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.019 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.020 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.020 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.020 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:27:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27948 DF PROTO=TCP SPT=54428 DPT=9882 SEQ=2863979396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6602E0D0000000001030307) 
Feb 01 09:27:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:27:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.465 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:27:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.636 225589 WARNING nova.virt.libvirt.driver [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.637 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=13257MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.637 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.637 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.735 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.735 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:27:01 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:01.766 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:27:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:27:02 np0005604215.localdomain podman[240011]: 2026-02-01 09:27:02.116442005 +0000 UTC m=+0.085683361 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, config_id=openstack_network_exporter, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, release=1769056855, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:27:02 np0005604215.localdomain podman[240011]: 2026-02-01 09:27:02.130579961 +0000 UTC m=+0.099821307 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, io.openshift.tags=minimal rhel9)
Feb 01 09:27:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:02.259 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.493s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:27:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:02.265 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:27:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:02.287 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:27:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:02.290 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:27:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:02.291 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.653s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:27:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 01 09:27:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27949 DF PROTO=TCP SPT=54428 DPT=9882 SEQ=2863979396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660360D0000000001030307) 
Feb 01 09:27:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-311e95bb3bdbe1bb40730cbc80ffa3861fcafc1265b18b49a2e8169fc5d3cbf1-merged.mount: Deactivated successfully.
Feb 01 09:27:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-311e95bb3bdbe1bb40730cbc80ffa3861fcafc1265b18b49a2e8169fc5d3cbf1-merged.mount: Deactivated successfully.
Feb 01 09:27:03 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:27:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:03.291 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:27:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:03.292 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:27:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:03.292 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:27:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:03.311 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:27:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:03.312 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.398 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:27:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:27:04 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:04 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 01 09:27:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:05 np0005604215.localdomain sshd[240033]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:27:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:06 np0005604215.localdomain sshd[240033]: Invalid user media from 85.206.171.113 port 58342
Feb 01 09:27:06 np0005604215.localdomain sshd[240033]: Received disconnect from 85.206.171.113 port 58342:11: Bye Bye [preauth]
Feb 01 09:27:06 np0005604215.localdomain sshd[240033]: Disconnected from invalid user media 85.206.171.113 port 58342 [preauth]
Feb 01 09:27:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32382 DF PROTO=TCP SPT=50226 DPT=9102 SEQ=2308247506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660438D0000000001030307) 
Feb 01 09:27:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:27:07 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0336e79261e1f534d091cad94b9980aafc6b329c3b01bda2d50fcc505860ff11-merged.mount: Deactivated successfully.
Feb 01 09:27:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:27:07 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-179e7ed4ab403439e752a2c426c6db4ca9807018662c061e320fe01562a6e116-merged.mount: Deactivated successfully.
Feb 01 09:27:07 np0005604215.localdomain podman[240035]: 2026-02-01 09:27:07.866273452 +0000 UTC m=+0.101817992 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:27:07 np0005604215.localdomain podman[240035]: 2026-02-01 09:27:07.895830936 +0000 UTC m=+0.131375426 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 01 09:27:07 np0005604215.localdomain podman[240035]: unhealthy
Feb 01 09:27:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62840 DF PROTO=TCP SPT=32930 DPT=9100 SEQ=1332322599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660500D0000000001030307) 
Feb 01 09:27:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:27:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:27:10 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:27:10 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Failed with result 'exit-code'.
Feb 01 09:27:12 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=39525 DF PROTO=TCP SPT=33450 DPT=9101 SEQ=438668524 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660590E0000000001030307) 
Feb 01 09:27:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:27:13 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:13 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:14 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:27:14 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=27951 DF PROTO=TCP SPT=54428 DPT=9882 SEQ=2863979396 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660670D0000000001030307) 
Feb 01 09:27:16 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:27:17 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-901f926467172f87fed8e093a0c623b4edfdf674c0cbe61bc939afde2d57f8c6-merged.mount: Deactivated successfully.
Feb 01 09:27:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32384 DF PROTO=TCP SPT=50226 DPT=9102 SEQ=2308247506 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660730E0000000001030307) 
Feb 01 09:27:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:27:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:27:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:27:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=62842 DF PROTO=TCP SPT=32930 DPT=9100 SEQ=1332322599 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660810E0000000001030307) 
Feb 01 09:27:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:23 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:23 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:27:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:27:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:27:23 np0005604215.localdomain podman[240053]: 2026-02-01 09:27:23.835122487 +0000 UTC m=+0.070626629 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:27:23 np0005604215.localdomain podman[240053]: 2026-02-01 09:27:23.847591974 +0000 UTC m=+0.083096156 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:27:23 np0005604215.localdomain podman[240053]: unhealthy
Feb 01 09:27:23 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:27:23 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Failed with result 'exit-code'.
Feb 01 09:27:23 np0005604215.localdomain podman[240054]: 2026-02-01 09:27:23.848606705 +0000 UTC m=+0.078039039 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_controller)
Feb 01 09:27:23 np0005604215.localdomain podman[240054]: 2026-02-01 09:27:23.928184923 +0000 UTC m=+0.157617267 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:27:24 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:27:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19941 DF PROTO=TCP SPT=44490 DPT=9101 SEQ=2930761123 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6608D0D0000000001030307) 
Feb 01 09:27:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:27:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:27:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-671a12c1b149c45f560a497746a5c06b1baf4bea205bfa54dc10c3d286f5bbce-merged.mount: Deactivated successfully.
Feb 01 09:27:26 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:27:26 np0005604215.localdomain podman[240101]: 2026-02-01 09:27:26.364841983 +0000 UTC m=+0.577753867 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 09:27:26 np0005604215.localdomain podman[240101]: 2026-02-01 09:27:26.398851188 +0000 UTC m=+0.611763122 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 01 09:27:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:27:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:27:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:27:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:27:28 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:27:28 np0005604215.localdomain podman[240119]: 2026-02-01 09:27:28.926987005 +0000 UTC m=+0.297778630 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:27:28 np0005604215.localdomain podman[240119]: 2026-02-01 09:27:28.937743938 +0000 UTC m=+0.308535573 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:27:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5831 DF PROTO=TCP SPT=49530 DPT=9882 SEQ=2860037822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6609F3C0000000001030307) 
Feb 01 09:27:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:27:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:27:30 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:27:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5832 DF PROTO=TCP SPT=49530 DPT=9882 SEQ=2860037822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660A34E0000000001030307) 
Feb 01 09:27:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:32 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:32 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:27:32 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5833 DF PROTO=TCP SPT=49530 DPT=9882 SEQ=2860037822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660AB4D0000000001030307) 
Feb 01 09:27:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:27:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:27:33 np0005604215.localdomain podman[240144]: 2026-02-01 09:27:33.381320792 +0000 UTC m=+0.076514462 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, managed_by=edpm_ansible, container_name=openstack_network_exporter, release=1769056855, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:27:33 np0005604215.localdomain podman[240144]: 2026-02-01 09:27:33.391284361 +0000 UTC m=+0.086478031 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, managed_by=edpm_ansible)
Feb 01 09:27:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:27:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:27:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-50254bf8e87a075d183197f5531e6c0f97888346b53b5d118b5ece2506404cbc-merged.mount: Deactivated successfully.
Feb 01 09:27:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-50254bf8e87a075d183197f5531e6c0f97888346b53b5d118b5ece2506404cbc-merged.mount: Deactivated successfully.
Feb 01 09:27:35 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:27:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44022 DF PROTO=TCP SPT=39798 DPT=9102 SEQ=2139760279 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660B8CD0000000001030307) 
Feb 01 09:27:36 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4e140ff8bd9a23afd7615f9a56d0521b539df31404142c6e9fb61e24e5e6cdd4-merged.mount: Deactivated successfully.
Feb 01 09:27:36 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 01 09:27:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 01 09:27:38 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 01 09:27:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10994 DF PROTO=TCP SPT=35626 DPT=9100 SEQ=1453490941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660C54D0000000001030307) 
Feb 01 09:27:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:27:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:27:40 np0005604215.localdomain podman[240164]: 2026-02-01 09:27:40.876383564 +0000 UTC m=+0.087596666 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 01 09:27:40 np0005604215.localdomain podman[240164]: 2026-02-01 09:27:40.90982413 +0000 UTC m=+0.121037212 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:27:40 np0005604215.localdomain podman[240164]: unhealthy
Feb 01 09:27:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:27:41 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:27:41 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Failed with result 'exit-code'.
Feb 01 09:27:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:27:41.745 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:27:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:27:41.746 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:27:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:27:41.746 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:27:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:27:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:27:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5892 DF PROTO=TCP SPT=41892 DPT=9101 SEQ=840572674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660D28D0000000001030307) 
Feb 01 09:27:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:27:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:27:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:27:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:27:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5835 DF PROTO=TCP SPT=49530 DPT=9882 SEQ=2860037822 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660DB0E0000000001030307) 
Feb 01 09:27:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:27:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-42956910233e56c0615893b331e8357f0bd5264eb11a7b97d46d18517d01f2f9-merged.mount: Deactivated successfully.
Feb 01 09:27:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=44024 DF PROTO=TCP SPT=39798 DPT=9102 SEQ=2139760279 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660E90D0000000001030307) 
Feb 01 09:27:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4e140ff8bd9a23afd7615f9a56d0521b539df31404142c6e9fb61e24e5e6cdd4-merged.mount: Deactivated successfully.
Feb 01 09:27:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 01 09:27:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 01 09:27:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4e140ff8bd9a23afd7615f9a56d0521b539df31404142c6e9fb61e24e5e6cdd4-merged.mount: Deactivated successfully.
Feb 01 09:27:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b15384c0932804d65f1aed603e616839f92c3a386a4220ff9b424e6f3ffa126e-merged.mount: Deactivated successfully.
Feb 01 09:27:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c649efc911c887686c8351fe543502de582148a048396cbc7ad85b29ea075fe6-merged.mount: Deactivated successfully.
Feb 01 09:27:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9-merged.mount: Deactivated successfully.
Feb 01 09:27:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9-merged.mount: Deactivated successfully.
Feb 01 09:27:51 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=10996 DF PROTO=TCP SPT=35626 DPT=9100 SEQ=1453490941 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA660F50D0000000001030307) 
Feb 01 09:27:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 01 09:27:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c649efc911c887686c8351fe543502de582148a048396cbc7ad85b29ea075fe6-merged.mount: Deactivated successfully.
Feb 01 09:27:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c649efc911c887686c8351fe543502de582148a048396cbc7ad85b29ea075fe6-merged.mount: Deactivated successfully.
Feb 01 09:27:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 01 09:27:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:27:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 01 09:27:54 np0005604215.localdomain podman[240182]: 2026-02-01 09:27:54.064507406 +0000 UTC m=+0.082429495 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:27:54 np0005604215.localdomain podman[240182]: 2026-02-01 09:27:54.076631302 +0000 UTC m=+0.094553411 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:27:54 np0005604215.localdomain podman[240182]: unhealthy
Feb 01 09:27:54 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:27:54 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Failed with result 'exit-code'.
Feb 01 09:27:54 np0005604215.localdomain sshd[225883]: Received disconnect from 192.168.122.30 port 46188:11: disconnected by user
Feb 01 09:27:54 np0005604215.localdomain sshd[225883]: Disconnected from user zuul 192.168.122.30 port 46188
Feb 01 09:27:54 np0005604215.localdomain sshd[225880]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:27:54 np0005604215.localdomain systemd[1]: session-55.scope: Deactivated successfully.
Feb 01 09:27:54 np0005604215.localdomain systemd[1]: session-55.scope: Consumed 1min 9.186s CPU time.
Feb 01 09:27:54 np0005604215.localdomain systemd-logind[761]: Session 55 logged out. Waiting for processes to exit.
Feb 01 09:27:54 np0005604215.localdomain systemd-logind[761]: Removed session 55.
Feb 01 09:27:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 01 09:27:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 01 09:27:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 01 09:27:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5894 DF PROTO=TCP SPT=41892 DPT=9101 SEQ=840572674 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661030E0000000001030307) 
Feb 01 09:27:56 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-71578ef0b1e4f969b13e723033957534bc6c3b31bff47c5fdb42a55e43d4cef9-merged.mount: Deactivated successfully.
Feb 01 09:27:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:27:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:57.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:27:58 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc-merged.mount: Deactivated successfully.
Feb 01 09:27:58 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976-merged.mount: Deactivated successfully.
Feb 01 09:27:58 np0005604215.localdomain podman[240206]: 2026-02-01 09:27:58.388869456 +0000 UTC m=+1.970463155 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 01 09:27:58 np0005604215.localdomain podman[240206]: 2026-02-01 09:27:58.492448876 +0000 UTC m=+2.074042635 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:27:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:27:58.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:27:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:28:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55978 DF PROTO=TCP SPT=44164 DPT=9882 SEQ=888582071 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661146D0000000001030307) 
Feb 01 09:28:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe-merged.mount: Deactivated successfully.
Feb 01 09:28:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc-merged.mount: Deactivated successfully.
Feb 01 09:28:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4e4217686394af7a9122b2b81585c3ad5207fe018f230f20d139fff3e54ac3cc-merged.mount: Deactivated successfully.
Feb 01 09:28:00 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:28:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:00.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:00.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:00.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:00.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:28:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:28:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55979 DF PROTO=TCP SPT=44164 DPT=9882 SEQ=888582071 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661188D0000000001030307) 
Feb 01 09:28:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 01 09:28:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe-merged.mount: Deactivated successfully.
Feb 01 09:28:02 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-33f73751efe606c7233470249b676223e1b26b870cc49c3dbfbe2c7691e9f3fe-merged.mount: Deactivated successfully.
Feb 01 09:28:02 np0005604215.localdomain podman[240258]: 2026-02-01 09:28:02.311894445 +0000 UTC m=+1.273740198 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:28:02 np0005604215.localdomain podman[240258]: 2026-02-01 09:28:02.347350824 +0000 UTC m=+1.309196557 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:28:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:02.990 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:02.991 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.019 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.019 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.020 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.039 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.039 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.039 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.060 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.060 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.061 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.061 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.061 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:28:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55980 DF PROTO=TCP SPT=44164 DPT=9882 SEQ=888582071 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661208D0000000001030307) 
Feb 01 09:28:03 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:28:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 01 09:28:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.552 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.491s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:28:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 01 09:28:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 01 09:28:03 np0005604215.localdomain podman[240319]: 2026-02-01 09:28:03.713702233 +0000 UTC m=+0.093370915 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, distribution-scope=public, GIT_BRANCH=main, name=rhceph, io.openshift.expose-services=, release=1764794109, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container)
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.744 225589 WARNING nova.virt.libvirt.driver [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.745 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=13148MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.745 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.745 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.816 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.817 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:28:03 np0005604215.localdomain podman[240319]: 2026-02-01 09:28:03.822830726 +0000 UTC m=+0.202499478 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, name=rhceph, build-date=2025-12-08T17:28:53Z, RELEASE=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, ceph=True, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7)
Feb 01 09:28:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:03.848 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:28:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:04.301 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.453s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:28:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:04.307 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:28:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:04.323 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:28:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:04.325 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:28:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:04.326 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.580s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:28:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-3105551fde90ad87a79816e708b2cc4b7af2f50432ce26b439bbd7707bc89976-merged.mount: Deactivated successfully.
Feb 01 09:28:05 np0005604215.localdomain systemd[1]: tmp-crun.KZtXMK.mount: Deactivated successfully.
Feb 01 09:28:05 np0005604215.localdomain podman[240246]: 2026-02-01 09:28:05.666793687 +0000 UTC m=+6.635262574 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 09:28:05 np0005604215.localdomain podman[240246]: 2026-02-01 09:28:05.702658339 +0000 UTC m=+6.671127236 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:28:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:28:06 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:28:06 np0005604215.localdomain podman[240392]: 2026-02-01 09:28:06.539769514 +0000 UTC m=+0.129565507 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, config_id=openstack_network_exporter, release=1769056855, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 01 09:28:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 01 09:28:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47602 DF PROTO=TCP SPT=37270 DPT=9102 SEQ=3306673345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6612E0D0000000001030307) 
Feb 01 09:28:06 np0005604215.localdomain podman[240392]: 2026-02-01 09:28:06.585533183 +0000 UTC m=+0.175329166 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., architecture=x86_64, config_id=openstack_network_exporter, release=1769056855, version=9.7, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:28:07 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:07 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:07 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:07 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:28:07 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 01 09:28:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-193d63b6dd9579507d9f1518ccbcb97a99c18e05e53fbccdc25e375b68ff02d6-merged.mount: Deactivated successfully.
Feb 01 09:28:08 np0005604215.localdomain sudo[238796]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:08 np0005604215.localdomain sudo[240430]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:28:08 np0005604215.localdomain sudo[240430]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:28:08 np0005604215.localdomain sudo[240430]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:08 np0005604215.localdomain sudo[240448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:28:08 np0005604215.localdomain sudo[240448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:28:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58713 DF PROTO=TCP SPT=49182 DPT=9100 SEQ=2712581907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6613A8D0000000001030307) 
Feb 01 09:28:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:28:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:28:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:28:11 np0005604215.localdomain sudo[240448]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:28:11 np0005604215.localdomain podman[240498]: 2026-02-01 09:28:11.620693322 +0000 UTC m=+0.085507551 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 01 09:28:11 np0005604215.localdomain podman[240498]: 2026-02-01 09:28:11.652465767 +0000 UTC m=+0.117280016 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 01 09:28:11 np0005604215.localdomain podman[240498]: unhealthy
Feb 01 09:28:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:28:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:28:12 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:28:12 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Failed with result 'exit-code'.
Feb 01 09:28:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53483 DF PROTO=TCP SPT=55396 DPT=9101 SEQ=2203481120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661478E0000000001030307) 
Feb 01 09:28:13 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:13 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:13 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:14 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:14 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:15 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=55982 DF PROTO=TCP SPT=44164 DPT=9882 SEQ=888582071 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661510D0000000001030307) 
Feb 01 09:28:17 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:28:17 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bd91fbbf62f7f0af7c33a117d6552a6678d20821e3759b8b2c7a56c46d8f5a7c-merged.mount: Deactivated successfully.
Feb 01 09:28:17 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bd91fbbf62f7f0af7c33a117d6552a6678d20821e3759b8b2c7a56c46d8f5a7c-merged.mount: Deactivated successfully.
Feb 01 09:28:19 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=47604 DF PROTO=TCP SPT=37270 DPT=9102 SEQ=3306673345 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6615F0D0000000001030307) 
Feb 01 09:28:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 01 09:28:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully.
Feb 01 09:28:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully.
Feb 01 09:28:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 01 09:28:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 01 09:28:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=58715 DF PROTO=TCP SPT=49182 DPT=9100 SEQ=2712581907 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6616B0E0000000001030307) 
Feb 01 09:28:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:23 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:23 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:23 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:24 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:24 np0005604215.localdomain sshd[240516]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:28:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:28:24 np0005604215.localdomain podman[240518]: 2026-02-01 09:28:24.631901968 +0000 UTC m=+0.089245340 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:28:24 np0005604215.localdomain podman[240518]: 2026-02-01 09:28:24.644605576 +0000 UTC m=+0.101948948 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:28:24 np0005604215.localdomain podman[240518]: unhealthy
Feb 01 09:28:24 np0005604215.localdomain sshd[240516]: Invalid user storm from 85.206.171.113 port 50824
Feb 01 09:28:24 np0005604215.localdomain sshd[240516]: Received disconnect from 85.206.171.113 port 50824:11: Bye Bye [preauth]
Feb 01 09:28:24 np0005604215.localdomain sshd[240516]: Disconnected from invalid user storm 85.206.171.113 port 50824 [preauth]
Feb 01 09:28:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=53485 DF PROTO=TCP SPT=55396 DPT=9101 SEQ=2203481120 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661770D0000000001030307) 
Feb 01 09:28:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-14738252526b4ecc3e5658790c785cb46cd573b0c30a58499169cca3263ae65c-merged.mount: Deactivated successfully.
Feb 01 09:28:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ef459e28ad8635c7a92e994211ce7b874f14e5a38aca9f947ab317c65716a008-merged.mount: Deactivated successfully.
Feb 01 09:28:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ef459e28ad8635c7a92e994211ce7b874f14e5a38aca9f947ab317c65716a008-merged.mount: Deactivated successfully.
Feb 01 09:28:26 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:28:26 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Failed with result 'exit-code'.
Feb 01 09:28:27 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 01 09:28:27 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully.
Feb 01 09:28:27 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully.
Feb 01 09:28:29 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:29 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 01 09:28:29 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 01 09:28:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29408 DF PROTO=TCP SPT=34674 DPT=9882 SEQ=141345313 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661899D0000000001030307) 
Feb 01 09:28:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:28:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:30 np0005604215.localdomain podman[240542]: 2026-02-01 09:28:30.653372797 +0000 UTC m=+0.086748094 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:28:30 np0005604215.localdomain podman[240542]: 2026-02-01 09:28:30.713114825 +0000 UTC m=+0.146490122 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller)
Feb 01 09:28:30 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:28:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29409 DF PROTO=TCP SPT=34674 DPT=9882 SEQ=141345313 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6618D8D0000000001030307) 
Feb 01 09:28:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:32 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29410 DF PROTO=TCP SPT=34674 DPT=9882 SEQ=141345313 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661958F0000000001030307) 
Feb 01 09:28:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-2830780bc6d16943969f9158fd5036df60ccc26823e26fa259ba0accaa537c16-merged.mount: Deactivated successfully.
Feb 01 09:28:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:28:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-d8c1697d3f9451811eabeba845d2774ca9523a4c1f6255791f262d42dbea547b-merged.mount: Deactivated successfully.
Feb 01 09:28:33 np0005604215.localdomain podman[240567]: 2026-02-01 09:28:33.398837249 +0000 UTC m=+0.066569497 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:28:33 np0005604215.localdomain podman[240567]: 2026-02-01 09:28:33.411750264 +0000 UTC m=+0.079482582 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:28:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-d8c1697d3f9451811eabeba845d2774ca9523a4c1f6255791f262d42dbea547b-merged.mount: Deactivated successfully.
Feb 01 09:28:33 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:28:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:28:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:28:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:28:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:28:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56847 DF PROTO=TCP SPT=58622 DPT=9102 SEQ=4015639268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661A34D0000000001030307) 
Feb 01 09:28:36 np0005604215.localdomain podman[240590]: 2026-02-01 09:28:36.618280146 +0000 UTC m=+0.082902996 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 01 09:28:36 np0005604215.localdomain podman[240590]: 2026-02-01 09:28:36.626660662 +0000 UTC m=+0.091283342 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:28:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:28:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:28:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:28:38 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:28:38 np0005604215.localdomain podman[240607]: 2026-02-01 09:28:38.073241122 +0000 UTC m=+0.423084010 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 01 09:28:38 np0005604215.localdomain podman[240607]: 2026-02-01 09:28:38.086025122 +0000 UTC m=+0.435868020 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.7, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, distribution-scope=public, build-date=2026-01-22T05:09:47Z, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter)
Feb 01 09:28:38 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:38 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:39 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:28:39 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:28:39 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61270 DF PROTO=TCP SPT=55446 DPT=9100 SEQ=332900147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661AFCD0000000001030307) 
Feb 01 09:28:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:28:41.745 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:28:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:28:41.746 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:28:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:28:41.746 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:28:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:28:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-457ad0364d778031a8ec3d2148346ff43fcb0296666a44f62af8c388a01d2e64-merged.mount: Deactivated successfully.
Feb 01 09:28:43 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17949 DF PROTO=TCP SPT=34894 DPT=9101 SEQ=1299868485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661BCCD0000000001030307) 
Feb 01 09:28:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 01 09:28:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:28:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 01 09:28:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 01 09:28:43 np0005604215.localdomain podman[240628]: 2026-02-01 09:28:43.367160331 +0000 UTC m=+0.076303265 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:28:43 np0005604215.localdomain podman[240628]: 2026-02-01 09:28:43.374755942 +0000 UTC m=+0.083898896 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:28:43 np0005604215.localdomain podman[240628]: unhealthy
Feb 01 09:28:43 np0005604215.localdomain sshd[240646]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:28:43 np0005604215.localdomain sshd[240646]: Accepted publickey for zuul from 192.168.122.30 port 37478 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:28:43 np0005604215.localdomain systemd-logind[761]: New session 56 of user zuul.
Feb 01 09:28:43 np0005604215.localdomain systemd[1]: Started Session 56 of User zuul.
Feb 01 09:28:43 np0005604215.localdomain sshd[240646]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:28:44 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:28:44 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Failed with result 'exit-code'.
Feb 01 09:28:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 01 09:28:44 np0005604215.localdomain sudo[240740]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-apnujslgtolhkusdjnkhgjlfmqufcbky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938123.9946246-3484-27942399908182/AnsiballZ_podman_container_info.py
Feb 01 09:28:44 np0005604215.localdomain sudo[240740]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:28:44 np0005604215.localdomain python3.9[240742]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Feb 01 09:28:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29412 DF PROTO=TCP SPT=34674 DPT=9882 SEQ=141345313 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661C50D0000000001030307) 
Feb 01 09:28:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:45 np0005604215.localdomain sudo[240740]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:46 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 01 09:28:46 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-5f62902336a91aed0e6d89cda1611500b3d6fe7b4bddf84b8ce31199c37cfaf6-merged.mount: Deactivated successfully.
Feb 01 09:28:46 np0005604215.localdomain sudo[240863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aqvtxxntxttpftjmgfbsontxhwimhpfp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938125.5786428-3495-136753952563867/AnsiballZ_podman_container_exec.py
Feb 01 09:28:46 np0005604215.localdomain sudo[240863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:28:47 np0005604215.localdomain python3.9[240865]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:28:47 np0005604215.localdomain systemd[1]: Started libpod-conmon-c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.scope.
Feb 01 09:28:47 np0005604215.localdomain podman[240866]: 2026-02-01 09:28:47.179169079 +0000 UTC m=+0.098183754 container exec c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:28:47 np0005604215.localdomain podman[240866]: 2026-02-01 09:28:47.183624466 +0000 UTC m=+0.102639201 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 01 09:28:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 01 09:28:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 01 09:28:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 01 09:28:48 np0005604215.localdomain sudo[240863]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=56849 DF PROTO=TCP SPT=58622 DPT=9102 SEQ=4015639268 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661D30D0000000001030307) 
Feb 01 09:28:49 np0005604215.localdomain sudo[241001]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ybzozfowdpudgcxrtdxzkweaznvjlrux ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938129.4326212-3503-113441296854137/AnsiballZ_podman_container_exec.py
Feb 01 09:28:49 np0005604215.localdomain sudo[241001]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:28:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 01 09:28:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 01 09:28:49 np0005604215.localdomain python3.9[241003]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:28:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992-merged.mount: Deactivated successfully.
Feb 01 09:28:50 np0005604215.localdomain systemd[1]: libpod-conmon-c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.scope: Deactivated successfully.
Feb 01 09:28:50 np0005604215.localdomain systemd[1]: Started libpod-conmon-c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.scope.
Feb 01 09:28:50 np0005604215.localdomain podman[241004]: 2026-02-01 09:28:50.218825858 +0000 UTC m=+0.289163654 container exec c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 01 09:28:50 np0005604215.localdomain podman[241004]: 2026-02-01 09:28:50.250676563 +0000 UTC m=+0.321014349 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, config_id=ovn_controller)
Feb 01 09:28:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 01 09:28:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 01 09:28:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad-merged.mount: Deactivated successfully.
Feb 01 09:28:51 np0005604215.localdomain sudo[241001]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:51 np0005604215.localdomain sudo[241141]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-obssxcyxmfrhuqdtuakgofeoyatkbepb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938131.2715142-3511-57061703016146/AnsiballZ_file.py
Feb 01 09:28:51 np0005604215.localdomain sudo[241141]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:28:51 np0005604215.localdomain systemd[1]: libpod-conmon-c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.scope: Deactivated successfully.
Feb 01 09:28:51 np0005604215.localdomain python3.9[241144]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:28:51 np0005604215.localdomain sudo[241141]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:51 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61272 DF PROTO=TCP SPT=55446 DPT=9100 SEQ=332900147 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661DF0D0000000001030307) 
Feb 01 09:28:52 np0005604215.localdomain sudo[241252]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ccyhhicafgjhpmatpkfipcyzrebwfcby ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938132.0097296-3520-5030412786668/AnsiballZ_podman_container_info.py
Feb 01 09:28:52 np0005604215.localdomain sudo[241252]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:28:52 np0005604215.localdomain python3.9[241254]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Feb 01 09:28:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424-merged.mount: Deactivated successfully.
Feb 01 09:28:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 01 09:28:54 np0005604215.localdomain sudo[241252]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:54 np0005604215.localdomain sudo[241375]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ubizrqjnrvcstqcqqhjggwfctnvvitqb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938134.630788-3528-242942121665485/AnsiballZ_podman_container_exec.py
Feb 01 09:28:54 np0005604215.localdomain sudo[241375]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:28:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:55 np0005604215.localdomain python3.9[241377]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:28:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:55 np0005604215.localdomain systemd[1]: Started libpod-conmon-412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.scope.
Feb 01 09:28:55 np0005604215.localdomain podman[241378]: 2026-02-01 09:28:55.233588259 +0000 UTC m=+0.092504980 container exec 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 09:28:55 np0005604215.localdomain podman[241378]: 2026-02-01 09:28:55.263080472 +0000 UTC m=+0.121997183 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 01 09:28:55 np0005604215.localdomain sudo[241375]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=17951 DF PROTO=TCP SPT=34894 DPT=9101 SEQ=1299868485 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661ED0E0000000001030307) 
Feb 01 09:28:55 np0005604215.localdomain systemd[1]: libpod-conmon-412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.scope: Deactivated successfully.
Feb 01 09:28:55 np0005604215.localdomain sudo[241514]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-edqgvcyqbzuxqpqkhuttedqjdzngvacu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938135.5162997-3536-250590524003416/AnsiballZ_podman_container_exec.py
Feb 01 09:28:55 np0005604215.localdomain sudo[241514]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:28:56 np0005604215.localdomain python3.9[241516]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:28:56 np0005604215.localdomain systemd[1]: Started libpod-conmon-412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.scope.
Feb 01 09:28:56 np0005604215.localdomain podman[241517]: 2026-02-01 09:28:56.149367436 +0000 UTC m=+0.130211223 container exec 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:28:56 np0005604215.localdomain podman[241517]: 2026-02-01 09:28:56.178302891 +0000 UTC m=+0.159146618 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:28:56 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ac04412f5c5a43e8c61c2b8d6c1acf66f67fc19f0d028526d9bdbd1ed0352faf-merged.mount: Deactivated successfully.
Feb 01 09:28:56 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1783ac4e59af83bfa6c705cb913a4e3f5e5d835b34fd8ada82ce7a661d9e5a58-merged.mount: Deactivated successfully.
Feb 01 09:28:56 np0005604215.localdomain sudo[241514]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:28:56 np0005604215.localdomain sudo[241665]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hqrmxeklfpjxmouepvhrkivjaruwzfbw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938136.6082866-3544-52782003941799/AnsiballZ_file.py
Feb 01 09:28:56 np0005604215.localdomain sudo[241665]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:28:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 01 09:28:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 01 09:28:57 np0005604215.localdomain systemd[1]: libpod-conmon-412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.scope: Deactivated successfully.
Feb 01 09:28:57 np0005604215.localdomain podman[241564]: 2026-02-01 09:28:57.136445523 +0000 UTC m=+0.603324752 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:28:57 np0005604215.localdomain python3.9[241667]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:28:57 np0005604215.localdomain podman[241564]: 2026-02-01 09:28:57.172395673 +0000 UTC m=+0.639274842 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:28:57 np0005604215.localdomain podman[241564]: unhealthy
Feb 01 09:28:57 np0005604215.localdomain sudo[241665]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:57 np0005604215.localdomain sudo[241785]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uucaahjaeadphgvqcqyttdllsxoiaknp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938137.3969245-3553-112789691184536/AnsiballZ_podman_container_info.py
Feb 01 09:28:57 np0005604215.localdomain sudo[241785]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:28:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:28:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:28:57 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:28:57 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Failed with result 'exit-code'.
Feb 01 09:28:57 np0005604215.localdomain python3.9[241787]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Feb 01 09:28:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:57.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:57 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:57.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 01 09:28:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:58.016 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 01 09:28:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:58.017 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:58.017 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 01 09:28:58 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:58.045 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:58 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 01 09:28:58 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f747231ffc56e15c128dac75ec633f161eee676530b28d17cb7b8d0be7728054-merged.mount: Deactivated successfully.
Feb 01 09:28:58 np0005604215.localdomain sudo[241785]: pam_unix(sudo:session): session closed for user root
Feb 01 09:28:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:59.060 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:28:59 np0005604215.localdomain sudo[241908]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aufrmnwuxjtayapypyemfiqytbgpcxql ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938139.1639173-3561-194422371771932/AnsiballZ_podman_container_exec.py
Feb 01 09:28:59 np0005604215.localdomain sudo[241908]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:28:59 np0005604215.localdomain python3.9[241910]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:28:59 np0005604215.localdomain systemd[1]: Started libpod-conmon-3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.scope.
Feb 01 09:28:59 np0005604215.localdomain podman[241911]: 2026-02-01 09:28:59.719637482 +0000 UTC m=+0.090611112 container exec 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:28:59 np0005604215.localdomain podman[241911]: 2026-02-01 09:28:59.752786666 +0000 UTC m=+0.123760326 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 01 09:28:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:28:59.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:29:00 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61411 DF PROTO=TCP SPT=51370 DPT=9882 SEQ=1196771716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA661FECD0000000001030307) 
Feb 01 09:29:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 01 09:29:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 01 09:29:00 np0005604215.localdomain sudo[241908]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 01 09:29:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:29:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:00.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:29:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:00.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:29:01 np0005604215.localdomain sudo[242058]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhdhfqprdulusnkdfoiaucvdfemhorzd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938140.6983654-3569-202393238172269/AnsiballZ_podman_container_exec.py
Feb 01 09:29:01 np0005604215.localdomain sudo[242058]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:01 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61412 DF PROTO=TCP SPT=51370 DPT=9882 SEQ=1196771716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66202CD0000000001030307) 
Feb 01 09:29:01 np0005604215.localdomain python3.9[242060]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:29:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 01 09:29:02 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 01 09:29:02 np0005604215.localdomain systemd[1]: libpod-conmon-3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.scope: Deactivated successfully.
Feb 01 09:29:02 np0005604215.localdomain podman[241995]: 2026-02-01 09:29:02.13602011 +0000 UTC m=+1.341815947 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb 01 09:29:02 np0005604215.localdomain podman[241995]: 2026-02-01 09:29:02.182642876 +0000 UTC m=+1.388438653 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller)
Feb 01 09:29:02 np0005604215.localdomain systemd[1]: Started libpod-conmon-3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.scope.
Feb 01 09:29:02 np0005604215.localdomain podman[242061]: 2026-02-01 09:29:02.22200156 +0000 UTC m=+0.947288461 container exec 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 01 09:29:02 np0005604215.localdomain podman[242061]: 2026-02-01 09:29:02.255860915 +0000 UTC m=+0.981147846 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 01 09:29:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:02.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:29:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:02.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:29:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:02.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.015 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.015 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.015 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.016 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.016 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.057 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.057 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.058 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.058 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.058 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:29:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61413 DF PROTO=TCP SPT=51370 DPT=9882 SEQ=1196771716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6620ACE0000000001030307) 
Feb 01 09:29:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:03 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:29:03 np0005604215.localdomain sudo[242058]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.398 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.399 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.400 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.401 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:29:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.496 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.438s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.655 225589 WARNING nova.virt.libvirt.driver [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.656 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=13137MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.656 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.656 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:29:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.795 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.796 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:29:03 np0005604215.localdomain sudo[242242]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ufvolvtthgdqwuhtgkfmfjywybwyvdka ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938143.552001-3577-231894235342602/AnsiballZ_file.py
Feb 01 09:29:03 np0005604215.localdomain sudo[242242]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.879 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 09:29:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:29:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.975 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.975 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:29:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:03.990 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 09:29:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:04.011 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: HW_CPU_X86_BMI,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,HW_CPU_X86_AESNI,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,HW_CPU_X86_ABM,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 09:29:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:04.042 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:29:04 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:04 np0005604215.localdomain systemd[1]: libpod-conmon-3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.scope: Deactivated successfully.
Feb 01 09:29:04 np0005604215.localdomain podman[242212]: 2026-02-01 09:29:04.103038366 +0000 UTC m=+0.317179232 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:29:04 np0005604215.localdomain podman[242212]: 2026-02-01 09:29:04.141653116 +0000 UTC m=+0.355793972 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:29:04 np0005604215.localdomain python3.9[242244]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:04 np0005604215.localdomain sudo[242242]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:04 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:29:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:04.501 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:29:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:04.505 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:29:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:04.528 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:29:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:04.530 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:29:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:04.531 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.874s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:29:05 np0005604215.localdomain sudo[242386]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yltthalfmwmnusszhyloowlrkytkexdw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938145.1958883-3586-50583965741982/AnsiballZ_podman_container_info.py
Feb 01 09:29:05 np0005604215.localdomain sudo[242386]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:05 np0005604215.localdomain python3.9[242388]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Feb 01 09:29:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 01 09:29:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-19867aa9ce07feb42ab4d071eed0ec581b8be5de4a737b08d8913c4970e7b3a5-merged.mount: Deactivated successfully.
Feb 01 09:29:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:06.526 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:29:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64187 DF PROTO=TCP SPT=38200 DPT=9102 SEQ=1392793442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662184D0000000001030307) 
Feb 01 09:29:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 01 09:29:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:29:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 01 09:29:08 np0005604215.localdomain sudo[242386]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:08 np0005604215.localdomain podman[242401]: 2026-02-01 09:29:08.692897363 +0000 UTC m=+0.330400076 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 01 09:29:08 np0005604215.localdomain podman[242401]: 2026-02-01 09:29:08.69835257 +0000 UTC m=+0.335855293 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:29:09 np0005604215.localdomain sudo[242525]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ijxhiafcmnqiusgzvgnobhlmgwpyjidt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938148.9222627-3594-91588356405082/AnsiballZ_podman_container_exec.py
Feb 01 09:29:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:29:09 np0005604215.localdomain sudo[242525]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:09 np0005604215.localdomain python3.9[242528]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:29:09 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18567 DF PROTO=TCP SPT=35012 DPT=9100 SEQ=3267543071 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66224CD0000000001030307) 
Feb 01 09:29:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 01 09:29:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-eae537b18cb4af6ef1d611e84802ac12d948a1ed622870af6f76704805834c9a-merged.mount: Deactivated successfully.
Feb 01 09:29:10 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:29:10 np0005604215.localdomain podman[242527]: 2026-02-01 09:29:10.82609069 +0000 UTC m=+1.538691057 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., version=9.7, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z)
Feb 01 09:29:10 np0005604215.localdomain podman[242527]: 2026-02-01 09:29:10.836087196 +0000 UTC m=+1.548687593 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, release=1769056855, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, 
io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 09:29:10 np0005604215.localdomain systemd[1]: Started libpod-conmon-c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.scope.
Feb 01 09:29:10 np0005604215.localdomain podman[242540]: 2026-02-01 09:29:10.924279392 +0000 UTC m=+1.390896847 container exec c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:29:10 np0005604215.localdomain podman[242540]: 2026-02-01 09:29:10.957757936 +0000 UTC m=+1.424375431 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:29:11 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:11 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:12 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:29:12 np0005604215.localdomain sudo[242525]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:29:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:12 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:12 np0005604215.localdomain systemd[1]: libpod-conmon-c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.scope: Deactivated successfully.
Feb 01 09:29:12 np0005604215.localdomain sudo[242685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xxgdtlazghtflygsunkuqhkkpwzydacc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938152.3089137-3602-89504637674161/AnsiballZ_podman_container_exec.py
Feb 01 09:29:12 np0005604215.localdomain sudo[242685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:13 np0005604215.localdomain python3.9[242687]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:29:13 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19714 DF PROTO=TCP SPT=49910 DPT=9101 SEQ=4115187361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662320D0000000001030307) 
Feb 01 09:29:13 np0005604215.localdomain systemd[1]: Started libpod-conmon-c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.scope.
Feb 01 09:29:13 np0005604215.localdomain podman[242688]: 2026-02-01 09:29:13.191019895 +0000 UTC m=+0.124809509 container exec c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:29:13 np0005604215.localdomain podman[242688]: 2026-02-01 09:29:13.221869268 +0000 UTC m=+0.155658842 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:29:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:29:15 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-40d13af751dd0e47fc8bb889a91a6d655bc2617bd5ab127ac97d8b2c392f6c58-merged.mount: Deactivated successfully.
Feb 01 09:29:15 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-877c65e867b205f11a32fcdb99f229d7cc1aad0815e744014cf57490bce97673-merged.mount: Deactivated successfully.
Feb 01 09:29:15 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=61415 DF PROTO=TCP SPT=51370 DPT=9882 SEQ=1196771716 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6623B0E0000000001030307) 
Feb 01 09:29:15 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-877c65e867b205f11a32fcdb99f229d7cc1aad0815e744014cf57490bce97673-merged.mount: Deactivated successfully.
Feb 01 09:29:15 np0005604215.localdomain sudo[242685]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:15 np0005604215.localdomain podman[242717]: 2026-02-01 09:29:15.571812843 +0000 UTC m=+0.782459000 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 01 09:29:15 np0005604215.localdomain podman[242717]: 2026-02-01 09:29:15.610696663 +0000 UTC m=+0.821342770 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:29:16 np0005604215.localdomain sudo[242839]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ffsvosjecywwknmmsxgirlmrzeceufab ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938155.7136705-3610-227475235994235/AnsiballZ_file.py
Feb 01 09:29:16 np0005604215.localdomain sudo[242839]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:17 np0005604215.localdomain python3.9[242841]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:17 np0005604215.localdomain sudo[242839]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:17 np0005604215.localdomain sudo[242949]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfxrevlzirdidbaocbvzkxazqjfgwfkh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938157.2668939-3619-249408396028483/AnsiballZ_podman_container_info.py
Feb 01 09:29:17 np0005604215.localdomain sudo[242949]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:17 np0005604215.localdomain python3.9[242951]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Feb 01 09:29:17 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:29:17 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:29:18 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:29:18 np0005604215.localdomain systemd[1]: libpod-conmon-c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.scope: Deactivated successfully.
Feb 01 09:29:18 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:29:19 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=64189 DF PROTO=TCP SPT=38200 DPT=9102 SEQ=1392793442 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662490D0000000001030307) 
Feb 01 09:29:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:20 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:29:20 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:29:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:21 np0005604215.localdomain sudo[242949]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:21 np0005604215.localdomain sudo[243074]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kepilcyaohzrprvvlnskmurtuhpifybx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938161.7063994-3627-68969475201490/AnsiballZ_podman_container_exec.py
Feb 01 09:29:21 np0005604215.localdomain sudo[243074]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:29:22 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18569 DF PROTO=TCP SPT=35012 DPT=9100 SEQ=3267543071 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662550D0000000001030307) 
Feb 01 09:29:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:22 np0005604215.localdomain python3.9[243076]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:29:22 np0005604215.localdomain systemd[1]: Started libpod-conmon-a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.scope.
Feb 01 09:29:22 np0005604215.localdomain podman[243077]: 2026-02-01 09:29:22.302877553 +0000 UTC m=+0.100838275 container exec a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:29:22 np0005604215.localdomain podman[243077]: 2026-02-01 09:29:22.331575201 +0000 UTC m=+0.129535883 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:29:22 np0005604215.localdomain sudo[243074]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:23 np0005604215.localdomain sudo[243215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-admbobhztvvtjaqhizsmmarruaebnoea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938162.7813334-3635-137785745729479/AnsiballZ_podman_container_exec.py
Feb 01 09:29:23 np0005604215.localdomain sudo[243215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:23 np0005604215.localdomain python3.9[243217]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:29:24 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:29:25 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-7b9f50aed1094cdf3c8ae90862135d9821bbb7f673296f42b1c4d115dfdd346a-merged.mount: Deactivated successfully.
Feb 01 09:29:25 np0005604215.localdomain systemd[1]: libpod-conmon-a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.scope: Deactivated successfully.
Feb 01 09:29:25 np0005604215.localdomain systemd[1]: Started libpod-conmon-a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.scope.
Feb 01 09:29:25 np0005604215.localdomain podman[243218]: 2026-02-01 09:29:25.204526747 +0000 UTC m=+1.906614615 container exec a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:29:25 np0005604215.localdomain podman[243218]: 2026-02-01 09:29:25.235595054 +0000 UTC m=+1.937682892 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:29:25 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19716 DF PROTO=TCP SPT=49910 DPT=9101 SEQ=4115187361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662630E0000000001030307) 
Feb 01 09:29:25 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 01 09:29:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 01 09:29:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 01 09:29:26 np0005604215.localdomain sudo[243215]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:26 np0005604215.localdomain sudo[243356]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-osusvbhpgjctqbmlwvwxzxvukgcwqcja ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938166.3062727-3643-102124686282413/AnsiballZ_file.py
Feb 01 09:29:26 np0005604215.localdomain sudo[243356]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:26 np0005604215.localdomain python3.9[243358]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:26 np0005604215.localdomain sudo[243356]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 01 09:29:27 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c3c2fee87fe7e8303aaac2829f1b7d26d779101a77d8fd6a9f6bec71602d9a66-merged.mount: Deactivated successfully.
Feb 01 09:29:27 np0005604215.localdomain systemd[1]: libpod-conmon-a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.scope: Deactivated successfully.
Feb 01 09:29:27 np0005604215.localdomain sudo[243466]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-slqwtukfdugtlzbiudanujqhqhqghhph ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938167.0906901-3652-206470734042030/AnsiballZ_podman_container_info.py
Feb 01 09:29:27 np0005604215.localdomain sudo[243466]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:27 np0005604215.localdomain python3.9[243468]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Feb 01 09:29:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:29:27 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:29:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:29:28 np0005604215.localdomain sudo[243466]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:28 np0005604215.localdomain podman[243482]: 2026-02-01 09:29:28.097405308 +0000 UTC m=+0.170484670 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:29:28 np0005604215.localdomain podman[243482]: 2026-02-01 09:29:28.135677935 +0000 UTC m=+0.208757257 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:29:28 np0005604215.localdomain podman[243482]: unhealthy
Feb 01 09:29:28 np0005604215.localdomain sudo[243613]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hdkhubfcjgvhuttlokhedxgbraggiwwd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938168.275657-3660-243466947531254/AnsiballZ_podman_container_exec.py
Feb 01 09:29:28 np0005604215.localdomain sudo[243613]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:28 np0005604215.localdomain python3.9[243615]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:29:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4c416128fe28816a81362614e1a7f9e853b273ba662e28de61a85f5c6446ec2c-merged.mount: Deactivated successfully.
Feb 01 09:29:28 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-d96180a36bb10b52574296fc744e208425bb78036eb13d53db69ed84f3ab806e-merged.mount: Deactivated successfully.
Feb 01 09:29:28 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:29:28 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Failed with result 'exit-code'.
Feb 01 09:29:28 np0005604215.localdomain systemd[1]: Started libpod-conmon-1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.scope.
Feb 01 09:29:28 np0005604215.localdomain podman[243616]: 2026-02-01 09:29:28.962557522 +0000 UTC m=+0.184996774 container exec 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, container_name=openstack_network_exporter)
Feb 01 09:29:28 np0005604215.localdomain podman[243616]: 2026-02-01 09:29:28.966590594 +0000 UTC m=+0.189029816 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7)
Feb 01 09:29:29 np0005604215.localdomain sudo[243613]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:29 np0005604215.localdomain systemd[1]: libpod-conmon-1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.scope: Deactivated successfully.
Feb 01 09:29:29 np0005604215.localdomain sudo[243756]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmbloyrqfjwidepcgzlactmrjspufpig ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938169.1989334-3668-106714614843359/AnsiballZ_podman_container_exec.py
Feb 01 09:29:29 np0005604215.localdomain sudo[243756]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:29 np0005604215.localdomain python3.9[243758]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Feb 01 09:29:29 np0005604215.localdomain systemd[1]: Started libpod-conmon-1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.scope.
Feb 01 09:29:29 np0005604215.localdomain podman[243759]: 2026-02-01 09:29:29.775092591 +0000 UTC m=+0.097632219 container exec 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1769056855, config_id=openstack_network_exporter)
Feb 01 09:29:29 np0005604215.localdomain podman[243759]: 2026-02-01 09:29:29.808927033 +0000 UTC m=+0.131466651 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, release=1769056855, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, version=9.7, distribution-scope=public, name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 01 09:29:29 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b4f761d90eeb5a4c1ea51e856783cf8398e02a6caf306b90498250a43e5bbae1-merged.mount: Deactivated successfully.
Feb 01 09:29:29 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e1fac4507a16e359f79966290a44e975bb0ed717e8b6cc0e34b61e8c96e0a1a3-merged.mount: Deactivated successfully.
Feb 01 09:29:30 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19602 DF PROTO=TCP SPT=54666 DPT=9882 SEQ=1052902866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66273FD0000000001030307) 
Feb 01 09:29:31 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19603 DF PROTO=TCP SPT=54666 DPT=9882 SEQ=1052902866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66278100000000001030307) 
Feb 01 09:29:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:29:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:29:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:29:31 np0005604215.localdomain sudo[243756]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:32 np0005604215.localdomain sudo[243895]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-utjakpnyzxepkpgvztgzlacotojjpooz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938171.8937328-3676-57906994056747/AnsiballZ_file.py
Feb 01 09:29:32 np0005604215.localdomain sudo[243895]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:32 np0005604215.localdomain python3.9[243897]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:32 np0005604215.localdomain sudo[243895]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19604 DF PROTO=TCP SPT=54666 DPT=9882 SEQ=1052902866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662800D0000000001030307) 
Feb 01 09:29:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:29:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:29:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:29:33 np0005604215.localdomain systemd[1]: libpod-conmon-1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.scope: Deactivated successfully.
Feb 01 09:29:33 np0005604215.localdomain podman[243915]: 2026-02-01 09:29:33.58969657 +0000 UTC m=+0.221159315 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 01 09:29:33 np0005604215.localdomain podman[243915]: 2026-02-01 09:29:33.663652226 +0000 UTC m=+0.295114971 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 01 09:29:33 np0005604215.localdomain sudo[244030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fccxcocisetwbsrpyfnlhcuwveadsslf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938173.583651-3689-199794850309604/AnsiballZ_file.py
Feb 01 09:29:33 np0005604215.localdomain sudo[244030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:34 np0005604215.localdomain python3.9[244032]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:34 np0005604215.localdomain sudo[244030]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:34 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:34 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:29:34 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:34 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:29:34 np0005604215.localdomain podman[244050]: 2026-02-01 09:29:34.656009749 +0000 UTC m=+0.118160635 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:29:34 np0005604215.localdomain podman[244050]: 2026-02-01 09:29:34.668612593 +0000 UTC m=+0.130763479 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:29:34 np0005604215.localdomain sudo[244161]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evxtjxvfmsqbnjkmcomscfxuqxviitek ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938174.614794-3717-202549061113124/AnsiballZ_stat.py
Feb 01 09:29:34 np0005604215.localdomain sudo[244161]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:35 np0005604215.localdomain python3.9[244163]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:29:35 np0005604215.localdomain sudo[244161]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:35 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:29:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:29:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:29:36 np0005604215.localdomain sudo[244249]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hyqafjzaqcsupoibdulqotfupsugoheg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938174.614794-3717-202549061113124/AnsiballZ_copy.py
Feb 01 09:29:36 np0005604215.localdomain sudo[244249]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:36 np0005604215.localdomain python3.9[244251]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1769938174.614794-3717-202549061113124/.source.yaml _original_basename=firewall.yaml follow=False checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:36 np0005604215.localdomain sudo[244249]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65415 DF PROTO=TCP SPT=50262 DPT=9102 SEQ=3382619436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6628D8E0000000001030307) 
Feb 01 09:29:37 np0005604215.localdomain sudo[244359]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-llgjluifmelnikqecjtmouphjfyckutq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938176.7559242-3766-4135192610019/AnsiballZ_file.py
Feb 01 09:29:37 np0005604215.localdomain sudo[244359]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:37 np0005604215.localdomain python3.9[244361]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:37 np0005604215.localdomain sudo[244359]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:37 np0005604215.localdomain sudo[244469]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bheqgzriyejpqdlzkiwozdchffjxaumy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938177.5467927-3791-271750624944671/AnsiballZ_stat.py
Feb 01 09:29:37 np0005604215.localdomain sudo[244469]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:29:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-abe2e37cef3553dd7ed72567236ba15185ae0f96cf280ad9def2a9cdb2b0b4c7-merged.mount: Deactivated successfully.
Feb 01 09:29:38 np0005604215.localdomain python3.9[244471]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:29:38 np0005604215.localdomain sudo[244469]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:38 np0005604215.localdomain sudo[244526]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uofwknvfbfpujwcmhwownmubcneyuczf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938177.5467927-3791-271750624944671/AnsiballZ_file.py
Feb 01 09:29:38 np0005604215.localdomain sudo[244526]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:38 np0005604215.localdomain python3.9[244528]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:38 np0005604215.localdomain sudo[244526]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:39 np0005604215.localdomain sudo[244636]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ftckfkehkkllmvlffolkbzhtxqvkursv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938178.7395287-3825-384494140135/AnsiballZ_stat.py
Feb 01 09:29:39 np0005604215.localdomain sudo[244636]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:39 np0005604215.localdomain python3.9[244638]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:29:39 np0005604215.localdomain sudo[244636]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:39 np0005604215.localdomain sudo[244693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tlaectuhjlpxkxbvbmlskwgyookmubfj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938178.7395287-3825-384494140135/AnsiballZ_file.py
Feb 01 09:29:39 np0005604215.localdomain sudo[244693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:39 np0005604215.localdomain python3.9[244695]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.id90h_1_ recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:39 np0005604215.localdomain sudo[244693]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:39 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31015 DF PROTO=TCP SPT=52738 DPT=9100 SEQ=3196346975 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6629A0D0000000001030307) 
Feb 01 09:29:40 np0005604215.localdomain sudo[244803]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-josdrstjycfkxkcupvwiwurydwbywmdc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938179.968075-3861-99454940935358/AnsiballZ_stat.py
Feb 01 09:29:40 np0005604215.localdomain sudo[244803]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:29:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:29:40 np0005604215.localdomain python3.9[244805]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:29:40 np0005604215.localdomain sudo[244803]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:29:40 np0005604215.localdomain sudo[244860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ztfrimrbcgwysaanjsdvmdxwpkgpsvju ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938179.968075-3861-99454940935358/AnsiballZ_file.py
Feb 01 09:29:40 np0005604215.localdomain sudo[244860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:40 np0005604215.localdomain python3.9[244862]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:40 np0005604215.localdomain sudo[244860]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:29:41 np0005604215.localdomain podman[244880]: 2026-02-01 09:29:41.365630285 +0000 UTC m=+0.078692271 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Feb 01 09:29:41 np0005604215.localdomain podman[244880]: 2026-02-01 09:29:41.370405541 +0000 UTC m=+0.083467497 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 01 09:29:41 np0005604215.localdomain sudo[244987]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mrfhffzulmpysurgqsoahjfgusqjjqkb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938181.3527381-3901-196905479042186/AnsiballZ_command.py
Feb 01 09:29:41 np0005604215.localdomain sudo[244987]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:29:41.747 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:29:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:29:41.747 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:29:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:29:41.747 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:29:41 np0005604215.localdomain sshd[244990]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:29:41 np0005604215.localdomain python3.9[244989]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:29:41 np0005604215.localdomain sudo[244987]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:42 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19717 DF PROTO=TCP SPT=49910 DPT=9101 SEQ=4115187361 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662A30D0000000001030307) 
Feb 01 09:29:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:29:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:29:42 np0005604215.localdomain sshd[244990]: Invalid user mikael from 85.206.171.113 port 45258
Feb 01 09:29:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-bbf98921711ec0c598fda2e2ca2c55c79674f35f32436d92adf3bb7290153e1a-merged.mount: Deactivated successfully.
Feb 01 09:29:42 np0005604215.localdomain sudo[245112]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-klioyogslxllfjxkcvvsnizcpoaiovnc ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769938182.2206604-3924-196713602590472/AnsiballZ_edpm_nftables_from_files.py
Feb 01 09:29:42 np0005604215.localdomain sudo[245112]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:42 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:29:42 np0005604215.localdomain sshd[244990]: Received disconnect from 85.206.171.113 port 45258:11: Bye Bye [preauth]
Feb 01 09:29:42 np0005604215.localdomain sshd[244990]: Disconnected from invalid user mikael 85.206.171.113 port 45258 [preauth]
Feb 01 09:29:42 np0005604215.localdomain podman[245030]: 2026-02-01 09:29:42.689661463 +0000 UTC m=+0.442414803 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, architecture=x86_64, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, maintainer=Red Hat, Inc., release=1769056855, version=9.7, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 09:29:42 np0005604215.localdomain podman[245030]: 2026-02-01 09:29:42.702401351 +0000 UTC m=+0.455154651 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, version=9.7, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1769056855, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., 
io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Feb 01 09:29:42 np0005604215.localdomain python3[245114]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Feb 01 09:29:42 np0005604215.localdomain sudo[245112]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:43 np0005604215.localdomain sudo[245230]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tnftjccfyldccsjxlfuljidbzljxeyga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938183.1213653-3948-251533814020927/AnsiballZ_stat.py
Feb 01 09:29:43 np0005604215.localdomain sudo[245230]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:43 np0005604215.localdomain python3.9[245232]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:29:43 np0005604215.localdomain sudo[245230]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:43 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:29:43 np0005604215.localdomain sudo[245287]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sosusgllhuilotsfsatszazfhvrauvcs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938183.1213653-3948-251533814020927/AnsiballZ_file.py
Feb 01 09:29:43 np0005604215.localdomain sudo[245287]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:44 np0005604215.localdomain python3.9[245289]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:44 np0005604215.localdomain sudo[245287]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:29:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:44 np0005604215.localdomain sudo[245397]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bpvhafmtbmiuazloyjlyjbogsgwsoqne ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938184.4022686-3984-36786987858919/AnsiballZ_stat.py
Feb 01 09:29:44 np0005604215.localdomain sudo[245397]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:44 np0005604215.localdomain python3.9[245399]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:29:44 np0005604215.localdomain sudo[245397]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:45 np0005604215.localdomain sudo[245454]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpstclrknatnsisswkqmlcmsllbcsoqq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938184.4022686-3984-36786987858919/AnsiballZ_file.py
Feb 01 09:29:45 np0005604215.localdomain sudo[245454]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:45 np0005604215.localdomain python3.9[245456]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:45 np0005604215.localdomain sudo[245454]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:45 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19606 DF PROTO=TCP SPT=54666 DPT=9882 SEQ=1052902866 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662B10E0000000001030307) 
Feb 01 09:29:46 np0005604215.localdomain sudo[245564]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kjokspqsqjxkffkbnjffpkjasskbzxse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938185.6793122-4020-208843024857057/AnsiballZ_stat.py
Feb 01 09:29:46 np0005604215.localdomain sudo[245564]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:46 np0005604215.localdomain python3.9[245566]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:29:46 np0005604215.localdomain sudo[245564]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52bb44324f3eb9002a3bf4ee7b8544bc72e25676c81bb6c59a692125c71221e1-merged.mount: Deactivated successfully.
Feb 01 09:29:47 np0005604215.localdomain sudo[245621]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zyazvprpwqcnnczjuaetpmozrufqlrqx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938185.6793122-4020-208843024857057/AnsiballZ_file.py
Feb 01 09:29:47 np0005604215.localdomain sudo[245621]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-892d1779a7f946097f73616f672cd69c2781ff491e090964134e591e5adb1a86-merged.mount: Deactivated successfully.
Feb 01 09:29:47 np0005604215.localdomain python3.9[245623]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:47 np0005604215.localdomain sudo[245621]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-892d1779a7f946097f73616f672cd69c2781ff491e090964134e591e5adb1a86-merged.mount: Deactivated successfully.
Feb 01 09:29:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 01 09:29:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 01 09:29:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65417 DF PROTO=TCP SPT=50262 DPT=9102 SEQ=3382619436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662BD0D0000000001030307) 
Feb 01 09:29:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:29:48 np0005604215.localdomain podman[245641]: 2026-02-01 09:29:48.862450727 +0000 UTC m=+0.086036805 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator 
team, org.label-schema.license=GPLv2)
Feb 01 09:29:48 np0005604215.localdomain podman[245641]: 2026-02-01 09:29:48.875606609 +0000 UTC m=+0.099192737 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, 
tcib_managed=true)
Feb 01 09:29:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:29:49 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:29:49 np0005604215.localdomain sudo[245750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cdhpxvhowbvcegxylammmelobqgfahns ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938188.9733934-4056-163108484041239/AnsiballZ_stat.py
Feb 01 09:29:49 np0005604215.localdomain sudo[245750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:49 np0005604215.localdomain python3.9[245752]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:29:49 np0005604215.localdomain sudo[245750]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:49 np0005604215.localdomain sudo[245807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-igjvsgycxmizrhbftparukdersyjqtdk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938188.9733934-4056-163108484041239/AnsiballZ_file.py
Feb 01 09:29:49 np0005604215.localdomain sudo[245807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e812fa34defdc78ec5fb2b77011468829f7a2881cc57f804b1a422dc9e19278a-merged.mount: Deactivated successfully.
Feb 01 09:29:49 np0005604215.localdomain python3.9[245809]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:49 np0005604215.localdomain sudo[245807]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ac18d148f1ccb0eaa519a008e32625aabf00d458250cb02e5015187c1942ecc7-merged.mount: Deactivated successfully.
Feb 01 09:29:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786-merged.mount: Deactivated successfully.
Feb 01 09:29:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2-merged.mount: Deactivated successfully.
Feb 01 09:29:50 np0005604215.localdomain sudo[245918]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhjsegajsetqeaasxoosdunocadxowzw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938190.1951253-4092-161351601652820/AnsiballZ_stat.py
Feb 01 09:29:50 np0005604215.localdomain sudo[245918]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:50 np0005604215.localdomain python3.9[245920]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:29:50 np0005604215.localdomain sudo[245918]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:51 np0005604215.localdomain sudo[246008]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-evonudbhpbvxidkdvstgbhdtrvizfdev ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938190.1951253-4092-161351601652820/AnsiballZ_copy.py
Feb 01 09:29:51 np0005604215.localdomain sudo[246008]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-5f30d5cd30916d88e24f21a5c8313738088a285d6d2d0efec09cc705e86eb786-merged.mount: Deactivated successfully.
Feb 01 09:29:51 np0005604215.localdomain python3.9[246010]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1769938190.1951253-4092-161351601652820/.source.nft follow=False _original_basename=ruleset.j2 checksum=953266ca5f7d82d2777a0a437bd7feceb9259ee8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:51 np0005604215.localdomain sudo[246008]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595-merged.mount: Deactivated successfully.
Feb 01 09:29:52 np0005604215.localdomain sudo[246118]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gownwfvjtfkvcwryxsddvgqbiejiovva ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938191.7780263-4137-4725847174288/AnsiballZ_file.py
Feb 01 09:29:52 np0005604215.localdomain sudo[246118]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:52 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac-merged.mount: Deactivated successfully.
Feb 01 09:29:52 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=31017 DF PROTO=TCP SPT=52738 DPT=9100 SEQ=3196346975 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662CB0D0000000001030307) 
Feb 01 09:29:52 np0005604215.localdomain python3.9[246120]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:52 np0005604215.localdomain sudo[246118]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:52 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ba6f0be74a40197166410c33403600ee466dbd9d2ddae7d7f49f78c9646720b2-merged.mount: Deactivated successfully.
Feb 01 09:29:52 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-a1185e7325783fe8cba63270bc6e59299386d7c73e4bc34c560a1fbc9e6d7e2c-merged.mount: Deactivated successfully.
Feb 01 09:29:52 np0005604215.localdomain sudo[246228]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ykpygmcqqrfcfiaxjupybytrfufmllfh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938192.6736119-4160-254258745290739/AnsiballZ_command.py
Feb 01 09:29:52 np0005604215.localdomain sudo[246228]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:53 np0005604215.localdomain python3.9[246230]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:29:53 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-2cd9444c84550fbd551e3826a8110fcc009757858b99e84f1119041f2325189b-merged.mount: Deactivated successfully.
Feb 01 09:29:53 np0005604215.localdomain sudo[246228]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:53 np0005604215.localdomain sudo[246341]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lkqqupjdbxisqoxdysfkcrsfrwgjxfhi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938193.4470832-4186-82955993702481/AnsiballZ_blockinfile.py
Feb 01 09:29:53 np0005604215.localdomain sudo[246341]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:54 np0005604215.localdomain python3.9[246343]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"
                                                            include "/etc/nftables/edpm-chains.nft"
                                                            include "/etc/nftables/edpm-rules.nft"
                                                            include "/etc/nftables/edpm-jumps.nft"
                                                             path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:54 np0005604215.localdomain sudo[246341]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 01 09:29:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 01 09:29:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 01 09:29:54 np0005604215.localdomain sudo[246451]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aeiphxryjvvoyskknrdbjsrnrnsrpobq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938194.4891987-4211-41472214788077/AnsiballZ_command.py
Feb 01 09:29:54 np0005604215.localdomain sudo[246451]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:54 np0005604215.localdomain python3.9[246453]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:29:54 np0005604215.localdomain sudo[246451]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:55 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=7184 DF PROTO=TCP SPT=47710 DPT=9101 SEQ=1485413952 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662D70D0000000001030307) 
Feb 01 09:29:55 np0005604215.localdomain sudo[246562]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zhyipdsbbvekpkpmnrwxwbuzwmcfvnhj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938195.3185012-4236-256429299737859/AnsiballZ_stat.py
Feb 01 09:29:55 np0005604215.localdomain sudo[246562]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 01 09:29:55 np0005604215.localdomain python3.9[246564]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:29:55 np0005604215.localdomain sudo[246562]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:55 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-93889cfe8a1eaf916da420177b6c00eab0b6f1d6521b96229ee8963de2bbdb6f-merged.mount: Deactivated successfully.
Feb 01 09:29:56 np0005604215.localdomain sudo[246674]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ooocidqhpsrobbnygasuxsycmjhoiakv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938196.0321114-4259-276283681197533/AnsiballZ_command.py
Feb 01 09:29:56 np0005604215.localdomain sudo[246674]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:56 np0005604215.localdomain python3.9[246676]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:29:56 np0005604215.localdomain sudo[246674]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:56 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:56 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-52c18398a3f1352893ce0f0dc9f4c3a3bdf5492a6bf738875b375a7d97e85441-merged.mount: Deactivated successfully.
Feb 01 09:29:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:29:57 np0005604215.localdomain sudo[246787]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lbhcwlqcsobiztbgqjbmqhjfqncapgzx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938196.8084385-4283-95560114883908/AnsiballZ_file.py
Feb 01 09:29:57 np0005604215.localdomain sudo[246787]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:29:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:29:58 np0005604215.localdomain python3.9[246789]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:29:58 np0005604215.localdomain sudo[246787]: pam_unix(sudo:session): session closed for user root
Feb 01 09:29:58 np0005604215.localdomain sshd[240646]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:29:58 np0005604215.localdomain systemd[1]: session-56.scope: Deactivated successfully.
Feb 01 09:29:58 np0005604215.localdomain systemd[1]: session-56.scope: Consumed 26.776s CPU time.
Feb 01 09:29:58 np0005604215.localdomain systemd-logind[761]: Session 56 logged out. Waiting for processes to exit.
Feb 01 09:29:58 np0005604215.localdomain systemd-logind[761]: Removed session 56.
Feb 01 09:29:58 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:29:59 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-67ff2fcf098d662f72898d504b273725d460bc3ee224388c566fda6c94421648-merged.mount: Deactivated successfully.
Feb 01 09:29:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:29:59 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e3a7790e7cad798695025ef44722873ac2669462e661d130061be9d691861f40-merged.mount: Deactivated successfully.
Feb 01 09:29:59 np0005604215.localdomain podman[246807]: 2026-02-01 09:29:59.646350293 +0000 UTC m=+0.093933517 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:29:59 np0005604215.localdomain podman[246807]: 2026-02-01 09:29:59.655834441 +0000 UTC m=+0.103417695 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:29:59 np0005604215.localdomain podman[246807]: unhealthy
Feb 01 09:29:59 np0005604215.localdomain systemd[1]: tmp-crun.4uaFHR.mount: Deactivated successfully.
Feb 01 09:29:59 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:29:59.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:30:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:30:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 01 09:30:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 01 09:30:00 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Main process exited, code=exited, status=1/FAILURE
Feb 01 09:30:00 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Failed with result 'exit-code'.
Feb 01 09:30:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:00.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:30:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:30:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:30:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:30:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:30:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:30:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:30:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:30:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:30:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:30:02 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ba9090487930ea4ca9efbb869a950be47d0c5c3f7a5f6eb919ee0be5f322c2ce-merged.mount: Deactivated successfully.
Feb 01 09:30:02 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4fd9ea2ebfbeb4119560e74e5b0456fd618118c9f72a7ecf288a55a3e1a95413-merged.mount: Deactivated successfully.
Feb 01 09:30:02 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4fd9ea2ebfbeb4119560e74e5b0456fd618118c9f72a7ecf288a55a3e1a95413-merged.mount: Deactivated successfully.
Feb 01 09:30:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:02.991 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:30:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:03.016 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:30:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:03.016 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:30:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:03.017 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:30:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:30:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 01 09:30:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 01 09:30:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19910 DF PROTO=TCP SPT=56516 DPT=9102 SEQ=1565514684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662F6C00000000001030307) 
Feb 01 09:30:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0f91fc7b8e87158c92eb7740043cf5d022febeae010865e677c28eba378655ce-merged.mount: Deactivated successfully.
Feb 01 09:30:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:30:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f998c699a79bb0ab8f605537409d8dfabf90b90001094b51abb2cd93ea9feefe-merged.mount: Deactivated successfully.
Feb 01 09:30:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:03.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:30:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:03.996 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:30:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:03.996 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:30:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:04.009 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:30:04 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19911 DF PROTO=TCP SPT=56516 DPT=9102 SEQ=1565514684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662FACD0000000001030307) 
Feb 01 09:30:04 np0005604215.localdomain sshd[246835]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:30:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:30:04 np0005604215.localdomain sshd[246835]: Accepted publickey for zuul from 192.168.122.30 port 55024 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:30:04 np0005604215.localdomain systemd-logind[761]: New session 57 of user zuul.
Feb 01 09:30:04 np0005604215.localdomain systemd[1]: Started Session 57 of User zuul.
Feb 01 09:30:04 np0005604215.localdomain sshd[246835]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:30:04 np0005604215.localdomain podman[246837]: 2026-02-01 09:30:04.781486973 +0000 UTC m=+0.082927720 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 01 09:30:04 np0005604215.localdomain podman[246837]: 2026-02-01 09:30:04.842845075 +0000 UTC m=+0.144285842 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:30:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:04.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:30:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:04.996 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:30:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:04.996 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.022 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.023 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.023 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.023 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.024 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:30:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-530e571f2fdc2c9cc9ab61d58bf266b4766d3c3aa17392b07069c5b092adeb06-merged.mount: Deactivated successfully.
Feb 01 09:30:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-8f493ed320f2136eba98c6f6d73d7580e3273443b9599c34d1438e87453daf45-merged.mount: Deactivated successfully.
Feb 01 09:30:05 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65418 DF PROTO=TCP SPT=50262 DPT=9102 SEQ=3382619436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA662FD0D0000000001030307) 
Feb 01 09:30:05 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:25:43 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 140477 "" "Go-http-client/1.1"
Feb 01 09:30:05 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:30:05 np0005604215.localdomain podman_exporter[236841]: ts=2026-02-01T09:30:05.156Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Feb 01 09:30:05 np0005604215.localdomain podman_exporter[236841]: ts=2026-02-01T09:30:05.157Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Feb 01 09:30:05 np0005604215.localdomain podman_exporter[236841]: ts=2026-02-01T09:30:05.157Z caller=tls_config.go:316 level=info msg="TLS is disabled." http2=false address=[::]:9882
Feb 01 09:30:05 np0005604215.localdomain sudo[246990]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tfptavtkacktdwrxzitjzpfctizidpyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938204.8491-23-86602083496172/AnsiballZ_file.py
Feb 01 09:30:05 np0005604215.localdomain sudo[246990]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:30:05 np0005604215.localdomain podman[246993]: 2026-02-01 09:30:05.487123653 +0000 UTC m=+0.077586628 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:30:05 np0005604215.localdomain podman[246993]: 2026-02-01 09:30:05.527740171 +0000 UTC m=+0.118203176 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.530 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.507s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:30:05 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:30:05 np0005604215.localdomain python3.9[246992]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config/container-startup-config/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.667 225589 WARNING nova.virt.libvirt.driver [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.669 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=13098MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.669 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.669 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:30:05 np0005604215.localdomain sudo[246990]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.734 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.735 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:30:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:05.767 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:30:06 np0005604215.localdomain sudo[247144]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pitfjxsacpoxjcvohdaqwkbtkrubeies ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938205.8490722-23-216404706199868/AnsiballZ_file.py
Feb 01 09:30:06 np0005604215.localdomain sudo[247144]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:06.235 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:30:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:06.277 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:30:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:06.293 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:30:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:06.296 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:30:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:06.296 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.627s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:30:06 np0005604215.localdomain python3.9[247146]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:30:06 np0005604215.localdomain sudo[247144]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19912 DF PROTO=TCP SPT=56516 DPT=9102 SEQ=1565514684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66302CD0000000001030307) 
Feb 01 09:30:06 np0005604215.localdomain sudo[247256]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qkyesppxgkqeuerxcwimejytkmrvlcjp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938206.4831643-23-153782860106776/AnsiballZ_file.py
Feb 01 09:30:06 np0005604215.localdomain sudo[247256]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:06 np0005604215.localdomain python3.9[247258]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-sriov-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:30:07 np0005604215.localdomain sudo[247256]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:30:07.292 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:30:07 np0005604215.localdomain python3.9[247366]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:08 np0005604215.localdomain python3.9[247452]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938207.243748-101-236050600101142/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:30:09 np0005604215.localdomain python3.9[247560]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:10 np0005604215.localdomain python3.9[247646]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938208.7806091-101-100936056147069/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:30:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19913 DF PROTO=TCP SPT=56516 DPT=9102 SEQ=1565514684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663128E0000000001030307) 
Feb 01 09:30:11 np0005604215.localdomain python3.9[247754]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:12 np0005604215.localdomain python3.9[247840]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/01-neutron-sriov-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938210.6498568-101-249700732951954/.source.conf follow=False _original_basename=neutron-sriov-agent.conf.j2 checksum=0711e0aa3ee7c85c85c3e1039f4da2e49344129d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:30:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:30:12 np0005604215.localdomain podman[247858]: 2026-02-01 09:30:12.864413149 +0000 UTC m=+0.077658510 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent)
Feb 01 09:30:12 np0005604215.localdomain podman[247858]: 2026-02-01 09:30:12.900657014 +0000 UTC m=+0.113902365 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 09:30:12 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:30:14 np0005604215.localdomain python3.9[247966]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:30:14 np0005604215.localdomain systemd[1]: tmp-crun.4AWsoI.mount: Deactivated successfully.
Feb 01 09:30:14 np0005604215.localdomain podman[248053]: 2026-02-01 09:30:14.869650271 +0000 UTC m=+0.085915541 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, vcs-type=git, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855)
Feb 01 09:30:14 np0005604215.localdomain python3.9[248052]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-sriov-agent/10-neutron-sriov.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938213.8174965-275-272263216462854/.source.conf _original_basename=10-neutron-sriov.conf follow=False checksum=a74956efcd0a6873aac81fb89a0017e3332e5948 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:30:14 np0005604215.localdomain podman[248053]: 2026-02-01 09:30:14.882587895 +0000 UTC m=+0.098853185 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, release=1769056855, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:30:14 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:30:15 np0005604215.localdomain python3.9[248179]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:30:16 np0005604215.localdomain sudo[248289]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tewzmmpukhkgxjzkqzpraoyhtsxoityz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938215.9203627-347-59549273414103/AnsiballZ_file.py
Feb 01 09:30:16 np0005604215.localdomain sudo[248289]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:16 np0005604215.localdomain python3.9[248291]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:30:16 np0005604215.localdomain sudo[248289]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:16 np0005604215.localdomain sudo[248399]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dobhtaakwdrjroswjiamrzthmhwbrlqh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938216.662095-372-13818302834922/AnsiballZ_stat.py
Feb 01 09:30:16 np0005604215.localdomain sudo[248399]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:17 np0005604215.localdomain python3.9[248401]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:17 np0005604215.localdomain sudo[248399]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:17 np0005604215.localdomain sudo[248456]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gydcsteppgjklljoesrzsminsytlecbp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938216.662095-372-13818302834922/AnsiballZ_file.py
Feb 01 09:30:17 np0005604215.localdomain sudo[248456]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:17 np0005604215.localdomain python3.9[248458]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:30:17 np0005604215.localdomain sudo[248456]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:18 np0005604215.localdomain sudo[248566]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jziesczbnlzfitpprtkdwxunjexywmhs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938217.7771347-372-121426700618312/AnsiballZ_stat.py
Feb 01 09:30:18 np0005604215.localdomain sudo[248566]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:18 np0005604215.localdomain python3.9[248568]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:18 np0005604215.localdomain sudo[248566]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:18 np0005604215.localdomain sudo[248623]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mjzxaqdvqszeewvkpvwmjmthqokjsvse ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938217.7771347-372-121426700618312/AnsiballZ_file.py
Feb 01 09:30:18 np0005604215.localdomain sudo[248623]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:18 np0005604215.localdomain python3.9[248625]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:30:18 np0005604215.localdomain sudo[248623]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19914 DF PROTO=TCP SPT=56516 DPT=9102 SEQ=1565514684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663330F0000000001030307) 
Feb 01 09:30:19 np0005604215.localdomain sudo[248733]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tuvyzscfckatriguuykhjmiytwheteho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938219.1329374-440-87822407491074/AnsiballZ_file.py
Feb 01 09:30:19 np0005604215.localdomain sudo[248733]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:30:19 np0005604215.localdomain podman[248736]: 2026-02-01 09:30:19.45942101 +0000 UTC m=+0.069922733 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, 
managed_by=edpm_ansible, config_id=ceilometer_agent_compute)
Feb 01 09:30:19 np0005604215.localdomain podman[248736]: 2026-02-01 09:30:19.47352196 +0000 UTC m=+0.084023673 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 
Base Image)
Feb 01 09:30:19 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:30:19 np0005604215.localdomain python3.9[248735]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:30:19 np0005604215.localdomain sudo[248733]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:20 np0005604215.localdomain sudo[248862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-scitoyepkemdbtdpilcltmjxizvqyzly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938219.838978-464-244622402729086/AnsiballZ_stat.py
Feb 01 09:30:20 np0005604215.localdomain sudo[248862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:20 np0005604215.localdomain python3.9[248864]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:20 np0005604215.localdomain sudo[248862]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:20 np0005604215.localdomain sudo[248919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ogmxjdarlluspxhvidnhogxzgvanjjnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938219.838978-464-244622402729086/AnsiballZ_file.py
Feb 01 09:30:20 np0005604215.localdomain sudo[248919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:20 np0005604215.localdomain python3.9[248921]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:30:20 np0005604215.localdomain sudo[248919]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:21 np0005604215.localdomain sudo[249030]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pmdubwgvmjualqxckiuotdecidztqkbo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938221.033305-500-88665295267890/AnsiballZ_stat.py
Feb 01 09:30:21 np0005604215.localdomain sudo[249030]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:21 np0005604215.localdomain python3.9[249032]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:21 np0005604215.localdomain sudo[249030]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:23 np0005604215.localdomain sudo[249068]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:30:23 np0005604215.localdomain sudo[249068]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:30:23 np0005604215.localdomain sudo[249068]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:23 np0005604215.localdomain sudo[249103]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nykivskpkagiiulvouljmbsgkheenpts ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938221.033305-500-88665295267890/AnsiballZ_file.py
Feb 01 09:30:23 np0005604215.localdomain sudo[249103]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:23 np0005604215.localdomain python3.9[249107]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:30:23 np0005604215.localdomain sudo[249103]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:24 np0005604215.localdomain sudo[249215]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdaxtnxginkolmavfdonecfvsqvqszje ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938223.534441-536-264993602623372/AnsiballZ_systemd.py
Feb 01 09:30:24 np0005604215.localdomain sudo[249215]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:24 np0005604215.localdomain python3.9[249217]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:30:24 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:30:24 np0005604215.localdomain systemd-sysv-generator[249243]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:30:24 np0005604215.localdomain systemd-rc-local-generator[249240]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:30:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:30:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:24 np0005604215.localdomain sudo[249215]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:25 np0005604215.localdomain sudo[249363]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dhwbaajqbdlazbxpfwknsftgkofwncxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938225.1633055-560-37327810560283/AnsiballZ_stat.py
Feb 01 09:30:25 np0005604215.localdomain sudo[249363]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:25 np0005604215.localdomain python3.9[249365]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:25 np0005604215.localdomain sudo[249363]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:25 np0005604215.localdomain sudo[249420]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-elekzpsfqriooeopkfpnrqpfdvuugitl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938225.1633055-560-37327810560283/AnsiballZ_file.py
Feb 01 09:30:25 np0005604215.localdomain sudo[249420]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:26 np0005604215.localdomain python3.9[249422]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:30:26 np0005604215.localdomain sudo[249420]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:26 np0005604215.localdomain sudo[249530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jnfdiylgzylwjofpvbpbphpdppvwtzlw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938226.30209-596-85349179022244/AnsiballZ_stat.py
Feb 01 09:30:26 np0005604215.localdomain sudo[249530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:26 np0005604215.localdomain python3.9[249532]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:26 np0005604215.localdomain sudo[249530]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:27 np0005604215.localdomain sudo[249587]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-swgrrdppllyntultfnnhuceezrfyuwio ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938226.30209-596-85349179022244/AnsiballZ_file.py
Feb 01 09:30:27 np0005604215.localdomain sudo[249587]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:27 np0005604215.localdomain python3.9[249589]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:30:27 np0005604215.localdomain sudo[249587]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:27 np0005604215.localdomain sudo[249697]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqvdghyzhepilujilgdyylxrtjbzwdcm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938227.5852761-632-216947322715946/AnsiballZ_systemd.py
Feb 01 09:30:27 np0005604215.localdomain sudo[249697]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:28 np0005604215.localdomain python3.9[249699]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:30:28 np0005604215.localdomain systemd-sysv-generator[249728]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:30:28 np0005604215.localdomain systemd-rc-local-generator[249723]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: Starting Create netns directory...
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 01 09:30:28 np0005604215.localdomain systemd[1]: Finished Create netns directory.
Feb 01 09:30:28 np0005604215.localdomain sudo[249697]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:29 np0005604215.localdomain sudo[249848]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-anfvrlvwljpnnmmmkbjiqbksqtwqxvdo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938229.289065-662-145761960146867/AnsiballZ_file.py
Feb 01 09:30:29 np0005604215.localdomain sudo[249848]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:29 np0005604215.localdomain python3.9[249850]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:30:29 np0005604215.localdomain sudo[249848]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:30:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:30:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:30:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 142457 "" "Go-http-client/1.1"
Feb 01 09:30:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:30:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15432 "" "Go-http-client/1.1"
Feb 01 09:30:30 np0005604215.localdomain sudo[249962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eoccmlinsrphakyypegscafqiiukoyih ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938230.2362673-687-83882040911231/AnsiballZ_file.py
Feb 01 09:30:30 np0005604215.localdomain sudo[249962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:30:30 np0005604215.localdomain podman[249965]: 2026-02-01 09:30:30.605154127 +0000 UTC m=+0.085405508 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=unhealthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:30:30 np0005604215.localdomain podman[249965]: 2026-02-01 09:30:30.617728839 +0000 UTC m=+0.097980190 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:30:30 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:30:30 np0005604215.localdomain python3.9[249964]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:30:30 np0005604215.localdomain sudo[249962]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:31 np0005604215.localdomain sudo[250095]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgabsxmpahxzdaydltsouofoigvlkxgj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938230.94414-710-113527581641549/AnsiballZ_stat.py
Feb 01 09:30:31 np0005604215.localdomain sudo[250095]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:31 np0005604215.localdomain python3.9[250097]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_sriov_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:31 np0005604215.localdomain sudo[250095]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:30:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:30:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:30:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:30:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:30:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:30:31 np0005604215.localdomain sudo[250183]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dfvuychsvmwuxnhdytgyosruvholpysq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938230.94414-710-113527581641549/AnsiballZ_copy.py
Feb 01 09:30:31 np0005604215.localdomain sudo[250183]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:31 np0005604215.localdomain python3.9[250185]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_sriov_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769938230.94414-710-113527581641549/.source.json _original_basename=.hahx198j follow=False checksum=a32073fdba4733b9ffe872cfb91708eff83a585a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:30:31 np0005604215.localdomain sudo[250183]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:32 np0005604215.localdomain python3.9[250293]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:30:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2994 DF PROTO=TCP SPT=58216 DPT=9102 SEQ=319385334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6636BF10000000001030307) 
Feb 01 09:30:34 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2995 DF PROTO=TCP SPT=58216 DPT=9102 SEQ=319385334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663700E0000000001030307) 
Feb 01 09:30:34 np0005604215.localdomain sudo[250595]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wbpwdswuplthnngvazqtlndwkwyrqqre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938234.3869076-830-7759416104050/AnsiballZ_container_config_data.py
Feb 01 09:30:34 np0005604215.localdomain sudo[250595]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:34 np0005604215.localdomain python3.9[250597]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_pattern=*.json debug=False
Feb 01 09:30:34 np0005604215.localdomain sudo[250595]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:35 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19915 DF PROTO=TCP SPT=56516 DPT=9102 SEQ=1565514684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663730D0000000001030307) 
Feb 01 09:30:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:30:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:30:35 np0005604215.localdomain podman[250609]: 2026-02-01 09:30:35.867909403 +0000 UTC m=+0.080031658 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:30:35 np0005604215.localdomain podman[250612]: 2026-02-01 09:30:35.952881418 +0000 UTC m=+0.162411616 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:30:35 np0005604215.localdomain podman[250609]: 2026-02-01 09:30:35.964645506 +0000 UTC m=+0.176767721 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:30:35 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:30:36 np0005604215.localdomain podman[250612]: 2026-02-01 09:30:36.015608824 +0000 UTC m=+0.225139012 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:30:36 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:30:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2996 DF PROTO=TCP SPT=58216 DPT=9102 SEQ=319385334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663780D0000000001030307) 
Feb 01 09:30:36 np0005604215.localdomain sudo[250753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-usskuqgsrrybplbaqsrrkgaxgpmvwssu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938236.269754-864-213177878355541/AnsiballZ_container_config_hash.py
Feb 01 09:30:36 np0005604215.localdomain sudo[250753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:36 np0005604215.localdomain python3.9[250755]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:30:36 np0005604215.localdomain sudo[250753]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:37 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65419 DF PROTO=TCP SPT=50262 DPT=9102 SEQ=3382619436 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6637B0E0000000001030307) 
Feb 01 09:30:38 np0005604215.localdomain sudo[250863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wxfqbeigabajsovhsjbmplmbriybgohh ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769938237.3344843-893-43458201604958/AnsiballZ_edpm_container_manage.py
Feb 01 09:30:38 np0005604215.localdomain sudo[250863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:39 np0005604215.localdomain python3[250865]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_sriov_agent config_id=neutron_sriov_agent config_overrides={} config_patterns=*.json containers=['neutron_sriov_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:30:39 np0005604215.localdomain podman[250901]: 
Feb 01 09:30:39 np0005604215.localdomain podman[250901]: 2026-02-01 09:30:39.353744954 +0000 UTC m=+0.079904575 container create 521369efe03bd350bddc08a55e0a279c01928f7a9d9eaef6d6d9292f24ef4891 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-92014ee3c62b3d5f146d1dca0b039f1231ba53c2115cfac0921365576bf44e2c'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:30:39 np0005604215.localdomain podman[250901]: 2026-02-01 09:30:39.310826414 +0000 UTC m=+0.036986065 image pull  quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 01 09:30:39 np0005604215.localdomain python3[250865]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_sriov_agent --conmon-pidfile /run/neutron_sriov_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-92014ee3c62b3d5f146d1dca0b039f1231ba53c2115cfac0921365576bf44e2c --label config_id=neutron_sriov_agent --label container_name=neutron_sriov_agent --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-92014ee3c62b3d5f146d1dca0b039f1231ba53c2115cfac0921365576bf44e2c'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user neutron --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified
Feb 01 09:30:39 np0005604215.localdomain sudo[250863]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:39 np0005604215.localdomain auditd[727]: Audit daemon rotating log files
Feb 01 09:30:40 np0005604215.localdomain sudo[251046]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ymwqvvmyyismngyeasoiqjlygetfuacs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938239.7522483-917-180657376504352/AnsiballZ_stat.py
Feb 01 09:30:40 np0005604215.localdomain sudo[251046]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:40 np0005604215.localdomain python3.9[251048]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:30:40 np0005604215.localdomain sudo[251046]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:40 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2997 DF PROTO=TCP SPT=58216 DPT=9102 SEQ=319385334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66387CD0000000001030307) 
Feb 01 09:30:40 np0005604215.localdomain sudo[251158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpogxpyclykqwqvabskynaswtpsidqxs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938240.5747306-944-151753157737394/AnsiballZ_file.py
Feb 01 09:30:40 np0005604215.localdomain sudo[251158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:41 np0005604215.localdomain python3.9[251160]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:30:41 np0005604215.localdomain sudo[251158]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:41 np0005604215.localdomain sudo[251213]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-smnfymjpgmywnoaibwahontbqdsemodd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938240.5747306-944-151753157737394/AnsiballZ_stat.py
Feb 01 09:30:41 np0005604215.localdomain sudo[251213]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:41 np0005604215.localdomain python3.9[251215]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_sriov_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:30:41 np0005604215.localdomain sudo[251213]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:30:41.748 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:30:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:30:41.749 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:30:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:30:41.749 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:30:42 np0005604215.localdomain sudo[251322]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gryrzcsnykgwhrfxcahvcudrqozqhbqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938241.5803125-944-70358029612621/AnsiballZ_copy.py
Feb 01 09:30:42 np0005604215.localdomain sudo[251322]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:42 np0005604215.localdomain python3.9[251324]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769938241.5803125-944-70358029612621/source dest=/etc/systemd/system/edpm_neutron_sriov_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:30:42 np0005604215.localdomain sudo[251322]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:42 np0005604215.localdomain sudo[251377]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hflldwwpcjzyldduhcgaskwhfqbheaeg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938241.5803125-944-70358029612621/AnsiballZ_systemd.py
Feb 01 09:30:42 np0005604215.localdomain sudo[251377]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:42 np0005604215.localdomain python3.9[251379]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:30:42 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:30:42 np0005604215.localdomain systemd-sysv-generator[251408]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:30:42 np0005604215.localdomain systemd-rc-local-generator[251402]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:30:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:42 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:30:43 np0005604215.localdomain sudo[251377]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:43 np0005604215.localdomain podman[251416]: 2026-02-01 09:30:43.232549522 +0000 UTC m=+0.087845780 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 01 09:30:43 np0005604215.localdomain podman[251416]: 2026-02-01 09:30:43.269827295 +0000 UTC m=+0.125123623 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:30:43 np0005604215.localdomain sudo[251486]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtgbyjliyinxjxijnkwudlseqnwajwaz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938241.5803125-944-70358029612621/AnsiballZ_systemd.py
Feb 01 09:30:43 np0005604215.localdomain sudo[251486]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:43 np0005604215.localdomain python3.9[251488]: ansible-systemd Invoked with state=restarted name=edpm_neutron_sriov_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:30:43 np0005604215.localdomain systemd-rc-local-generator[251516]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:30:43 np0005604215.localdomain systemd-sysv-generator[251519]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:43 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:30:44 np0005604215.localdomain systemd[1]: Starting neutron_sriov_agent container...
Feb 01 09:30:44 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:30:44 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c19db9a1c30f9b931e44b1c23ff04fd048a4b5218c0b521ac43c5f273eeaa49c/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 01 09:30:44 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c19db9a1c30f9b931e44b1c23ff04fd048a4b5218c0b521ac43c5f273eeaa49c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:30:44 np0005604215.localdomain podman[251528]: 2026-02-01 09:30:44.250118591 +0000 UTC m=+0.122930069 container init 521369efe03bd350bddc08a55e0a279c01928f7a9d9eaef6d6d9292f24ef4891 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-92014ee3c62b3d5f146d1dca0b039f1231ba53c2115cfac0921365576bf44e2c'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, managed_by=edpm_ansible)
Feb 01 09:30:44 np0005604215.localdomain podman[251528]: 2026-02-01 09:30:44.262357093 +0000 UTC m=+0.135168641 container start 521369efe03bd350bddc08a55e0a279c01928f7a9d9eaef6d6d9292f24ef4891 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=neutron_sriov_agent, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-92014ee3c62b3d5f146d1dca0b039f1231ba53c2115cfac0921365576bf44e2c'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_sriov_agent, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 01 09:30:44 np0005604215.localdomain podman[251528]: neutron_sriov_agent
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: + sudo -E kolla_set_configs
Feb 01 09:30:44 np0005604215.localdomain systemd[1]: Started neutron_sriov_agent container.
Feb 01 09:30:44 np0005604215.localdomain sudo[251486]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Validating config file
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Copying service configuration files
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Writing out command to execute
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: ++ cat /run_command
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: + CMD=/usr/bin/neutron-sriov-nic-agent
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: + ARGS=
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: + sudo kolla_copy_cacerts
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: + [[ ! -n '' ]]
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: + . kolla_extend_start
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: + umask 0022
Feb 01 09:30:44 np0005604215.localdomain neutron_sriov_agent[251542]: + exec /usr/bin/neutron-sriov-nic-agent
Feb 01 09:30:45 np0005604215.localdomain python3.9[251664]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 01 09:30:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:30:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 5433 writes, 23K keys, 5433 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5433 writes, 751 syncs, 7.23 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 09:30:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:30:45 np0005604215.localdomain systemd[1]: tmp-crun.Mwqkdy.mount: Deactivated successfully.
Feb 01 09:30:45 np0005604215.localdomain podman[251682]: 2026-02-01 09:30:45.873340369 +0000 UTC m=+0.090412176 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, 
com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc.)
Feb 01 09:30:45 np0005604215.localdomain podman[251682]: 2026-02-01 09:30:45.883576652 +0000 UTC m=+0.100648429 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, vendor=Red Hat, Inc., container_name=openstack_network_exporter, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, io.openshift.tags=minimal rhel9)
Feb 01 09:30:45 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:30:45 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:45.939 2 INFO neutron.common.config [-] Logging enabled!
Feb 01 09:30:45 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:45.939 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44
Feb 01 09:30:45 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:45.939 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Feb 01 09:30:45 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:45.939 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Feb 01 09:30:45 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:45.939 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Feb 01 09:30:45 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:45.940 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Feb 01 09:30:45 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:45.940 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005604215.localdomain'}
Feb 01 09:30:45 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:45.940 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-86d6ffca-9bd7-4d35-a743-ea5092aeab08 - - - - - -] RPC agent_id: nic-switch-agent.np0005604215.localdomain
Feb 01 09:30:45 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:45.945 2 INFO neutron.agent.agent_extensions_manager [None req-86d6ffca-9bd7-4d35-a743-ea5092aeab08 - - - - - -] Loaded agent extensions: ['qos']
Feb 01 09:30:45 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:45.945 2 INFO neutron.agent.agent_extensions_manager [None req-86d6ffca-9bd7-4d35-a743-ea5092aeab08 - - - - - -] Initializing agent extension 'qos'
Feb 01 09:30:46 np0005604215.localdomain sudo[251794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntsvrrbsswqwxpiwmavzzbrnapzdxshh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938245.9698136-1079-74085641582447/AnsiballZ_stat.py
Feb 01 09:30:46 np0005604215.localdomain sudo[251794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:46 np0005604215.localdomain python3.9[251796]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:30:46 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:46.462 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-86d6ffca-9bd7-4d35-a743-ea5092aeab08 - - - - - -] Agent initialized successfully, now running... 
Feb 01 09:30:46 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:46.462 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-86d6ffca-9bd7-4d35-a743-ea5092aeab08 - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Feb 01 09:30:46 np0005604215.localdomain neutron_sriov_agent[251542]: 2026-02-01 09:30:46.463 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-86d6ffca-9bd7-4d35-a743-ea5092aeab08 - - - - - -] Agent out of sync with plugin!
Feb 01 09:30:46 np0005604215.localdomain sudo[251794]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:46 np0005604215.localdomain sudo[251884]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nlnmzeziutbskdrzptnmhnsbvzjltige ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938245.9698136-1079-74085641582447/AnsiballZ_copy.py
Feb 01 09:30:46 np0005604215.localdomain sudo[251884]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:47 np0005604215.localdomain python3.9[251886]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769938245.9698136-1079-74085641582447/.source.yaml _original_basename=.kpzqw_th follow=False checksum=b3cbbb2fba8ac1ae44c39a232429364988d5d801 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:30:47 np0005604215.localdomain sudo[251884]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:47 np0005604215.localdomain sudo[251994]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qbjjbplqmefwmalxuausyrgswqbwycdp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938247.312771-1125-19647589921170/AnsiballZ_systemd.py
Feb 01 09:30:47 np0005604215.localdomain sudo[251994]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:47 np0005604215.localdomain python3.9[251996]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_sriov_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:30:47 np0005604215.localdomain systemd[1]: Stopping neutron_sriov_agent container...
Feb 01 09:30:48 np0005604215.localdomain systemd[1]: tmp-crun.euaI2v.mount: Deactivated successfully.
Feb 01 09:30:48 np0005604215.localdomain systemd[1]: libpod-521369efe03bd350bddc08a55e0a279c01928f7a9d9eaef6d6d9292f24ef4891.scope: Deactivated successfully.
Feb 01 09:30:48 np0005604215.localdomain systemd[1]: libpod-521369efe03bd350bddc08a55e0a279c01928f7a9d9eaef6d6d9292f24ef4891.scope: Consumed 1.771s CPU time.
Feb 01 09:30:48 np0005604215.localdomain podman[252000]: 2026-02-01 09:30:48.059244116 +0000 UTC m=+0.086092188 container died 521369efe03bd350bddc08a55e0a279c01928f7a9d9eaef6d6d9292f24ef4891 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-92014ee3c62b3d5f146d1dca0b039f1231ba53c2115cfac0921365576bf44e2c'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:30:48 np0005604215.localdomain systemd[1]: tmp-crun.AHsFCG.mount: Deactivated successfully.
Feb 01 09:30:48 np0005604215.localdomain podman[252000]: 2026-02-01 09:30:48.113693778 +0000 UTC m=+0.140541780 container cleanup 521369efe03bd350bddc08a55e0a279c01928f7a9d9eaef6d6d9292f24ef4891 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-92014ee3c62b3d5f146d1dca0b039f1231ba53c2115cfac0921365576bf44e2c'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']})
Feb 01 09:30:48 np0005604215.localdomain podman[252000]: neutron_sriov_agent
Feb 01 09:30:48 np0005604215.localdomain podman[252027]: 2026-02-01 09:30:48.194618082 +0000 UTC m=+0.052932357 container cleanup 521369efe03bd350bddc08a55e0a279c01928f7a9d9eaef6d6d9292f24ef4891 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, config_id=neutron_sriov_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-92014ee3c62b3d5f146d1dca0b039f1231ba53c2115cfac0921365576bf44e2c'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_sriov_agent, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 01 09:30:48 np0005604215.localdomain podman[252027]: neutron_sriov_agent
Feb 01 09:30:48 np0005604215.localdomain systemd[1]: edpm_neutron_sriov_agent.service: Deactivated successfully.
Feb 01 09:30:48 np0005604215.localdomain systemd[1]: Stopped neutron_sriov_agent container.
Feb 01 09:30:48 np0005604215.localdomain systemd[1]: Starting neutron_sriov_agent container...
Feb 01 09:30:48 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:30:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c19db9a1c30f9b931e44b1c23ff04fd048a4b5218c0b521ac43c5f273eeaa49c/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 01 09:30:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c19db9a1c30f9b931e44b1c23ff04fd048a4b5218c0b521ac43c5f273eeaa49c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:30:48 np0005604215.localdomain podman[252039]: 2026-02-01 09:30:48.331269635 +0000 UTC m=+0.103876935 container init 521369efe03bd350bddc08a55e0a279c01928f7a9d9eaef6d6d9292f24ef4891 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, container_name=neutron_sriov_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-92014ee3c62b3d5f146d1dca0b039f1231ba53c2115cfac0921365576bf44e2c'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=neutron_sriov_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 01 09:30:48 np0005604215.localdomain podman[252039]: 2026-02-01 09:30:48.339844268 +0000 UTC m=+0.112451558 container start 521369efe03bd350bddc08a55e0a279c01928f7a9d9eaef6d6d9292f24ef4891 (image=quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified, name=neutron_sriov_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-92014ee3c62b3d5f146d1dca0b039f1231ba53c2115cfac0921365576bf44e2c'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-sriov-agent:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'neutron', 'volumes': ['/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/openstack/neutron-sriov-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_sriov_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/neutron-sriov/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=neutron_sriov_agent, container_name=neutron_sriov_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 09:30:48 np0005604215.localdomain podman[252039]: neutron_sriov_agent
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: + sudo -E kolla_set_configs
Feb 01 09:30:48 np0005604215.localdomain systemd[1]: Started neutron_sriov_agent container.
Feb 01 09:30:48 np0005604215.localdomain sudo[251994]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Validating config file
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Copying service configuration files
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Writing out command to execute
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: ++ cat /run_command
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: + CMD=/usr/bin/neutron-sriov-nic-agent
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: + ARGS=
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: + sudo kolla_copy_cacerts
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: + [[ ! -n '' ]]
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: + . kolla_extend_start
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: Running command: '/usr/bin/neutron-sriov-nic-agent'
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: + echo 'Running command: '\''/usr/bin/neutron-sriov-nic-agent'\'''
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: + umask 0022
Feb 01 09:30:48 np0005604215.localdomain neutron_sriov_agent[252054]: + exec /usr/bin/neutron-sriov-nic-agent
Feb 01 09:30:48 np0005604215.localdomain sshd[246835]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:30:48 np0005604215.localdomain systemd-logind[761]: Session 57 logged out. Waiting for processes to exit.
Feb 01 09:30:48 np0005604215.localdomain systemd[1]: session-57.scope: Deactivated successfully.
Feb 01 09:30:48 np0005604215.localdomain systemd[1]: session-57.scope: Consumed 22.557s CPU time.
Feb 01 09:30:48 np0005604215.localdomain systemd-logind[761]: Removed session 57.
Feb 01 09:30:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2998 DF PROTO=TCP SPT=58216 DPT=9102 SEQ=319385334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663A90D0000000001030307) 
Feb 01 09:30:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:30:49 np0005604215.localdomain systemd[1]: tmp-crun.Wxfmhw.mount: Deactivated successfully.
Feb 01 09:30:49 np0005604215.localdomain podman[252086]: 2026-02-01 09:30:49.874437725 +0000 UTC m=+0.091115417 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 01 09:30:49 np0005604215.localdomain podman[252086]: 2026-02-01 09:30:49.886682938 +0000 UTC m=+0.103360620 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:30:49 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.009 2 INFO neutron.common.config [-] Logging enabled!
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.009 2 INFO neutron.common.config [-] /usr/bin/neutron-sriov-nic-agent version 22.2.2.dev44
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.009 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Physical Devices mappings: {'dummy_sriov_net': ['dummy-dev']}
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.009 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Exclude Devices: {}
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.009 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider bandwidths: {}
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.009 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider inventory defaults: {'allocation_ratio': 1.0, 'min_unit': 1, 'step_size': 1, 'reserved': 0}
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.010 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [-] Resource provider hypervisors: {'dummy-dev': 'np0005604215.localdomain'}
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.010 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-92b1031d-f3e6-4d59-90ee-0d9c32b77b51 - - - - - -] RPC agent_id: nic-switch-agent.np0005604215.localdomain
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.014 2 INFO neutron.agent.agent_extensions_manager [None req-92b1031d-f3e6-4d59-90ee-0d9c32b77b51 - - - - - -] Loaded agent extensions: ['qos']
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.015 2 INFO neutron.agent.agent_extensions_manager [None req-92b1031d-f3e6-4d59-90ee-0d9c32b77b51 - - - - - -] Initializing agent extension 'qos'
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.193 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-92b1031d-f3e6-4d59-90ee-0d9c32b77b51 - - - - - -] Agent initialized successfully, now running... 
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.194 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-92b1031d-f3e6-4d59-90ee-0d9c32b77b51 - - - - - -] SRIOV NIC Agent RPC Daemon Started!
Feb 01 09:30:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:30:50.196 2 INFO neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [None req-92b1031d-f3e6-4d59-90ee-0d9c32b77b51 - - - - - -] Agent out of sync with plugin!
Feb 01 09:30:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:30:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 6600.1 total, 600.0 interval
                                                          Cumulative writes: 5223 writes, 23K keys, 5223 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5223 writes, 658 syncs, 7.94 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                          Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 09:30:54 np0005604215.localdomain sshd[252106]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:30:54 np0005604215.localdomain sshd[252106]: Accepted publickey for zuul from 192.168.122.30 port 44058 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:30:54 np0005604215.localdomain systemd-logind[761]: New session 58 of user zuul.
Feb 01 09:30:54 np0005604215.localdomain systemd[1]: Started Session 58 of User zuul.
Feb 01 09:30:54 np0005604215.localdomain sshd[252106]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:30:55 np0005604215.localdomain python3.9[252217]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:30:56 np0005604215.localdomain sudo[252329]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hnivdfeueihsmjswojgunbibkbgcoujn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938256.3696709-62-143436439191995/AnsiballZ_setup.py
Feb 01 09:30:56 np0005604215.localdomain sudo[252329]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:56 np0005604215.localdomain python3.9[252331]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:30:57 np0005604215.localdomain sudo[252329]: pam_unix(sudo:session): session closed for user root
Feb 01 09:30:58 np0005604215.localdomain sudo[252392]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wcclsbilyllcmjgxwfdinojzvvwntrkl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938256.3696709-62-143436439191995/AnsiballZ_dnf.py
Feb 01 09:30:58 np0005604215.localdomain sudo[252392]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:30:58 np0005604215.localdomain python3.9[252394]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch3.3'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:31:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:31:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:31:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:31:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144458 "" "Go-http-client/1.1"
Feb 01 09:31:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:31:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15876 "" "Go-http-client/1.1"
Feb 01 09:31:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:31:00 np0005604215.localdomain systemd[1]: tmp-crun.kIQDMh.mount: Deactivated successfully.
Feb 01 09:31:00 np0005604215.localdomain podman[252397]: 2026-02-01 09:31:00.885077032 +0000 UTC m=+0.097499586 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:31:00 np0005604215.localdomain podman[252397]: 2026-02-01 09:31:00.896718306 +0000 UTC m=+0.109140860 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:31:00 np0005604215.localdomain sshd[252421]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:31:00 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:31:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:00.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:31:00 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:00.997 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:31:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:31:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:31:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:31:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:31:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:31:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:31:01 np0005604215.localdomain sshd[252421]: Invalid user user14 from 85.206.171.113 port 43568
Feb 01 09:31:01 np0005604215.localdomain sshd[252421]: Received disconnect from 85.206.171.113 port 43568:11: Bye Bye [preauth]
Feb 01 09:31:01 np0005604215.localdomain sshd[252421]: Disconnected from invalid user user14 85.206.171.113 port 43568 [preauth]
Feb 01 09:31:01 np0005604215.localdomain sudo[252392]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:02 np0005604215.localdomain sudo[252530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxmfrmpxdyojrtdrrueushlivsvehbdh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938262.2628787-98-152410047431320/AnsiballZ_systemd.py
Feb 01 09:31:02 np0005604215.localdomain sudo[252530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:02.996 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:31:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:02.997 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:31:03 np0005604215.localdomain python3.9[252532]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Feb 01 09:31:03 np0005604215.localdomain sudo[252530]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:31:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:31:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57330 DF PROTO=TCP SPT=45564 DPT=9102 SEQ=579589686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663E1210000000001030307) 
Feb 01 09:31:04 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57331 DF PROTO=TCP SPT=45564 DPT=9102 SEQ=579589686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663E50D0000000001030307) 
Feb 01 09:31:04 np0005604215.localdomain sudo[252643]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pwucguieosgqmpspxmcsziyaqwnnsicn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938264.564046-125-60314708273659/AnsiballZ_file.py
Feb 01 09:31:04 np0005604215.localdomain sudo[252643]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:04.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:31:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:04.996 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:31:05 np0005604215.localdomain python3.9[252645]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/container-startup-config setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:05 np0005604215.localdomain sudo[252643]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:05 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=2999 DF PROTO=TCP SPT=58216 DPT=9102 SEQ=319385334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663E90E0000000001030307) 
Feb 01 09:31:05 np0005604215.localdomain sudo[252753]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jqhjwkfbiqhgasbqwjftvxbkylrnreqn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938265.302506-125-174760799292199/AnsiballZ_file.py
Feb 01 09:31:05 np0005604215.localdomain sudo[252753]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:05 np0005604215.localdomain python3.9[252755]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:05 np0005604215.localdomain sudo[252753]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:05.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:31:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:05.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:31:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:05.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:31:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:06.011 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:31:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:06.011 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:31:06 np0005604215.localdomain sudo[252863]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mpbcnpdqpzlqqmqqckjvsqrxgkioeowo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938265.940035-125-183430369932642/AnsiballZ_file.py
Feb 01 09:31:06 np0005604215.localdomain sudo[252863]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:31:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:31:06 np0005604215.localdomain podman[252867]: 2026-02-01 09:31:06.30671702 +0000 UTC m=+0.085444080 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:31:06 np0005604215.localdomain podman[252867]: 2026-02-01 09:31:06.318707294 +0000 UTC m=+0.097434384 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:31:06 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:31:06 np0005604215.localdomain podman[252866]: 2026-02-01 09:31:06.414268692 +0000 UTC m=+0.192696232 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 01 09:31:06 np0005604215.localdomain python3.9[252865]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/neutron-dhcp-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:06 np0005604215.localdomain sudo[252863]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:06 np0005604215.localdomain podman[252866]: 2026-02-01 09:31:06.486709975 +0000 UTC m=+0.265137555 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 01 09:31:06 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:31:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57332 DF PROTO=TCP SPT=45564 DPT=9102 SEQ=579589686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663ED0D0000000001030307) 
Feb 01 09:31:06 np0005604215.localdomain sudo[253020]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iwpnirrtkrbtaahtioqhsfouurdhxncl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938266.5851831-125-133145877591957/AnsiballZ_file.py
Feb 01 09:31:06 np0005604215.localdomain sudo[253020]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:06.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:31:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:06.996 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.020 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.020 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.021 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.021 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.022 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:31:07 np0005604215.localdomain python3.9[253022]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:07 np0005604215.localdomain sudo[253020]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.450 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.429s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:31:07 np0005604215.localdomain sudo[253152]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcbijibeoqahljnmdhodftdbuotritvm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938267.2329147-125-151238044917840/AnsiballZ_file.py
Feb 01 09:31:07 np0005604215.localdomain sudo[253152]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:07 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=19916 DF PROTO=TCP SPT=56516 DPT=9102 SEQ=1565514684 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663F10D0000000001030307) 
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.658 225589 WARNING nova.virt.libvirt.driver [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.659 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=13035MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.660 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.660 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:31:07 np0005604215.localdomain python3.9[253154]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:07 np0005604215.localdomain sudo[253152]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.737 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.738 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:31:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:07.760 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:31:08 np0005604215.localdomain sudo[253282]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ejoysvlihukqhyeoqlhykhqeztivoizx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938267.8277657-125-187623722552100/AnsiballZ_file.py
Feb 01 09:31:08 np0005604215.localdomain sudo[253282]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:08.185 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.425s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:31:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:08.193 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:31:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:08.213 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:31:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:08.216 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:31:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:31:08.217 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:31:08 np0005604215.localdomain python3.9[253284]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ns-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:08 np0005604215.localdomain sudo[253282]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:08 np0005604215.localdomain sudo[253394]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nrsnlnudseyjbbjigkdzwzqjppnfvega ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938268.4407303-125-179150145068601/AnsiballZ_file.py
Feb 01 09:31:08 np0005604215.localdomain sudo[253394]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:08 np0005604215.localdomain python3.9[253396]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:08 np0005604215.localdomain sudo[253394]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:09 np0005604215.localdomain sudo[253504]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uclbjkisjmjfnkdzoxeeygjwsjpjttwg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938269.5048542-275-51986481746936/AnsiballZ_stat.py
Feb 01 09:31:09 np0005604215.localdomain sudo[253504]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:10 np0005604215.localdomain python3.9[253506]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:10 np0005604215.localdomain sudo[253504]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57333 DF PROTO=TCP SPT=45564 DPT=9102 SEQ=579589686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA663FCCD0000000001030307) 
Feb 01 09:31:10 np0005604215.localdomain sudo[253592]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-azcmjbkdwvlukcetzhmcbsgtvkwmlwsk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938269.5048542-275-51986481746936/AnsiballZ_copy.py
Feb 01 09:31:10 np0005604215.localdomain sudo[253592]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:10 np0005604215.localdomain python3.9[253594]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/container-startup-config/neutron_dhcp_agent.yaml mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938269.5048542-275-51986481746936/.source.yaml follow=False _original_basename=neutron_dhcp_agent.yaml.j2 checksum=472c5e922ae22c8bdcaef73d1ca73ce5597b440e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:10 np0005604215.localdomain sudo[253592]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:11 np0005604215.localdomain python3.9[253702]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:12 np0005604215.localdomain python3.9[253788]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938271.1218581-320-66902994009648/.source.conf follow=False _original_basename=neutron.conf.j2 checksum=24e013b64eb8be4a13596c6ffccbd94df7442bd2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:31:13 np0005604215.localdomain systemd[1]: tmp-crun.RatLOy.mount: Deactivated successfully.
Feb 01 09:31:13 np0005604215.localdomain podman[253897]: 2026-02-01 09:31:13.879516437 +0000 UTC m=+0.093581899 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 09:31:13 np0005604215.localdomain podman[253897]: 2026-02-01 09:31:13.909991459 +0000 UTC m=+0.124056971 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 01 09:31:13 np0005604215.localdomain python3.9[253896]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:13 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:31:14 np0005604215.localdomain python3.9[254000]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938272.3197813-320-1082298037086/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:15 np0005604215.localdomain python3.9[254108]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:16 np0005604215.localdomain python3.9[254194]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/01-neutron-dhcp-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938274.5805-320-49681259680869/.source.conf follow=False _original_basename=neutron-dhcp-agent.conf.j2 checksum=1165b10d39ffebe4cf306f978262ccf67cc9110d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:31:16 np0005604215.localdomain systemd[1]: tmp-crun.0AQkyz.mount: Deactivated successfully.
Feb 01 09:31:16 np0005604215.localdomain podman[254212]: 2026-02-01 09:31:16.875954048 +0000 UTC m=+0.085571524 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 01 09:31:16 np0005604215.localdomain podman[254212]: 2026-02-01 09:31:16.910349865 +0000 UTC m=+0.119967341 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, name=ubi9/ubi-minimal, version=9.7, distribution-scope=public, config_id=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Feb 01 09:31:16 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:31:17 np0005604215.localdomain python3.9[254322]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:17 np0005604215.localdomain python3.9[254408]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/neutron-dhcp-agent/10-neutron-dhcp.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938276.992342-495-196133821449027/.source.conf _original_basename=10-neutron-dhcp.conf follow=False checksum=a74956efcd0a6873aac81fb89a0017e3332e5948 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:18 np0005604215.localdomain python3.9[254516]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57334 DF PROTO=TCP SPT=45564 DPT=9102 SEQ=579589686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6641D0D0000000001030307) 
Feb 01 09:31:19 np0005604215.localdomain python3.9[254602]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938278.2622726-539-117828673137640/.source follow=False _original_basename=haproxy.j2 checksum=eddfecb822bb60e7241db0fd719c7552d2d25452 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:19 np0005604215.localdomain python3.9[254710]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:20 np0005604215.localdomain python3.9[254796]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/dhcp_agent_dnsmasq_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938279.4214728-539-134782723114566/.source follow=False _original_basename=dnsmasq.j2 checksum=a6b8b2fb47e7419d250eaee9e3565b13fff8f42e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:31:20 np0005604215.localdomain podman[254841]: 2026-02-01 09:31:20.873690434 +0000 UTC m=+0.088097308 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 09:31:20 np0005604215.localdomain podman[254841]: 2026-02-01 09:31:20.884269548 +0000 UTC m=+0.098676422 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 01 09:31:20 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:31:21 np0005604215.localdomain python3.9[254922]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:21 np0005604215.localdomain python3.9[254977]: ansible-ansible.legacy.file Invoked with mode=0755 setype=container_file_t dest=/var/lib/neutron/kill_scripts/haproxy-kill _original_basename=kill-script.j2 recurse=False state=file path=/var/lib/neutron/kill_scripts/haproxy-kill force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:22 np0005604215.localdomain python3.9[255085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/dnsmasq-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:22 np0005604215.localdomain python3.9[255171]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/dnsmasq-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938281.8483894-626-83065069201312/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:23 np0005604215.localdomain sudo[255189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:31:23 np0005604215.localdomain sudo[255189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:31:23 np0005604215.localdomain sudo[255189]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:23 np0005604215.localdomain sudo[255224]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:31:23 np0005604215.localdomain sudo[255224]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:31:23 np0005604215.localdomain python3.9[255315]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:31:23 np0005604215.localdomain sudo[255224]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:24 np0005604215.localdomain sudo[255457]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jacuihdvnbfbsjzhnznxzhuebyvvcdgo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938283.863275-732-50430272600173/AnsiballZ_file.py
Feb 01 09:31:24 np0005604215.localdomain sudo[255457]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:24 np0005604215.localdomain python3.9[255459]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:24 np0005604215.localdomain sudo[255457]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:24 np0005604215.localdomain sudo[255493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:31:24 np0005604215.localdomain sudo[255493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:31:24 np0005604215.localdomain sudo[255493]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:24 np0005604215.localdomain sudo[255585]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pphfrlsbtxvmqxcuyvzmtubgzalhgwls ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938284.6204035-755-198552698679713/AnsiballZ_stat.py
Feb 01 09:31:24 np0005604215.localdomain sudo[255585]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:25 np0005604215.localdomain python3.9[255587]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:25 np0005604215.localdomain sudo[255585]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:25 np0005604215.localdomain sudo[255642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uzimvrlrhdzorjvingjezcwtznxlekly ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938284.6204035-755-198552698679713/AnsiballZ_file.py
Feb 01 09:31:25 np0005604215.localdomain sudo[255642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:25 np0005604215.localdomain python3.9[255644]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:25 np0005604215.localdomain sudo[255642]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:25 np0005604215.localdomain sudo[255752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uonustfuuejytbpldvuzmtdcfzhwyzrq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938285.6977143-755-10816862868096/AnsiballZ_stat.py
Feb 01 09:31:25 np0005604215.localdomain sudo[255752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:26 np0005604215.localdomain python3.9[255754]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:26 np0005604215.localdomain sudo[255752]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:26 np0005604215.localdomain sudo[255809]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rxzjflvvcsvjfrenzfllqyrzpbniebav ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938285.6977143-755-10816862868096/AnsiballZ_file.py
Feb 01 09:31:26 np0005604215.localdomain sudo[255809]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:26 np0005604215.localdomain python3.9[255811]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:26 np0005604215.localdomain sudo[255809]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:27 np0005604215.localdomain sudo[255919]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzrxsdrzqphiqrkgkofvehmbubzoqrwl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938287.6052575-824-245691497690550/AnsiballZ_file.py
Feb 01 09:31:27 np0005604215.localdomain sudo[255919]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:28 np0005604215.localdomain python3.9[255921]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:31:28 np0005604215.localdomain sudo[255919]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:28 np0005604215.localdomain sudo[256029]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcvjgtklzmljhitskbhoeqznrgiqtysw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938288.307913-849-264346487605878/AnsiballZ_stat.py
Feb 01 09:31:28 np0005604215.localdomain sudo[256029]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:28 np0005604215.localdomain python3.9[256031]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:28 np0005604215.localdomain sudo[256029]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:29 np0005604215.localdomain sudo[256086]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-byjsedtnephlwkhdymywshmivheuvuzl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938288.307913-849-264346487605878/AnsiballZ_file.py
Feb 01 09:31:29 np0005604215.localdomain sudo[256086]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:29 np0005604215.localdomain python3.9[256088]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:31:29 np0005604215.localdomain sudo[256086]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:31:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:31:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:31:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 144458 "" "Go-http-client/1.1"
Feb 01 09:31:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:31:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 15871 "" "Go-http-client/1.1"
Feb 01 09:31:30 np0005604215.localdomain sudo[256196]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cpcdwpzccilsbwyobkqrbfvtaugavyen ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938290.0262299-884-254845908597614/AnsiballZ_stat.py
Feb 01 09:31:30 np0005604215.localdomain sudo[256196]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:30 np0005604215.localdomain python3.9[256198]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:30 np0005604215.localdomain sudo[256196]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:30 np0005604215.localdomain sudo[256253]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mwxgfysxjixiylocdqexkjvtltirnfsu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938290.0262299-884-254845908597614/AnsiballZ_file.py
Feb 01 09:31:30 np0005604215.localdomain sudo[256253]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:30 np0005604215.localdomain python3.9[256255]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:31:30 np0005604215.localdomain sudo[256253]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:31:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:31:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:31:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:31:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:31:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:31:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:31:31 np0005604215.localdomain systemd[1]: tmp-crun.HP7Tjn.mount: Deactivated successfully.
Feb 01 09:31:31 np0005604215.localdomain podman[256281]: 2026-02-01 09:31:31.867698989 +0000 UTC m=+0.083120199 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:31:31 np0005604215.localdomain podman[256281]: 2026-02-01 09:31:31.903659344 +0000 UTC m=+0.119080524 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:31:31 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:31:32 np0005604215.localdomain sudo[256387]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ojyyosvxsqgzpkcxcyvlcvdqwcycssnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938291.7938905-921-40716422672793/AnsiballZ_systemd.py
Feb 01 09:31:32 np0005604215.localdomain sudo[256387]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:32 np0005604215.localdomain python3.9[256389]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:31:32 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:31:32 np0005604215.localdomain systemd-rc-local-generator[256410]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:31:32 np0005604215.localdomain systemd-sysv-generator[256414]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:31:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:31:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:32 np0005604215.localdomain sudo[256387]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18572 DF PROTO=TCP SPT=38724 DPT=9102 SEQ=758592592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66456510000000001030307) 
Feb 01 09:31:34 np0005604215.localdomain sudo[256534]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mvpoypasyknfpppiznklmwjwkmdplrew ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938294.1311483-944-111965318481507/AnsiballZ_stat.py
Feb 01 09:31:34 np0005604215.localdomain sudo[256534]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:34 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18573 DF PROTO=TCP SPT=38724 DPT=9102 SEQ=758592592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6645A4D0000000001030307) 
Feb 01 09:31:34 np0005604215.localdomain python3.9[256536]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:34 np0005604215.localdomain sudo[256534]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:34 np0005604215.localdomain sudo[256591]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tzyviavschkovafdubswlnczijmeuixi ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938294.1311483-944-111965318481507/AnsiballZ_file.py
Feb 01 09:31:34 np0005604215.localdomain sudo[256591]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:35 np0005604215.localdomain python3.9[256593]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:31:35 np0005604215.localdomain sudo[256591]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:35 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57335 DF PROTO=TCP SPT=45564 DPT=9102 SEQ=579589686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6645D0D0000000001030307) 
Feb 01 09:31:35 np0005604215.localdomain sudo[256701]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zmkusirqfofdcngwqrmpsrublalhuwzs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938295.4419115-980-66601482605896/AnsiballZ_stat.py
Feb 01 09:31:35 np0005604215.localdomain sudo[256701]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:35 np0005604215.localdomain python3.9[256703]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:35 np0005604215.localdomain sudo[256701]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:36 np0005604215.localdomain sudo[256758]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wriffamuretovdvlmozhsfkayplgozia ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938295.4419115-980-66601482605896/AnsiballZ_file.py
Feb 01 09:31:36 np0005604215.localdomain sudo[256758]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:36 np0005604215.localdomain python3.9[256760]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:31:36 np0005604215.localdomain sudo[256758]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18574 DF PROTO=TCP SPT=38724 DPT=9102 SEQ=758592592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA664624D0000000001030307) 
Feb 01 09:31:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:31:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:31:36 np0005604215.localdomain podman[256816]: 2026-02-01 09:31:36.833001015 +0000 UTC m=+0.084015727 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:31:36 np0005604215.localdomain podman[256816]: 2026-02-01 09:31:36.875370849 +0000 UTC m=+0.126385561 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:31:36 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:31:36 np0005604215.localdomain podman[256817]: 2026-02-01 09:31:36.896641499 +0000 UTC m=+0.143835378 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:31:36 np0005604215.localdomain podman[256817]: 2026-02-01 09:31:36.912738404 +0000 UTC m=+0.159932293 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:31:36 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:31:37 np0005604215.localdomain sudo[256916]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-axkeocfyjrqsqmocnsaghdolnbmpuith ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938296.6718755-1017-163014938793743/AnsiballZ_systemd.py
Feb 01 09:31:37 np0005604215.localdomain sudo[256916]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:37 np0005604215.localdomain python3.9[256918]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:31:37 np0005604215.localdomain systemd-sysv-generator[256946]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:31:37 np0005604215.localdomain systemd-rc-local-generator[256941]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: Starting Create netns directory...
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: netns-placeholder.service: Deactivated successfully.
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: Finished Create netns directory.
Feb 01 09:31:37 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=3000 DF PROTO=TCP SPT=58216 DPT=9102 SEQ=319385334 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA664670D0000000001030307) 
Feb 01 09:31:37 np0005604215.localdomain sudo[256916]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:37 np0005604215.localdomain systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Feb 01 09:31:39 np0005604215.localdomain sudo[257068]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gjwlpbcqfsswytidqabrqbssmkusmiee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938299.2343225-1047-128829518416660/AnsiballZ_file.py
Feb 01 09:31:39 np0005604215.localdomain sudo[257068]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:39 np0005604215.localdomain python3.9[257070]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/edpm-config recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:31:39 np0005604215.localdomain sudo[257068]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:40 np0005604215.localdomain sudo[257178]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-puxprzbhfgxuldcrmunynytddlzxzkeb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938299.997689-1070-198119033585008/AnsiballZ_file.py
Feb 01 09:31:40 np0005604215.localdomain sudo[257178]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:40 np0005604215.localdomain python3.9[257180]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:31:40 np0005604215.localdomain sudo[257178]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:40 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18575 DF PROTO=TCP SPT=38724 DPT=9102 SEQ=758592592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA664720D0000000001030307) 
Feb 01 09:31:41 np0005604215.localdomain sudo[257288]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mbbtkqpqnpmayhxcdqletmdnaswhvxtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938300.780264-1094-58194158394229/AnsiballZ_stat.py
Feb 01 09:31:41 np0005604215.localdomain sudo[257288]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:41 np0005604215.localdomain python3.9[257290]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/neutron_dhcp_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:41 np0005604215.localdomain sudo[257288]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:41 np0005604215.localdomain sudo[257376]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hpjmftrcenrctirvrofdfakzhqbiskhf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938300.780264-1094-58194158394229/AnsiballZ_copy.py
Feb 01 09:31:41 np0005604215.localdomain sudo[257376]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:31:41.748 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:31:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:31:41.749 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:31:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:31:41.750 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:31:41 np0005604215.localdomain python3.9[257378]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/neutron_dhcp_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1769938300.780264-1094-58194158394229/.source.json _original_basename=.2qojcxw3 follow=False checksum=c62829c98c0f9e788d62f52aa71fba276cd98270 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:31:41 np0005604215.localdomain sudo[257376]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:42 np0005604215.localdomain python3.9[257486]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/neutron_dhcp state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:31:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:31:44 np0005604215.localdomain podman[257698]: 2026-02-01 09:31:44.880954521 +0000 UTC m=+0.092249210 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:31:44 np0005604215.localdomain podman[257698]: 2026-02-01 09:31:44.89072841 +0000 UTC m=+0.102023109 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:31:44 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:31:45 np0005604215.localdomain sudo[257807]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-yhhxidnqxtjypzujanwaybgmimhxkzdx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938305.6279993-1214-95999152546704/AnsiballZ_container_config_data.py
Feb 01 09:31:45 np0005604215.localdomain sudo[257807]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:46 np0005604215.localdomain python3.9[257809]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_pattern=*.json debug=False
Feb 01 09:31:46 np0005604215.localdomain sudo[257807]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:46 np0005604215.localdomain sudo[257917]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tstigocmofxkmllzcqtayhremtxmkmeq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938306.586906-1247-72291982982682/AnsiballZ_container_config_hash.py
Feb 01 09:31:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:31:47 np0005604215.localdomain sudo[257917]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:47 np0005604215.localdomain podman[257919]: 2026-02-01 09:31:47.091588331 +0000 UTC m=+0.073601209 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, release=1769056855, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 01 09:31:47 np0005604215.localdomain podman[257919]: 2026-02-01 09:31:47.104336517 +0000 UTC m=+0.086349425 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, io.openshift.expose-services=, managed_by=edpm_ansible, distribution-scope=public, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, architecture=x86_64, build-date=2026-01-22T05:09:47Z, release=1769056855)
Feb 01 09:31:47 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:31:47 np0005604215.localdomain python3.9[257920]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:31:47 np0005604215.localdomain sudo[257917]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:48 np0005604215.localdomain sudo[258047]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vdqdnnzpduvyydeuqftjlmhecgkzcktk ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769938307.6226814-1277-230708390048053/AnsiballZ_edpm_container_manage.py
Feb 01 09:31:48 np0005604215.localdomain sudo[258047]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:48 np0005604215.localdomain python3[258049]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/neutron_dhcp config_id=neutron_dhcp config_overrides={} config_patterns=*.json containers=['neutron_dhcp_agent'] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:31:48 np0005604215.localdomain podman[258086]: 
Feb 01 09:31:48 np0005604215.localdomain podman[258086]: 2026-02-01 09:31:48.615399158 +0000 UTC m=+0.071499267 container create b750db517c59365e833262da7c272dcf9cf70ce4eee90892de4d10b758378d2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-8e84beda5eab58443510a4045d90bb0c7d6ea956fd7905f3c2809e1ba1dc4a23'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, container_name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, config_id=neutron_dhcp, managed_by=edpm_ansible)
Feb 01 09:31:48 np0005604215.localdomain podman[258086]: 2026-02-01 09:31:48.574610161 +0000 UTC m=+0.030710380 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:31:48 np0005604215.localdomain python3[258049]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name neutron_dhcp_agent --cgroupns=host --conmon-pidfile /run/neutron_dhcp_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-8e84beda5eab58443510a4045d90bb0c7d6ea956fd7905f3c2809e1ba1dc4a23 --label config_id=neutron_dhcp --label container_name=neutron_dhcp_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-8e84beda5eab58443510a4045d90bb0c7d6ea956fd7905f3c2809e1ba1dc4a23'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/netns:/run/netns:shared --volume /var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /run/openvswitch:/run/openvswitch:shared,z --volume 
/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:31:48 np0005604215.localdomain sudo[258047]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18576 DF PROTO=TCP SPT=38724 DPT=9102 SEQ=758592592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA664930D0000000001030307) 
Feb 01 09:31:49 np0005604215.localdomain sudo[258231]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vaiupykfqxaubaybuwhuzidokgwoblwc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938309.0682833-1301-271464258435438/AnsiballZ_stat.py
Feb 01 09:31:49 np0005604215.localdomain sudo[258231]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:49 np0005604215.localdomain python3.9[258233]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:31:49 np0005604215.localdomain sudo[258231]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:50 np0005604215.localdomain sudo[258343]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-traawrbkmjlkqckaeijnjyxqmhaapyxa ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938309.7939208-1328-17758190246339/AnsiballZ_file.py
Feb 01 09:31:50 np0005604215.localdomain sudo[258343]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:50 np0005604215.localdomain python3.9[258345]: ansible-file Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:31:50 np0005604215.localdomain sudo[258343]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:50 np0005604215.localdomain sudo[258398]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltiykmgnqydlwivdogcyuodyuifvdkoy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938309.7939208-1328-17758190246339/AnsiballZ_stat.py
Feb 01 09:31:50 np0005604215.localdomain sudo[258398]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:50 np0005604215.localdomain python3.9[258400]: ansible-stat Invoked with path=/etc/systemd/system/edpm_neutron_dhcp_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:31:50 np0005604215.localdomain sudo[258398]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:51 np0005604215.localdomain sudo[258507]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rlhodfbfmnzcdygzynlxzotmmnymofym ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938310.8014548-1328-3997270296890/AnsiballZ_copy.py
Feb 01 09:31:51 np0005604215.localdomain sudo[258507]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:31:51 np0005604215.localdomain systemd[1]: tmp-crun.LMjYxa.mount: Deactivated successfully.
Feb 01 09:31:51 np0005604215.localdomain podman[258510]: 2026-02-01 09:31:51.395495516 +0000 UTC m=+0.091275561 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Feb 01 09:31:51 np0005604215.localdomain podman[258510]: 2026-02-01 09:31:51.430470421 +0000 UTC m=+0.126250416 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:31:51 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:31:51 np0005604215.localdomain python3.9[258509]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769938310.8014548-1328-3997270296890/source dest=/etc/systemd/system/edpm_neutron_dhcp_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:31:51 np0005604215.localdomain sudo[258507]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:51 np0005604215.localdomain sudo[258580]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tghxqjgdhzcymxlltmyirkhbohpaipvg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938310.8014548-1328-3997270296890/AnsiballZ_systemd.py
Feb 01 09:31:51 np0005604215.localdomain sudo[258580]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:52 np0005604215.localdomain python3.9[258582]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:31:52 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:31:52 np0005604215.localdomain systemd-rc-local-generator[258604]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:31:52 np0005604215.localdomain systemd-sysv-generator[258612]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:31:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:31:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:52 np0005604215.localdomain sudo[258580]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:52 np0005604215.localdomain sudo[258671]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgfckoectrlzktnxztrfrxwycbyyerho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938310.8014548-1328-3997270296890/AnsiballZ_systemd.py
Feb 01 09:31:52 np0005604215.localdomain sudo[258671]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:53 np0005604215.localdomain python3.9[258673]: ansible-systemd Invoked with state=restarted name=edpm_neutron_dhcp_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:31:53 np0005604215.localdomain systemd-sysv-generator[258704]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:31:53 np0005604215.localdomain systemd-rc-local-generator[258698]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:31:53 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a989f60b3a2080a5252e7eea19f705cae8c281a273a318f9e5a90544c50aa0b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 01 09:31:53 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a989f60b3a2080a5252e7eea19f705cae8c281a273a318f9e5a90544c50aa0b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:31:53 np0005604215.localdomain podman[258713]: 2026-02-01 09:31:53.702772935 +0000 UTC m=+0.138106228 container init b750db517c59365e833262da7c272dcf9cf70ce4eee90892de4d10b758378d2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-8e84beda5eab58443510a4045d90bb0c7d6ea956fd7905f3c2809e1ba1dc4a23'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=neutron_dhcp, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=neutron_dhcp_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 01 09:31:53 np0005604215.localdomain podman[258713]: 2026-02-01 09:31:53.713357848 +0000 UTC m=+0.148691141 container start b750db517c59365e833262da7c272dcf9cf70ce4eee90892de4d10b758378d2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-8e84beda5eab58443510a4045d90bb0c7d6ea956fd7905f3c2809e1ba1dc4a23'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=neutron_dhcp, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:31:53 np0005604215.localdomain podman[258713]: neutron_dhcp_agent
Feb 01 09:31:53 np0005604215.localdomain systemd[1]: Started neutron_dhcp_agent container.
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: + sudo -E kolla_set_configs
Feb 01 09:31:53 np0005604215.localdomain sudo[258671]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Validating config file
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Copying service configuration files
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Writing out command to execute
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: ++ cat /run_command
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: + CMD=/usr/bin/neutron-dhcp-agent
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: + ARGS=
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: + sudo kolla_copy_cacerts
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: + [[ ! -n '' ]]
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: + . kolla_extend_start
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: Running command: '/usr/bin/neutron-dhcp-agent'
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: + umask 0022
Feb 01 09:31:53 np0005604215.localdomain neutron_dhcp_agent[258727]: + exec /usr/bin/neutron-dhcp-agent
Feb 01 09:31:54 np0005604215.localdomain python3.9[258849]: ansible-ansible.builtin.slurp Invoked with src=/var/lib/edpm-config/deployed_services.yaml
Feb 01 09:31:55 np0005604215.localdomain neutron_dhcp_agent[258727]: 2026-02-01 09:31:55.056 258731 INFO neutron.common.config [-] Logging enabled!
Feb 01 09:31:55 np0005604215.localdomain neutron_dhcp_agent[258727]: 2026-02-01 09:31:55.056 258731 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44
Feb 01 09:31:55 np0005604215.localdomain neutron_dhcp_agent[258727]: 2026-02-01 09:31:55.417 258731 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Feb 01 09:31:55 np0005604215.localdomain neutron_dhcp_agent[258727]: 2026-02-01 09:31:55.924 258731 INFO neutron.agent.dhcp.agent [None req-6c0f86bd-e9af-4b61-b1b2-4e920b9c8b83 - - - - - -] All active networks have been fetched through RPC.
Feb 01 09:31:55 np0005604215.localdomain neutron_dhcp_agent[258727]: 2026-02-01 09:31:55.924 258731 INFO neutron.agent.dhcp.agent [None req-6c0f86bd-e9af-4b61-b1b2-4e920b9c8b83 - - - - - -] Synchronizing state complete
Feb 01 09:31:56 np0005604215.localdomain neutron_dhcp_agent[258727]: 2026-02-01 09:31:56.028 258731 INFO neutron.agent.dhcp.agent [None req-6c0f86bd-e9af-4b61-b1b2-4e920b9c8b83 - - - - - -] DHCP agent started
Feb 01 09:31:56 np0005604215.localdomain sudo[258958]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqfqozosyqukiszskxczyqvfmsjeezvt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938315.8374841-1463-238373792297138/AnsiballZ_stat.py
Feb 01 09:31:56 np0005604215.localdomain sudo[258958]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:56 np0005604215.localdomain python3.9[258960]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/deployed_services.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:31:56 np0005604215.localdomain sudo[258958]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:31:56.439 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:31:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:31:56.440 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:31:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:31:56.441 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:31:56 np0005604215.localdomain sudo[259048]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jipgozgyuiullelxnntrkfmwrrrcuyfq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938315.8374841-1463-238373792297138/AnsiballZ_copy.py
Feb 01 09:31:56 np0005604215.localdomain sudo[259048]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:57 np0005604215.localdomain python3.9[259050]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/deployed_services.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1769938315.8374841-1463-238373792297138/.source.yaml _original_basename=.m90oye6s follow=False checksum=552a83c15bca59d2cd0078e31025ce01db8bbba5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:31:57 np0005604215.localdomain sudo[259048]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:58 np0005604215.localdomain sudo[259158]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxhatipchzksfxdbxlewsrlffyevbqqp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938317.8564227-1508-83021107184878/AnsiballZ_systemd.py
Feb 01 09:31:58 np0005604215.localdomain sudo[259158]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:31:58 np0005604215.localdomain python3.9[259160]: ansible-ansible.builtin.systemd Invoked with name=edpm_neutron_dhcp_agent.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:31:58 np0005604215.localdomain systemd[1]: Stopping neutron_dhcp_agent container...
Feb 01 09:31:58 np0005604215.localdomain systemd[1]: libpod-b750db517c59365e833262da7c272dcf9cf70ce4eee90892de4d10b758378d2d.scope: Deactivated successfully.
Feb 01 09:31:58 np0005604215.localdomain systemd[1]: libpod-b750db517c59365e833262da7c272dcf9cf70ce4eee90892de4d10b758378d2d.scope: Consumed 1.773s CPU time.
Feb 01 09:31:58 np0005604215.localdomain podman[259164]: 2026-02-01 09:31:58.578702305 +0000 UTC m=+0.073640119 container died b750db517c59365e833262da7c272dcf9cf70ce4eee90892de4d10b758378d2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=neutron_dhcp_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=neutron_dhcp, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-8e84beda5eab58443510a4045d90bb0c7d6ea956fd7905f3c2809e1ba1dc4a23'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:31:58 np0005604215.localdomain podman[259164]: 2026-02-01 09:31:58.630085606 +0000 UTC m=+0.125023380 container cleanup b750db517c59365e833262da7c272dcf9cf70ce4eee90892de4d10b758378d2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-8e84beda5eab58443510a4045d90bb0c7d6ea956fd7905f3c2809e1ba1dc4a23'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=neutron_dhcp, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 01 09:31:58 np0005604215.localdomain podman[259164]: neutron_dhcp_agent
Feb 01 09:31:58 np0005604215.localdomain podman[259204]: error opening file `/run/crun/b750db517c59365e833262da7c272dcf9cf70ce4eee90892de4d10b758378d2d/status`: No such file or directory
Feb 01 09:31:58 np0005604215.localdomain podman[259190]: 2026-02-01 09:31:58.733681481 +0000 UTC m=+0.068723294 container cleanup b750db517c59365e833262da7c272dcf9cf70ce4eee90892de4d10b758378d2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-8e84beda5eab58443510a4045d90bb0c7d6ea956fd7905f3c2809e1ba1dc4a23'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=neutron_dhcp, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=neutron_dhcp_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 01 09:31:58 np0005604215.localdomain podman[259190]: neutron_dhcp_agent
Feb 01 09:31:58 np0005604215.localdomain systemd[1]: edpm_neutron_dhcp_agent.service: Deactivated successfully.
Feb 01 09:31:58 np0005604215.localdomain systemd[1]: Stopped neutron_dhcp_agent container.
Feb 01 09:31:58 np0005604215.localdomain systemd[1]: Starting neutron_dhcp_agent container...
Feb 01 09:31:58 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:31:58 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a989f60b3a2080a5252e7eea19f705cae8c281a273a318f9e5a90544c50aa0b/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Feb 01 09:31:58 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a989f60b3a2080a5252e7eea19f705cae8c281a273a318f9e5a90544c50aa0b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:31:58 np0005604215.localdomain podman[259206]: 2026-02-01 09:31:58.872037025 +0000 UTC m=+0.109553123 container init b750db517c59365e833262da7c272dcf9cf70ce4eee90892de4d10b758378d2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-8e84beda5eab58443510a4045d90bb0c7d6ea956fd7905f3c2809e1ba1dc4a23'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, config_id=neutron_dhcp, container_name=neutron_dhcp_agent)
Feb 01 09:31:58 np0005604215.localdomain podman[259206]: 2026-02-01 09:31:58.880724332 +0000 UTC m=+0.118240430 container start b750db517c59365e833262da7c272dcf9cf70ce4eee90892de4d10b758378d2d (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron_dhcp_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-8e84beda5eab58443510a4045d90bb0c7d6ea956fd7905f3c2809e1ba1dc4a23'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/netns:/run/netns:shared', '/var/lib/openstack/neutron-dhcp-agent:/etc/neutron.conf.d:z', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/kolla/config_files/neutron_dhcp_agent.json:/var/lib/kolla/config_files/config.json:ro', '/run/openvswitch:/run/openvswitch:shared,z', '/var/lib/neutron/dhcp_agent_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/dhcp_agent_dnsmasq_wrapper:/usr/local/bin/dnsmasq:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-dhcp/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=neutron_dhcp, container_name=neutron_dhcp_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 01 09:31:58 np0005604215.localdomain podman[259206]: neutron_dhcp_agent
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: + sudo -E kolla_set_configs
Feb 01 09:31:58 np0005604215.localdomain systemd[1]: Started neutron_dhcp_agent container.
Feb 01 09:31:58 np0005604215.localdomain sudo[259158]: pam_unix(sudo:session): session closed for user root
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Validating config file
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Copying service configuration files
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Writing out command to execute
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/.cache
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/external
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/ns-metadata-proxy
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/metadata_proxy
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_haproxy_wrapper
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/dhcp_agent_dnsmasq_wrapper
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/dnsmasq-kill
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/b9146dd2a0dc3e0bc3fee7bb1b53fa22a55af280b3a177d7a47b63f92e7ebd29
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/.cache/python-entrypoints/d91d8a949a4b5272256c667b5094a15f5e397c6793efbfa4186752b765c6923b
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: ++ cat /run_command
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: + CMD=/usr/bin/neutron-dhcp-agent
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: + ARGS=
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: + sudo kolla_copy_cacerts
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: + [[ ! -n '' ]]
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: + . kolla_extend_start
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: Running command: '/usr/bin/neutron-dhcp-agent'
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: + echo 'Running command: '\''/usr/bin/neutron-dhcp-agent'\'''
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: + umask 0022
Feb 01 09:31:58 np0005604215.localdomain neutron_dhcp_agent[259221]: + exec /usr/bin/neutron-dhcp-agent
Feb 01 09:31:59 np0005604215.localdomain sshd[252106]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:31:59 np0005604215.localdomain systemd-logind[761]: Session 58 logged out. Waiting for processes to exit.
Feb 01 09:31:59 np0005604215.localdomain systemd[1]: session-58.scope: Deactivated successfully.
Feb 01 09:31:59 np0005604215.localdomain systemd[1]: session-58.scope: Consumed 34.604s CPU time.
Feb 01 09:31:59 np0005604215.localdomain systemd-logind[761]: Removed session 58.
Feb 01 09:32:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:32:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:32:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:32:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:32:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:32:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16310 "" "Go-http-client/1.1"
Feb 01 09:32:00 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:32:00.149 259225 INFO neutron.common.config [-] Logging enabled!
Feb 01 09:32:00 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:32:00.150 259225 INFO neutron.common.config [-] /usr/bin/neutron-dhcp-agent version 22.2.2.dev44
Feb 01 09:32:00 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:32:00.523 259225 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Feb 01 09:32:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:32:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:32:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:32:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:32:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:32:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:32:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:32:01.910 259225 INFO neutron.agent.dhcp.agent [None req-4946e334-211f-4f65-8db9-f55ae8c5f289 - - - - - -] All active networks have been fetched through RPC.
Feb 01 09:32:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:32:01.911 259225 INFO neutron.agent.dhcp.agent [None req-4946e334-211f-4f65-8db9-f55ae8c5f289 - - - - - -] Synchronizing state complete
Feb 01 09:32:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:32:01.938 259225 INFO neutron.agent.dhcp.agent [None req-4946e334-211f-4f65-8db9-f55ae8c5f289 - - - - - -] DHCP agent started
Feb 01 09:32:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:02.217 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:32:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:32:02 np0005604215.localdomain systemd[1]: tmp-crun.7r9AUN.mount: Deactivated successfully.
Feb 01 09:32:02 np0005604215.localdomain podman[259254]: 2026-02-01 09:32:02.869402311 +0000 UTC m=+0.084942543 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:32:02 np0005604215.localdomain podman[259254]: 2026-02-01 09:32:02.901781608 +0000 UTC m=+0.117321880 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:32:02 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:32:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:02.996 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:32:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29003 DF PROTO=TCP SPT=55632 DPT=9102 SEQ=1180852269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA664CB810000000001030307) 
Feb 01 09:32:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:03.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:32:03 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:03.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:32:04 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29004 DF PROTO=TCP SPT=55632 DPT=9102 SEQ=1180852269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA664CF8D0000000001030307) 
Feb 01 09:32:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:04.991 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:32:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:05.015 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:32:05 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18577 DF PROTO=TCP SPT=38724 DPT=9102 SEQ=758592592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA664D30E0000000001030307) 
Feb 01 09:32:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29005 DF PROTO=TCP SPT=55632 DPT=9102 SEQ=1180852269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA664D78E0000000001030307) 
Feb 01 09:32:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:06.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:32:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:06.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:32:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:06.996 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.026 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.027 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.027 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.027 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.028 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.471 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:32:07 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=57336 DF PROTO=TCP SPT=45564 DPT=9102 SEQ=579589686 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA664DB0E0000000001030307) 
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.633 225589 WARNING nova.virt.libvirt.driver [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.635 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12923MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.635 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.635 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.719 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.720 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:32:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:07.735 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:32:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:32:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:32:07 np0005604215.localdomain systemd[1]: tmp-crun.6reW7a.mount: Deactivated successfully.
Feb 01 09:32:07 np0005604215.localdomain podman[259300]: 2026-02-01 09:32:07.881361567 +0000 UTC m=+0.096844457 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:32:07 np0005604215.localdomain podman[259300]: 2026-02-01 09:32:07.924316528 +0000 UTC m=+0.139799468 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Feb 01 09:32:07 np0005604215.localdomain systemd[1]: tmp-crun.6FYcSA.mount: Deactivated successfully.
Feb 01 09:32:07 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:32:07 np0005604215.localdomain podman[259301]: 2026-02-01 09:32:07.929230194 +0000 UTC m=+0.141858729 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:32:08 np0005604215.localdomain podman[259301]: 2026-02-01 09:32:08.009503259 +0000 UTC m=+0.222131854 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:32:08 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:32:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:08.264 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.529s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:32:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:08.269 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:32:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:08.284 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:32:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:08.287 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:32:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:08.287 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.652s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:32:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:09.283 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:32:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:09.284 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:32:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:09.284 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:32:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:09.285 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:32:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:32:09.302 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:32:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29006 DF PROTO=TCP SPT=55632 DPT=9102 SEQ=1180852269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA664E74D0000000001030307) 
Feb 01 09:32:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:32:15 np0005604215.localdomain podman[259366]: 2026-02-01 09:32:15.862436895 +0000 UTC m=+0.079223285 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 01 09:32:15 np0005604215.localdomain podman[259366]: 2026-02-01 09:32:15.896785961 +0000 UTC m=+0.113572311 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:32:15 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:32:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:32:17 np0005604215.localdomain podman[259385]: 2026-02-01 09:32:17.875447248 +0000 UTC m=+0.076495826 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal)
Feb 01 09:32:17 np0005604215.localdomain podman[259385]: 2026-02-01 09:32:17.887125157 +0000 UTC m=+0.088173705 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1769056855, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:32:17 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:32:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29007 DF PROTO=TCP SPT=55632 DPT=9102 SEQ=1180852269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA665070D0000000001030307) 
Feb 01 09:32:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:32:21 np0005604215.localdomain podman[259406]: 2026-02-01 09:32:21.866371249 +0000 UTC m=+0.077644011 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:32:21 np0005604215.localdomain podman[259406]: 2026-02-01 09:32:21.901755759 +0000 UTC m=+0.113028511 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:32:21 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:32:23 np0005604215.localdomain sshd[259426]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:32:24 np0005604215.localdomain sshd[259426]: Invalid user recepcion from 85.206.171.113 port 40238
Feb 01 09:32:24 np0005604215.localdomain sshd[259426]: Received disconnect from 85.206.171.113 port 40238:11: Bye Bye [preauth]
Feb 01 09:32:24 np0005604215.localdomain sshd[259426]: Disconnected from invalid user recepcion 85.206.171.113 port 40238 [preauth]
Feb 01 09:32:24 np0005604215.localdomain sudo[259428]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:32:24 np0005604215.localdomain sudo[259428]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:32:24 np0005604215.localdomain sudo[259428]: pam_unix(sudo:session): session closed for user root
Feb 01 09:32:24 np0005604215.localdomain sudo[259446]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:32:24 np0005604215.localdomain sudo[259446]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:32:25 np0005604215.localdomain sudo[259446]: pam_unix(sudo:session): session closed for user root
Feb 01 09:32:27 np0005604215.localdomain sudo[259496]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:32:27 np0005604215.localdomain sudo[259496]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:32:27 np0005604215.localdomain sudo[259496]: pam_unix(sudo:session): session closed for user root
Feb 01 09:32:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:32:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:32:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:32:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:32:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:32:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16300 "" "Go-http-client/1.1"
Feb 01 09:32:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:32:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:32:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:32:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:32:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:32:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:32:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36026 DF PROTO=TCP SPT=49088 DPT=9102 SEQ=3408466566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66540B10000000001030307) 
Feb 01 09:32:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:32:33 np0005604215.localdomain podman[259514]: 2026-02-01 09:32:33.866003413 +0000 UTC m=+0.079993623 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:32:33 np0005604215.localdomain podman[259514]: 2026-02-01 09:32:33.873233405 +0000 UTC m=+0.087223595 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:32:33 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:32:34 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36027 DF PROTO=TCP SPT=49088 DPT=9102 SEQ=3408466566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66544CE0000000001030307) 
Feb 01 09:32:35 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29008 DF PROTO=TCP SPT=55632 DPT=9102 SEQ=1180852269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA665470D0000000001030307) 
Feb 01 09:32:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36028 DF PROTO=TCP SPT=49088 DPT=9102 SEQ=3408466566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6654CCE0000000001030307) 
Feb 01 09:32:37 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=18578 DF PROTO=TCP SPT=38724 DPT=9102 SEQ=758592592 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA665510D0000000001030307) 
Feb 01 09:32:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:32:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:32:38 np0005604215.localdomain podman[259538]: 2026-02-01 09:32:38.867332833 +0000 UTC m=+0.078602940 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:32:38 np0005604215.localdomain podman[259538]: 2026-02-01 09:32:38.877709463 +0000 UTC m=+0.088979640 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:32:38 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:32:38 np0005604215.localdomain podman[259537]: 2026-02-01 09:32:38.965331419 +0000 UTC m=+0.179890467 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:32:39 np0005604215.localdomain podman[259537]: 2026-02-01 09:32:39.074775838 +0000 UTC m=+0.289334906 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller)
Feb 01 09:32:39 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:32:40 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36029 DF PROTO=TCP SPT=49088 DPT=9102 SEQ=3408466566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6655C8D0000000001030307) 
Feb 01 09:32:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:32:41.749 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:32:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:32:41.749 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:32:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:32:41.750 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:32:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:32:46 np0005604215.localdomain podman[259584]: 2026-02-01 09:32:46.862338134 +0000 UTC m=+0.077653047 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 09:32:46 np0005604215.localdomain podman[259584]: 2026-02-01 09:32:46.892548406 +0000 UTC m=+0.107863259 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:32:46 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:32:48 np0005604215.localdomain sshd[259602]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:32:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:32:48 np0005604215.localdomain sshd[259602]: Accepted publickey for zuul from 192.168.122.30 port 43174 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:32:48 np0005604215.localdomain systemd-logind[761]: New session 59 of user zuul.
Feb 01 09:32:48 np0005604215.localdomain systemd[1]: Started Session 59 of User zuul.
Feb 01 09:32:48 np0005604215.localdomain sshd[259602]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:32:48 np0005604215.localdomain podman[259604]: 2026-02-01 09:32:48.744431231 +0000 UTC m=+0.077419560 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, release=1769056855, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, name=ubi9/ubi-minimal)
Feb 01 09:32:48 np0005604215.localdomain podman[259604]: 2026-02-01 09:32:48.760768856 +0000 UTC m=+0.093757225 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, release=1769056855, name=ubi9/ubi-minimal)
Feb 01 09:32:48 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:32:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36030 DF PROTO=TCP SPT=49088 DPT=9102 SEQ=3408466566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6657D0D0000000001030307) 
Feb 01 09:32:49 np0005604215.localdomain python3.9[259732]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:32:51 np0005604215.localdomain python3.9[259844]: ansible-ansible.builtin.service_facts Invoked
Feb 01 09:32:51 np0005604215.localdomain network[259861]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 09:32:51 np0005604215.localdomain network[259862]: 'network-scripts' will be removed from distribution in near future.
Feb 01 09:32:51 np0005604215.localdomain network[259863]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 09:32:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:32:52 np0005604215.localdomain systemd[1]: tmp-crun.fsdkc1.mount: Deactivated successfully.
Feb 01 09:32:52 np0005604215.localdomain podman[259869]: 2026-02-01 09:32:52.881922654 +0000 UTC m=+0.093299210 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true)
Feb 01 09:32:52 np0005604215.localdomain podman[259869]: 2026-02-01 09:32:52.919741771 +0000 UTC m=+0.131118397 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:32:52 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:32:53 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:32:58 np0005604215.localdomain sudo[260111]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pxioiojfavkzrxnobdjqvriyfwwhwtmm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938377.8732584-98-208624085864063/AnsiballZ_setup.py
Feb 01 09:32:58 np0005604215.localdomain sudo[260111]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:32:58 np0005604215.localdomain python3.9[260113]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Feb 01 09:32:58 np0005604215.localdomain sudo[260111]: pam_unix(sudo:session): session closed for user root
Feb 01 09:32:59 np0005604215.localdomain sudo[260174]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-baunfhxkueuklwdrplpkagqjjwqpvksp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938377.8732584-98-208624085864063/AnsiballZ_dnf.py
Feb 01 09:32:59 np0005604215.localdomain sudo[260174]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:32:59 np0005604215.localdomain python3.9[260176]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:33:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:33:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:33:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:33:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:33:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:33:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16315 "" "Go-http-client/1.1"
Feb 01 09:33:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:33:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:33:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:33:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:33:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:33:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:33:02 np0005604215.localdomain sudo[260174]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:02.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:33:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:02.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:33:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:33:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5544 DF PROTO=TCP SPT=34100 DPT=9102 SEQ=2814826323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA665B5E10000000001030307) 
Feb 01 09:33:03 np0005604215.localdomain sudo[260286]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-onuktbtrymgjedzklwnqrsnchjxdtegs ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938383.4721172-135-123345502906524/AnsiballZ_stat.py
Feb 01 09:33:03 np0005604215.localdomain sudo[260286]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:04 np0005604215.localdomain python3.9[260288]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:33:04 np0005604215.localdomain sudo[260286]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:04 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5545 DF PROTO=TCP SPT=34100 DPT=9102 SEQ=2814826323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA665B9CD0000000001030307) 
Feb 01 09:33:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:33:04 np0005604215.localdomain podman[260360]: 2026-02-01 09:33:04.868978548 +0000 UTC m=+0.080846276 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:33:04 np0005604215.localdomain podman[260360]: 2026-02-01 09:33:04.881622248 +0000 UTC m=+0.093489936 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:33:04 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:33:04 np0005604215.localdomain sudo[260419]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rrxtogeftshbxoxzxgzyawqmjpqizsde ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938384.484961-165-5343903906162/AnsiballZ_command.py
Feb 01 09:33:04 np0005604215.localdomain sudo[260419]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:04.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:33:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:04.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:33:05 np0005604215.localdomain python3.9[260421]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:33:05 np0005604215.localdomain sudo[260419]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:05 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36031 DF PROTO=TCP SPT=49088 DPT=9102 SEQ=3408466566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA665BD0D0000000001030307) 
Feb 01 09:33:05 np0005604215.localdomain sudo[260530]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tcdjucvsyfbueavbnuohdzfzntobofwv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938385.5168946-194-182377616505599/AnsiballZ_stat.py
Feb 01 09:33:05 np0005604215.localdomain sudo[260530]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:05.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:33:05 np0005604215.localdomain python3.9[260532]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:33:06 np0005604215.localdomain sudo[260530]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5546 DF PROTO=TCP SPT=34100 DPT=9102 SEQ=2814826323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA665C1CD0000000001030307) 
Feb 01 09:33:06 np0005604215.localdomain sudo[260642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ionlhwpvmaaxvufszztvooqgdvihogrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938386.5239732-227-185434765412790/AnsiballZ_lineinfile.py
Feb 01 09:33:06 np0005604215.localdomain sudo[260642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:06.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:33:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:06.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.016 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.016 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.016 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.017 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.017 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:33:07 np0005604215.localdomain python3.9[260644]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:33:07 np0005604215.localdomain sudo[260642]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:07 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29009 DF PROTO=TCP SPT=55632 DPT=9102 SEQ=1180852269 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA665C50D0000000001030307) 
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.482 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.670 225589 WARNING nova.virt.libvirt.driver [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.672 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12912MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.673 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.674 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.750 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.751 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:33:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:07.770 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:33:08 np0005604215.localdomain sudo[260794]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-giqezwshungelgizxqbnsozsdejwquwj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938387.4942176-255-200037424987829/AnsiballZ_systemd_service.py
Feb 01 09:33:08 np0005604215.localdomain sudo[260794]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:08.234 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.463s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:33:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:08.240 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:33:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:08.256 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:33:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:08.258 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:33:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:08.259 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.585s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:33:08 np0005604215.localdomain python3.9[260796]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:33:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:33:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:33:09 np0005604215.localdomain sudo[260794]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:09 np0005604215.localdomain podman[260802]: 2026-02-01 09:33:09.522651737 +0000 UTC m=+0.075232102 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:33:09 np0005604215.localdomain podman[260802]: 2026-02-01 09:33:09.534586525 +0000 UTC m=+0.087166810 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:33:09 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:33:09 np0005604215.localdomain podman[260801]: 2026-02-01 09:33:09.581248155 +0000 UTC m=+0.137644237 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Feb 01 09:33:09 np0005604215.localdomain podman[260801]: 2026-02-01 09:33:09.640845344 +0000 UTC m=+0.197241426 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:33:09 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:33:10 np0005604215.localdomain sudo[260956]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-aczkgurftmtweyjyxefkedrdccwoiihr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938389.752051-278-15630373228227/AnsiballZ_systemd_service.py
Feb 01 09:33:10 np0005604215.localdomain sudo[260956]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:10.256 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:33:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:10.257 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:33:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:10.258 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:33:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:10.258 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:33:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:10.276 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:33:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:33:10.276 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:33:10 np0005604215.localdomain python3.9[260958]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:33:10 np0005604215.localdomain sudo[260956]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5547 DF PROTO=TCP SPT=34100 DPT=9102 SEQ=2814826323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA665D18E0000000001030307) 
Feb 01 09:33:12 np0005604215.localdomain python3.9[261068]: ansible-ansible.builtin.service_facts Invoked
Feb 01 09:33:12 np0005604215.localdomain network[261085]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 09:33:12 np0005604215.localdomain network[261086]: 'network-scripts' will be removed from distribution in near future.
Feb 01 09:33:12 np0005604215.localdomain network[261087]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 09:33:14 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:33:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:33:17 np0005604215.localdomain podman[261200]: 2026-02-01 09:33:17.864954299 +0000 UTC m=+0.079445793 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3)
Feb 01 09:33:17 np0005604215.localdomain podman[261200]: 2026-02-01 09:33:17.873646437 +0000 UTC m=+0.088137971 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 01 09:33:17 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:33:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5548 DF PROTO=TCP SPT=34100 DPT=9102 SEQ=2814826323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA665F10D0000000001030307) 
Feb 01 09:33:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:33:19 np0005604215.localdomain podman[261226]: 2026-02-01 09:33:19.071806724 +0000 UTC m=+0.079786783 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 09:33:19 np0005604215.localdomain podman[261226]: 2026-02-01 09:33:19.084910428 +0000 UTC m=+0.092890467 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, version=9.7, architecture=x86_64, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible)
Feb 01 09:33:19 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:33:20 np0005604215.localdomain sudo[261357]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-wuzzujfbkxghxumaddkjbfaaurtltvft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938399.568118-347-62307399037856/AnsiballZ_dnf.py
Feb 01 09:33:20 np0005604215.localdomain sudo[261357]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:20 np0005604215.localdomain python3.9[261359]: ansible-ansible.legacy.dnf Invoked with name=['device-mapper-multipath'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:33:23 np0005604215.localdomain sudo[261357]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:33:23 np0005604215.localdomain podman[261379]: 2026-02-01 09:33:23.865796512 +0000 UTC m=+0.081436784 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute)
Feb 01 09:33:23 np0005604215.localdomain podman[261379]: 2026-02-01 09:33:23.876312676 +0000 UTC m=+0.091952988 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 01 09:33:23 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:33:25 np0005604215.localdomain sudo[261488]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kuqzijqfpdjmxzxrcvixxvovqecfnotn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938404.148151-375-47436287958211/AnsiballZ_file.py
Feb 01 09:33:25 np0005604215.localdomain sudo[261488]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:25 np0005604215.localdomain python3.9[261490]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 09:33:25 np0005604215.localdomain sudo[261488]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:26 np0005604215.localdomain sudo[261598]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-zgzketkkxgrwooowpeocioailjtfnnwk ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938405.691831-398-82614419924329/AnsiballZ_modprobe.py
Feb 01 09:33:26 np0005604215.localdomain sudo[261598]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:26 np0005604215.localdomain python3.9[261600]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Feb 01 09:33:26 np0005604215.localdomain sudo[261598]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:26 np0005604215.localdomain sudo[261708]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-htzefisvhfwvhpjyontvcuwzpvsdkbif ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938406.5587728-423-701232961335/AnsiballZ_stat.py
Feb 01 09:33:26 np0005604215.localdomain sudo[261708]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:27 np0005604215.localdomain python3.9[261710]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:33:27 np0005604215.localdomain sudo[261708]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:27 np0005604215.localdomain sudo[261765]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dmpufxsuugsqxsrzeutbbpbjfdmpctvo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938406.5587728-423-701232961335/AnsiballZ_file.py
Feb 01 09:33:27 np0005604215.localdomain sudo[261765]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:27 np0005604215.localdomain python3.9[261767]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/dm-multipath.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/dm-multipath.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:33:27 np0005604215.localdomain sudo[261765]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:28 np0005604215.localdomain sudo[261823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:33:28 np0005604215.localdomain sudo[261823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:33:28 np0005604215.localdomain sudo[261823]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:28 np0005604215.localdomain sudo[261861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:33:28 np0005604215.localdomain sudo[261861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:33:28 np0005604215.localdomain sudo[261911]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-enpiaopvdqcvumzkxhzgtljoazepeozx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938407.9051352-461-131777177668408/AnsiballZ_lineinfile.py
Feb 01 09:33:28 np0005604215.localdomain sudo[261911]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:28 np0005604215.localdomain python3.9[261913]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:33:28 np0005604215.localdomain sudo[261911]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:28 np0005604215.localdomain sudo[261861]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:28 np0005604215.localdomain sudo[262053]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xthakdfjuosefvoixxftoxdnogzpsmdf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938408.6865044-489-218727067599879/AnsiballZ_command.py
Feb 01 09:33:28 np0005604215.localdomain sudo[262053]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:29 np0005604215.localdomain python3.9[262055]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:33:29 np0005604215.localdomain sudo[262053]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:29 np0005604215.localdomain sudo[262164]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxzfxyhicagklzisvvhtcroqlrqpmdsn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938409.4768605-512-161103564895540/AnsiballZ_command.py
Feb 01 09:33:29 np0005604215.localdomain sudo[262164]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:29 np0005604215.localdomain python3.9[262166]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -rF /etc/multipath _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:33:29 np0005604215.localdomain sudo[262164]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:33:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:33:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:33:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:33:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:33:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16316 "" "Go-http-client/1.1"
Feb 01 09:33:30 np0005604215.localdomain sudo[262186]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:33:30 np0005604215.localdomain sudo[262186]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:33:30 np0005604215.localdomain sudo[262186]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:30 np0005604215.localdomain sudo[262293]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rqwnjdydcahpbvooozkyhztzpdefasrj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938410.3178883-540-268912529150853/AnsiballZ_stat.py
Feb 01 09:33:30 np0005604215.localdomain sudo[262293]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:30 np0005604215.localdomain python3.9[262295]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:33:30 np0005604215.localdomain sudo[262293]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:33:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:33:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:33:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:33:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:33:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:33:32 np0005604215.localdomain sudo[262405]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ionkuwsmxntjrmjcdtjpiayadrvpyocy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938411.853525-570-16826534936887/AnsiballZ_command.py
Feb 01 09:33:32 np0005604215.localdomain sudo[262405]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:32 np0005604215.localdomain python3.9[262407]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:33:32 np0005604215.localdomain sudo[262405]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:33 np0005604215.localdomain sudo[262516]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vtgcwjxmwozosczdxeimbqvhpaddwrqv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938412.7031016-600-162007529201460/AnsiballZ_replace.py
Feb 01 09:33:33 np0005604215.localdomain sudo[262516]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:33 np0005604215.localdomain python3.9[262518]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:33:33 np0005604215.localdomain sudo[262516]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45624 DF PROTO=TCP SPT=60988 DPT=9102 SEQ=3044147456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6662B110000000001030307) 
Feb 01 09:33:33 np0005604215.localdomain sudo[262626]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-iuzvmhaawpifxcvgipzvcjkswgnsbzkn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938413.6716468-627-95526068589695/AnsiballZ_lineinfile.py
Feb 01 09:33:33 np0005604215.localdomain sudo[262626]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:34 np0005604215.localdomain python3.9[262628]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:33:34 np0005604215.localdomain sudo[262626]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:34 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45625 DF PROTO=TCP SPT=60988 DPT=9102 SEQ=3044147456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6662F0D0000000001030307) 
Feb 01 09:33:34 np0005604215.localdomain sudo[262736]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-szkvlstssrdvwuzvyrmwxliualwtvwbm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938414.3282757-627-33552543996311/AnsiballZ_lineinfile.py
Feb 01 09:33:34 np0005604215.localdomain sudo[262736]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:34 np0005604215.localdomain python3.9[262738]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:33:34 np0005604215.localdomain sudo[262736]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:35 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5549 DF PROTO=TCP SPT=34100 DPT=9102 SEQ=2814826323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA666310D0000000001030307) 
Feb 01 09:33:35 np0005604215.localdomain sudo[262846]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qiuykjlrbhvbvkrwgxvxgowvkvwlwwhq ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938414.9539254-627-52105363595007/AnsiballZ_lineinfile.py
Feb 01 09:33:35 np0005604215.localdomain sudo[262846]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:33:35 np0005604215.localdomain podman[262849]: 2026-02-01 09:33:35.366778228 +0000 UTC m=+0.083610850 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:33:35 np0005604215.localdomain podman[262849]: 2026-02-01 09:33:35.404767771 +0000 UTC m=+0.121600393 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:33:35 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:33:35 np0005604215.localdomain python3.9[262848]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:33:35 np0005604215.localdomain sudo[262846]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:35 np0005604215.localdomain sudo[262979]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mynruqpugdwphmqxideqezesdlsrrhre ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938415.6261492-627-195123480148243/AnsiballZ_lineinfile.py
Feb 01 09:33:35 np0005604215.localdomain sudo[262979]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:36 np0005604215.localdomain python3.9[262981]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:33:36 np0005604215.localdomain sudo[262979]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45626 DF PROTO=TCP SPT=60988 DPT=9102 SEQ=3044147456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA666370D0000000001030307) 
Feb 01 09:33:37 np0005604215.localdomain sudo[263089]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gfjnaxzaqtolgibexvrhedobdciqaerx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938417.0707152-714-277994986573428/AnsiballZ_stat.py
Feb 01 09:33:37 np0005604215.localdomain sudo[263089]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:37 np0005604215.localdomain python3.9[263091]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:33:37 np0005604215.localdomain sudo[263089]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:37 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=36032 DF PROTO=TCP SPT=49088 DPT=9102 SEQ=3408466566 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6663B0E0000000001030307) 
Feb 01 09:33:38 np0005604215.localdomain sudo[263201]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bhftltgpzavbkmdiobhluxcsqhzsxmrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938418.0077546-743-236824643521996/AnsiballZ_systemd_service.py
Feb 01 09:33:38 np0005604215.localdomain sudo[263201]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:38 np0005604215.localdomain python3.9[263203]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:33:38 np0005604215.localdomain sudo[263201]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:39 np0005604215.localdomain sudo[263313]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvlmuvzbanotpehvehczqgjwdscqtpxo ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938419.4006062-767-26877305275810/AnsiballZ_systemd_service.py
Feb 01 09:33:39 np0005604215.localdomain sudo[263313]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:33:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:33:39 np0005604215.localdomain podman[263316]: 2026-02-01 09:33:39.814363259 +0000 UTC m=+0.080942069 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:33:39 np0005604215.localdomain systemd[1]: tmp-crun.z8B5vx.mount: Deactivated successfully.
Feb 01 09:33:39 np0005604215.localdomain podman[263317]: 2026-02-01 09:33:39.901940431 +0000 UTC m=+0.163378022 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:33:39 np0005604215.localdomain podman[263316]: 2026-02-01 09:33:39.905573992 +0000 UTC m=+0.172152782 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:33:39 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:33:39 np0005604215.localdomain podman[263317]: 2026-02-01 09:33:39.964131349 +0000 UTC m=+0.225569000 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:33:39 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:33:40 np0005604215.localdomain python3.9[263315]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=multipathd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:33:40 np0005604215.localdomain sudo[263313]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:40 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45627 DF PROTO=TCP SPT=60988 DPT=9102 SEQ=3044147456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66646CD0000000001030307) 
Feb 01 09:33:40 np0005604215.localdomain sudo[263473]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucmrvhcznesyesjqxtrlhmwffwsrkajx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938420.6043477-804-42698226796183/AnsiballZ_file.py
Feb 01 09:33:40 np0005604215.localdomain sudo[263473]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:41 np0005604215.localdomain python3.9[263475]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Feb 01 09:33:41 np0005604215.localdomain sudo[263473]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:41 np0005604215.localdomain sudo[263583]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ivtmlrybitsftjriooikpbeydfguaxwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938421.3518548-828-204743969668381/AnsiballZ_modprobe.py
Feb 01 09:33:41 np0005604215.localdomain sudo[263583]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:33:41.750 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:33:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:33:41.750 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:33:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:33:41.750 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:33:41 np0005604215.localdomain python3.9[263585]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Feb 01 09:33:41 np0005604215.localdomain sudo[263583]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:42 np0005604215.localdomain sudo[263693]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xwlefqflzxvneiicwrdmsabpgpbpsuho ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938422.104365-852-212271060510123/AnsiballZ_stat.py
Feb 01 09:33:42 np0005604215.localdomain sudo[263693]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:42 np0005604215.localdomain python3.9[263695]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:33:42 np0005604215.localdomain sudo[263693]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:42 np0005604215.localdomain sudo[263750]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-powfhtjghrnmyyamxasjliegsylhwrtn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938422.104365-852-212271060510123/AnsiballZ_file.py
Feb 01 09:33:42 np0005604215.localdomain sudo[263750]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:43 np0005604215.localdomain python3.9[263752]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/modules-load.d/nvme-fabrics.conf _original_basename=module-load.conf.j2 recurse=False state=file path=/etc/modules-load.d/nvme-fabrics.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:33:43 np0005604215.localdomain sudo[263750]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:43 np0005604215.localdomain sudo[263860]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rtvgeqbegsbhfuffnvqvtefjnghullyv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938423.3981385-891-202339616851406/AnsiballZ_lineinfile.py
Feb 01 09:33:43 np0005604215.localdomain sudo[263860]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:43 np0005604215.localdomain python3.9[263862]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:33:43 np0005604215.localdomain sudo[263860]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:44 np0005604215.localdomain sudo[263970]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fitmitrswtzdcwofzjtisrminrtmkxdu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938424.3767068-918-140861947295269/AnsiballZ_dnf.py
Feb 01 09:33:44 np0005604215.localdomain sudo[263970]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:44 np0005604215.localdomain python3.9[263972]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Feb 01 09:33:46 np0005604215.localdomain sshd[263975]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:33:47 np0005604215.localdomain sshd[263975]: Invalid user test001 from 85.206.171.113 port 39846
Feb 01 09:33:47 np0005604215.localdomain sshd[263975]: Received disconnect from 85.206.171.113 port 39846:11: Bye Bye [preauth]
Feb 01 09:33:47 np0005604215.localdomain sshd[263975]: Disconnected from invalid user test001 85.206.171.113 port 39846 [preauth]
Feb 01 09:33:48 np0005604215.localdomain sudo[263970]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:33:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45628 DF PROTO=TCP SPT=60988 DPT=9102 SEQ=3044147456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA666670D0000000001030307) 
Feb 01 09:33:48 np0005604215.localdomain systemd[1]: tmp-crun.vyuohI.mount: Deactivated successfully.
Feb 01 09:33:48 np0005604215.localdomain podman[264079]: 2026-02-01 09:33:48.875227031 +0000 UTC m=+0.087323895 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 09:33:48 np0005604215.localdomain podman[264079]: 2026-02-01 09:33:48.881785584 +0000 UTC m=+0.093882458 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 01 09:33:48 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:33:49 np0005604215.localdomain python3.9[264090]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Feb 01 09:33:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:33:49 np0005604215.localdomain podman[264159]: 2026-02-01 09:33:49.86959421 +0000 UTC m=+0.084162258 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, name=ubi9/ubi-minimal)
Feb 01 09:33:49 np0005604215.localdomain podman[264159]: 2026-02-01 09:33:49.887569145 +0000 UTC m=+0.102137153 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Feb 01 09:33:49 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:33:50 np0005604215.localdomain sudo[264236]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fusqoffpcasgcddjcnrppttijkxftdnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938429.7275338-970-213478213291158/AnsiballZ_file.py
Feb 01 09:33:50 np0005604215.localdomain sudo[264236]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:50 np0005604215.localdomain python3.9[264238]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:33:50 np0005604215.localdomain sudo[264236]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:51 np0005604215.localdomain sudo[264346]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-voyytiwcuwfueivvsmbbnxntiggkshxe ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938431.3247712-1003-98886430389081/AnsiballZ_systemd_service.py
Feb 01 09:33:51 np0005604215.localdomain sudo[264346]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:51 np0005604215.localdomain python3.9[264348]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:33:51 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:33:52 np0005604215.localdomain systemd-sysv-generator[264378]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:33:52 np0005604215.localdomain systemd-rc-local-generator[264373]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:33:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:33:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:33:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:33:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:33:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:33:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:33:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:33:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:33:52 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:33:52 np0005604215.localdomain sudo[264346]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:53 np0005604215.localdomain python3.9[264492]: ansible-ansible.builtin.service_facts Invoked
Feb 01 09:33:53 np0005604215.localdomain network[264509]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Feb 01 09:33:53 np0005604215.localdomain network[264510]: 'network-scripts' will be removed from distribution in near future.
Feb 01 09:33:53 np0005604215.localdomain network[264511]: It is advised to switch to 'NetworkManager' instead for network management.
Feb 01 09:33:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:33:54 np0005604215.localdomain systemd[1]: tmp-crun.pQwHhm.mount: Deactivated successfully.
Feb 01 09:33:54 np0005604215.localdomain podman[264519]: 2026-02-01 09:33:54.525563249 +0000 UTC m=+0.101199013 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 01 09:33:54 np0005604215.localdomain podman[264519]: 2026-02-01 09:33:54.564425468 +0000 UTC m=+0.140061232 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127)
Feb 01 09:33:54 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:33:54 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client.service:23: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:33:58 np0005604215.localdomain sudo[264761]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gcpvitehuboktflqtktgpdlupgrhylmz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938438.2209656-1060-95532413585484/AnsiballZ_systemd_service.py
Feb 01 09:33:58 np0005604215.localdomain sudo[264761]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:58 np0005604215.localdomain python3.9[264763]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:33:58 np0005604215.localdomain sudo[264761]: pam_unix(sudo:session): session closed for user root
Feb 01 09:33:59 np0005604215.localdomain sudo[264872]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlppebtyzbjsahysrvjpkyqqlnotcjbf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938439.0407932-1060-115655487477825/AnsiballZ_systemd_service.py
Feb 01 09:33:59 np0005604215.localdomain sudo[264872]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:33:59 np0005604215.localdomain python3.9[264874]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:33:59 np0005604215.localdomain sudo[264872]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:34:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:34:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:34:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:34:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:34:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16312 "" "Go-http-client/1.1"
Feb 01 09:34:00 np0005604215.localdomain sudo[264983]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-oydfsrsxlpyuakcqsmpdawzfqsbtbelv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938439.8598294-1060-203508513213702/AnsiballZ_systemd_service.py
Feb 01 09:34:00 np0005604215.localdomain sudo[264983]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:00 np0005604215.localdomain python3.9[264985]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:34:00 np0005604215.localdomain sudo[264983]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:01 np0005604215.localdomain sudo[265094]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uxtriczrrqnqkzhbxttnmlkbpqpymkfu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938440.6953077-1060-69352803684158/AnsiballZ_systemd_service.py
Feb 01 09:34:01 np0005604215.localdomain sudo[265094]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:01 np0005604215.localdomain python3.9[265096]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:34:01 np0005604215.localdomain sudo[265094]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:34:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:34:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:34:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:34:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:34:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:34:01 np0005604215.localdomain sudo[265205]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ecxzlfaypberdilnonnhmyalcvqfmkga ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938441.4931004-1060-81839417081093/AnsiballZ_systemd_service.py
Feb 01 09:34:01 np0005604215.localdomain sudo[265205]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:02 np0005604215.localdomain python3.9[265207]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:34:02 np0005604215.localdomain sudo[265205]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:02 np0005604215.localdomain sudo[265316]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jfbnvxhpbhvrnkdcgtljgsgowvhoprpf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938442.2973032-1060-91600039834709/AnsiballZ_systemd_service.py
Feb 01 09:34:02 np0005604215.localdomain sudo[265316]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:02 np0005604215.localdomain python3.9[265318]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:34:02 np0005604215.localdomain sudo[265316]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:02.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:02 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:02.996 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 01 09:34:03 np0005604215.localdomain sudo[265427]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vhuhrtksvwnvcislxoaqdmsatgtmcrft ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938443.0076282-1060-25381806014002/AnsiballZ_systemd_service.py
Feb 01 09:34:03 np0005604215.localdomain sudo[265427]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51450 DF PROTO=TCP SPT=43886 DPT=9102 SEQ=3716749661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA666A0410000000001030307) 
Feb 01 09:34:03 np0005604215.localdomain python3.9[265429]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:34:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:04.013 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:04 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51451 DF PROTO=TCP SPT=43886 DPT=9102 SEQ=3716749661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA666A44D0000000001030307) 
Feb 01 09:34:04 np0005604215.localdomain sudo[265427]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:04 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:04.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:05 np0005604215.localdomain sudo[265538]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rmwjimjdpfcqwuoyjopnhouroocuadzv ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938444.7311969-1060-227943845174572/AnsiballZ_systemd_service.py
Feb 01 09:34:05 np0005604215.localdomain sudo[265538]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:05 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45629 DF PROTO=TCP SPT=60988 DPT=9102 SEQ=3044147456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA666A70E0000000001030307) 
Feb 01 09:34:05 np0005604215.localdomain python3.9[265540]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:34:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:34:05 np0005604215.localdomain podman[265542]: 2026-02-01 09:34:05.868990325 +0000 UTC m=+0.081754513 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:34:05 np0005604215.localdomain podman[265542]: 2026-02-01 09:34:05.881800051 +0000 UTC m=+0.094564239 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:34:05 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:34:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:05.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:05.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:05.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:34:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:05.996 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:05.996 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 01 09:34:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:06.020 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 01 09:34:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:06.020 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:06 np0005604215.localdomain sudo[265538]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51452 DF PROTO=TCP SPT=43886 DPT=9102 SEQ=3716749661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA666AC4D0000000001030307) 
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.034 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.066 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.066 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.067 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.067 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.067 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:34:07 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=5550 DF PROTO=TCP SPT=34100 DPT=9102 SEQ=2814826323 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA666AF0D0000000001030307) 
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.516 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.704 225589 WARNING nova.virt.libvirt.driver [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.705 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12909MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.706 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.706 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.826 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.827 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.885 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.969 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.970 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:34:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:07.986 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 09:34:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:08.009 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: HW_CPU_X86_BMI,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AMD_SVM,HW_CPU_X86_CLMUL,COMPUTE_STORAGE_BUS_IDE,HW_CPU_X86_SSE,COMPUTE_STORAGE_BUS_FDC,HW_CPU_X86_SHA,COMPUTE_STORAGE_BUS_USB,COMPUTE_NODE,HW_CPU_X86_AESNI,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AVX2,HW_CPU_X86_BMI2,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE41,COMPUTE_RESCUE_BFV,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,HW_CPU_X86_AVX,HW_CPU_X86_ABM,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SATA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_FMA3,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE42,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_SECURITY_TPM_1_2,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_MMX,HW_CPU_X86_SSE4A,COMPUTE_TRUSTED_CERTS,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_DEVICE_TAGGING,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SSE2,COMPUTE_SECURITY_TPM_2_0,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_IMAGE_TYPE_RAW _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 09:34:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:08.032 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:34:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:08.489 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:34:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:08.495 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:34:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:08.509 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:34:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:08.512 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:34:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:08.512 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.806s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:34:08 np0005604215.localdomain sudo[265715]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpdwercksvihltiotbbzmmrzdwissbst ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938448.6420612-1237-230352826892550/AnsiballZ_file.py
Feb 01 09:34:08 np0005604215.localdomain sudo[265715]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:09 np0005604215.localdomain python3.9[265717]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:09 np0005604215.localdomain sudo[265715]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:09.469 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:09.470 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:09.490 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:09.490 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:09 np0005604215.localdomain sudo[265825]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uafwtepudwekeidsdvctttnvjzcazaqy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938449.3753219-1237-67752186886204/AnsiballZ_file.py
Feb 01 09:34:09 np0005604215.localdomain sudo[265825]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:09 np0005604215.localdomain python3.9[265827]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:09 np0005604215.localdomain sudo[265825]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:10 np0005604215.localdomain sudo[265935]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gpxisuosglifvqnluxjjfprwjeaecglw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938449.946639-1237-81514780503943/AnsiballZ_file.py
Feb 01 09:34:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:34:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:34:10 np0005604215.localdomain sudo[265935]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:10 np0005604215.localdomain systemd[1]: tmp-crun.rbARw2.mount: Deactivated successfully.
Feb 01 09:34:10 np0005604215.localdomain podman[265937]: 2026-02-01 09:34:10.327953576 +0000 UTC m=+0.080211665 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:34:10 np0005604215.localdomain podman[265938]: 2026-02-01 09:34:10.341259227 +0000 UTC m=+0.087202791 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:34:10 np0005604215.localdomain podman[265938]: 2026-02-01 09:34:10.377870157 +0000 UTC m=+0.123813751 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:34:10 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:34:10 np0005604215.localdomain podman[265937]: 2026-02-01 09:34:10.402885259 +0000 UTC m=+0.155143338 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:34:10 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:34:10 np0005604215.localdomain python3.9[265944]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:10 np0005604215.localdomain sudo[265935]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51453 DF PROTO=TCP SPT=43886 DPT=9102 SEQ=3716749661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA666BC0D0000000001030307) 
Feb 01 09:34:10 np0005604215.localdomain sudo[266092]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ntimkqwyrqyiuzcceoxalhgmfxmorbax ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938450.6054273-1237-53050431999064/AnsiballZ_file.py
Feb 01 09:34:10 np0005604215.localdomain sudo[266092]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:10.996 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:10.996 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:34:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:10.996 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:34:11 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:11.028 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:34:11 np0005604215.localdomain python3.9[266094]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:11 np0005604215.localdomain sudo[266092]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:11 np0005604215.localdomain sudo[266202]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jjagitqwtzrrdahwvdwokqxwqjyxalnp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938451.2870727-1237-194833126633158/AnsiballZ_file.py
Feb 01 09:34:11 np0005604215.localdomain sudo[266202]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:11 np0005604215.localdomain python3.9[266204]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:11 np0005604215.localdomain sudo[266202]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:12 np0005604215.localdomain sudo[266312]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mueoyaogskxrmygdkknulwavzycpjltg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938451.939515-1237-52274812600791/AnsiballZ_file.py
Feb 01 09:34:12 np0005604215.localdomain sudo[266312]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:12 np0005604215.localdomain python3.9[266314]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:12 np0005604215.localdomain sudo[266312]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:12 np0005604215.localdomain sudo[266422]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vrmqqdpamjchauziynruiarmepqjbydt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938452.5466163-1237-158233234799904/AnsiballZ_file.py
Feb 01 09:34:12 np0005604215.localdomain sudo[266422]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:13 np0005604215.localdomain python3.9[266424]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:13 np0005604215.localdomain sudo[266422]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:13 np0005604215.localdomain sudo[266532]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-jrzbkksoqakbiuqakqdxaezdsqkgught ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938453.1573963-1237-221328472081624/AnsiballZ_file.py
Feb 01 09:34:13 np0005604215.localdomain sudo[266532]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:13 np0005604215.localdomain python3.9[266534]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:13 np0005604215.localdomain sudo[266532]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:15 np0005604215.localdomain sudo[266642]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nifhzvpdhtfmvdqveuepjkosrecezzao ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938454.8293395-1408-115743373645245/AnsiballZ_file.py
Feb 01 09:34:15 np0005604215.localdomain sudo[266642]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:15 np0005604215.localdomain python3.9[266644]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:15 np0005604215.localdomain sudo[266642]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:15 np0005604215.localdomain sudo[266752]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fwrbmkcdrmeceurifasfxoijwiknrifm ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938455.4622667-1408-183096418950111/AnsiballZ_file.py
Feb 01 09:34:15 np0005604215.localdomain sudo[266752]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:15 np0005604215.localdomain python3.9[266754]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:15 np0005604215.localdomain sudo[266752]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:16 np0005604215.localdomain sudo[266862]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bejomuhcpytljsnodsgmgrpikvdhfnkj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938456.1002967-1408-15492794812418/AnsiballZ_file.py
Feb 01 09:34:16 np0005604215.localdomain sudo[266862]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:16 np0005604215.localdomain python3.9[266864]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:16 np0005604215.localdomain sudo[266862]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:17 np0005604215.localdomain sudo[266972]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qjobvxpdlezsybbtwczhkdammvmawnou ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938456.9720454-1408-120735906748795/AnsiballZ_file.py
Feb 01 09:34:17 np0005604215.localdomain sudo[266972]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:17 np0005604215.localdomain python3.9[266974]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:17 np0005604215.localdomain sudo[266972]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:17 np0005604215.localdomain sudo[267082]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-uymucabudxdgvsegcljlvjgrmlxatqwb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938457.6258063-1408-207804103906338/AnsiballZ_file.py
Feb 01 09:34:17 np0005604215.localdomain sudo[267082]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:18 np0005604215.localdomain python3.9[267084]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:18 np0005604215.localdomain sudo[267082]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:18 np0005604215.localdomain sudo[267192]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-opwpjsyygeygvfxduqimvmaakypvggwu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938458.2365732-1408-2100859881078/AnsiballZ_file.py
Feb 01 09:34:18 np0005604215.localdomain sudo[267192]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:18 np0005604215.localdomain python3.9[267194]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:18 np0005604215.localdomain sudo[267192]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:19 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51454 DF PROTO=TCP SPT=43886 DPT=9102 SEQ=3716749661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA666DD0E0000000001030307) 
Feb 01 09:34:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:34:19 np0005604215.localdomain systemd[1]: tmp-crun.0edbUU.mount: Deactivated successfully.
Feb 01 09:34:19 np0005604215.localdomain podman[267212]: 2026-02-01 09:34:19.872462891 +0000 UTC m=+0.091078920 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 09:34:19 np0005604215.localdomain podman[267212]: 2026-02-01 09:34:19.884662598 +0000 UTC m=+0.103278607 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:34:19 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:34:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:34:19 np0005604215.localdomain podman[267280]: 2026-02-01 09:34:19.997266602 +0000 UTC m=+0.075104148 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1769056855, distribution-scope=public, vcs-type=git, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, version=9.7)
Feb 01 09:34:20 np0005604215.localdomain podman[267280]: 2026-02-01 09:34:20.014769762 +0000 UTC m=+0.092607268 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, version=9.7, vcs-type=git, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 09:34:20 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:34:20 np0005604215.localdomain sudo[267339]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-nfartclumdmbznimvzwjhcrbtcmrfsnt ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938459.8146176-1408-150024716116076/AnsiballZ_file.py
Feb 01 09:34:20 np0005604215.localdomain sudo[267339]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:20 np0005604215.localdomain python3.9[267341]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:20 np0005604215.localdomain sudo[267339]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:20 np0005604215.localdomain sudo[267449]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-orqzwgmhaamyssctztaozzifrszvvwmp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938460.42743-1408-125555359818940/AnsiballZ_file.py
Feb 01 09:34:20 np0005604215.localdomain sudo[267449]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:20 np0005604215.localdomain python3.9[267451]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:34:20 np0005604215.localdomain sudo[267449]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:21 np0005604215.localdomain sudo[267559]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rvajkpckptgnknowlpwlgafwtbzhxtee ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938461.388439-1581-21264148126147/AnsiballZ_command.py
Feb 01 09:34:21 np0005604215.localdomain sudo[267559]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:21 np0005604215.localdomain python3.9[267561]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then
                                                              systemctl disable --now certmonger.service
                                                              test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service
                                                            fi
                                                             _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:34:21 np0005604215.localdomain sudo[267559]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:22 np0005604215.localdomain python3.9[267671]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Feb 01 09:34:23 np0005604215.localdomain sudo[267779]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-cqvheavnhjyyqnplvqednaquvqigwjtw ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938463.2530406-1636-262692494267187/AnsiballZ_systemd_service.py
Feb 01 09:34:23 np0005604215.localdomain sudo[267779]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:23 np0005604215.localdomain python3.9[267781]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Feb 01 09:34:23 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:34:23 np0005604215.localdomain systemd-sysv-generator[267810]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:34:23 np0005604215.localdomain systemd-rc-local-generator[267803]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:34:24 np0005604215.localdomain sudo[267779]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: tmp-crun.KRO4PJ.mount: Deactivated successfully.
Feb 01 09:34:24 np0005604215.localdomain podman[267903]: 2026-02-01 09:34:24.891223564 +0000 UTC m=+0.095817667 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:34:24 np0005604215.localdomain sudo[267938]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fmferivboqdeoqfkczrrhktecfbynlmd ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938464.5915496-1659-66211007451726/AnsiballZ_command.py
Feb 01 09:34:24 np0005604215.localdomain sudo[267938]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:24 np0005604215.localdomain podman[267903]: 2026-02-01 09:34:24.90468909 +0000 UTC m=+0.109283233 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3)
Feb 01 09:34:24 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:34:25 np0005604215.localdomain python3.9[267946]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:34:25 np0005604215.localdomain sudo[267938]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:25 np0005604215.localdomain sudo[268055]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-padpyyveacrfbqkwswbwzeoradxqojlf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938465.2479846-1659-274570337534157/AnsiballZ_command.py
Feb 01 09:34:25 np0005604215.localdomain sudo[268055]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:25 np0005604215.localdomain python3.9[268057]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:34:25 np0005604215.localdomain sudo[268055]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:26 np0005604215.localdomain sudo[268166]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-macqhfcdiugkkxfgusbuvffrgtztplue ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938465.8618495-1659-70209149603396/AnsiballZ_command.py
Feb 01 09:34:26 np0005604215.localdomain sudo[268166]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:26 np0005604215.localdomain python3.9[268168]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:34:27 np0005604215.localdomain sudo[268166]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:27 np0005604215.localdomain sudo[268277]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hajbpblgwnzbngsghirjrkwftrdakujh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938467.5345428-1659-203767514419494/AnsiballZ_command.py
Feb 01 09:34:27 np0005604215.localdomain sudo[268277]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:27 np0005604215.localdomain python3.9[268279]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:34:29 np0005604215.localdomain sudo[268277]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:29 np0005604215.localdomain sudo[268388]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-lmexwocgsfigkncskqwldsasfbpadzhr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938469.1569645-1659-239205439595237/AnsiballZ_command.py
Feb 01 09:34:29 np0005604215.localdomain sudo[268388]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:29 np0005604215.localdomain python3.9[268390]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:34:29 np0005604215.localdomain sudo[268388]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:34:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:34:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:34:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:34:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:34:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16314 "" "Go-http-client/1.1"
Feb 01 09:34:30 np0005604215.localdomain sudo[268499]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-bcrprmsaogwvepfucnmylhmjxnfuxhex ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938469.8276901-1659-165033035423116/AnsiballZ_command.py
Feb 01 09:34:30 np0005604215.localdomain sudo[268499]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:30 np0005604215.localdomain python3.9[268501]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:34:30 np0005604215.localdomain sudo[268499]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:30 np0005604215.localdomain sudo[268503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:34:30 np0005604215.localdomain sudo[268503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:34:30 np0005604215.localdomain sudo[268503]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:30 np0005604215.localdomain sudo[268537]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 09:34:30 np0005604215.localdomain sudo[268537]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:34:30 np0005604215.localdomain sudo[268653]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mthdxqdsjvhzrbshhjoxoumjokrzewwf ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938470.5353277-1659-159090020220004/AnsiballZ_command.py
Feb 01 09:34:30 np0005604215.localdomain sudo[268653]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:30 np0005604215.localdomain sudo[268537]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:30 np0005604215.localdomain python3.9[268662]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:34:31 np0005604215.localdomain sudo[268671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:34:31 np0005604215.localdomain sudo[268671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:34:31 np0005604215.localdomain sudo[268671]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:31 np0005604215.localdomain sudo[268690]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:34:31 np0005604215.localdomain sudo[268690]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:34:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:34:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:34:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:34:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:34:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:34:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:34:31 np0005604215.localdomain sudo[268690]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:32 np0005604215.localdomain sudo[268653]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:32 np0005604215.localdomain sudo[268826]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:34:32 np0005604215.localdomain sudo[268826]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:34:32 np0005604215.localdomain sudo[268826]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:32 np0005604215.localdomain sudo[268864]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-brtszsnnkhjamckiwkazlqhcexthogkx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938472.1738942-1659-46711923551023/AnsiballZ_command.py
Feb 01 09:34:32 np0005604215.localdomain sudo[268864]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:32 np0005604215.localdomain python3.9[268866]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:34:32 np0005604215.localdomain sudo[268864]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43885 DF PROTO=TCP SPT=55848 DPT=9102 SEQ=2030436982 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66715710000000001030307) 
Feb 01 09:34:34 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43886 DF PROTO=TCP SPT=55848 DPT=9102 SEQ=2030436982 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA667198D0000000001030307) 
Feb 01 09:34:35 np0005604215.localdomain sudo[268975]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-msnifchusqodzptekqdwhkonlvlqyqxp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938475.1071012-1867-153616684803297/AnsiballZ_file.py
Feb 01 09:34:35 np0005604215.localdomain sudo[268975]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:35 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51455 DF PROTO=TCP SPT=43886 DPT=9102 SEQ=3716749661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6671D0E0000000001030307) 
Feb 01 09:34:35 np0005604215.localdomain python3.9[268977]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:35 np0005604215.localdomain sudo[268975]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:36 np0005604215.localdomain sudo[269085]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rcatigcvwcohyvrmlhavihybyyklpotp ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938475.7461228-1867-92195275637568/AnsiballZ_file.py
Feb 01 09:34:36 np0005604215.localdomain sudo[269085]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:34:36 np0005604215.localdomain systemd[1]: tmp-crun.K2bqA5.mount: Deactivated successfully.
Feb 01 09:34:36 np0005604215.localdomain podman[269088]: 2026-02-01 09:34:36.156234159 +0000 UTC m=+0.099483531 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:34:36 np0005604215.localdomain podman[269088]: 2026-02-01 09:34:36.165107813 +0000 UTC m=+0.108357165 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:34:36 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:34:36 np0005604215.localdomain python3.9[269087]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:36 np0005604215.localdomain sudo[269085]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43887 DF PROTO=TCP SPT=55848 DPT=9102 SEQ=2030436982 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA667218D0000000001030307) 
Feb 01 09:34:36 np0005604215.localdomain sudo[269217]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tgjmbuljbormrixzulhilhqbthpnvcbn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938476.3975072-1867-24433928475279/AnsiballZ_file.py
Feb 01 09:34:36 np0005604215.localdomain sudo[269217]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:36 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:34:36.818 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:34:36 np0005604215.localdomain python3.9[269219]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:36 np0005604215.localdomain sudo[269217]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:37 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=45630 DF PROTO=TCP SPT=60988 DPT=9102 SEQ=3044147456 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA667250D0000000001030307) 
Feb 01 09:34:37 np0005604215.localdomain sudo[269327]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qpsmcdnpdwyigsbfpnvpcwzxamdjoivb ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938477.2396944-1934-114752868039692/AnsiballZ_file.py
Feb 01 09:34:37 np0005604215.localdomain sudo[269327]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:37 np0005604215.localdomain python3.9[269329]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:37 np0005604215.localdomain sudo[269327]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:38 np0005604215.localdomain sudo[269437]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-caizpdjbytmtrtmjxnxemswivdqwwkrn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938477.845733-1934-58957088093478/AnsiballZ_file.py
Feb 01 09:34:38 np0005604215.localdomain sudo[269437]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:38 np0005604215.localdomain python3.9[269439]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:38 np0005604215.localdomain sudo[269437]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:38 np0005604215.localdomain sudo[269547]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-sudngormgxvjvnwktxgztxodpkavutfx ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938478.4237676-1934-202311043880224/AnsiballZ_file.py
Feb 01 09:34:38 np0005604215.localdomain sudo[269547]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:38 np0005604215.localdomain python3.9[269549]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:38 np0005604215.localdomain sudo[269547]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:39 np0005604215.localdomain sudo[269657]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dzryibsgwcdzenfngjcedkztdrwgwcbl ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938479.0524127-1934-78846731508254/AnsiballZ_file.py
Feb 01 09:34:39 np0005604215.localdomain sudo[269657]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:39 np0005604215.localdomain python3.9[269659]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:39 np0005604215.localdomain sudo[269657]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:39 np0005604215.localdomain sudo[269767]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ihmleitlqrbdiwyzuvrlsdmciigjfvvn ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938479.654239-1934-75169198358804/AnsiballZ_file.py
Feb 01 09:34:39 np0005604215.localdomain sudo[269767]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:40 np0005604215.localdomain python3.9[269769]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:40 np0005604215.localdomain sudo[269767]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:40 np0005604215.localdomain sudo[269877]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hwqljjadgncxuhnidloterhycgannkvu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938480.2833605-1934-270190271788324/AnsiballZ_file.py
Feb 01 09:34:40 np0005604215.localdomain sudo[269877]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:34:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:34:40 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43888 DF PROTO=TCP SPT=55848 DPT=9102 SEQ=2030436982 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA667314E0000000001030307) 
Feb 01 09:34:40 np0005604215.localdomain systemd[1]: tmp-crun.OnqsCU.mount: Deactivated successfully.
Feb 01 09:34:40 np0005604215.localdomain systemd[1]: tmp-crun.JKNv6n.mount: Deactivated successfully.
Feb 01 09:34:40 np0005604215.localdomain podman[269880]: 2026-02-01 09:34:40.735236073 +0000 UTC m=+0.138965727 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:34:40 np0005604215.localdomain podman[269881]: 2026-02-01 09:34:40.70562587 +0000 UTC m=+0.107765876 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:34:40 np0005604215.localdomain podman[269881]: 2026-02-01 09:34:40.808630088 +0000 UTC m=+0.210770084 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:34:40 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:34:40 np0005604215.localdomain python3.9[269879]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:40 np0005604215.localdomain podman[269880]: 2026-02-01 09:34:40.852835802 +0000 UTC m=+0.256565496 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:34:40 np0005604215.localdomain sudo[269877]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:40 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:34:41 np0005604215.localdomain sudo[270036]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-gusmrcpirlcaxcvdxtxjnxfzysgfyxxz ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938480.9962509-1934-71125441864032/AnsiballZ_file.py
Feb 01 09:34:41 np0005604215.localdomain sudo[270036]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:41 np0005604215.localdomain python3.9[270038]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:41 np0005604215.localdomain sudo[270036]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:34:41.751 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:34:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:34:41.751 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:34:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:34:41.751 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:34:48 np0005604215.localdomain sudo[270146]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-czkfnxitpbugjahxewugbwmygpvvkbro ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938487.190571-2258-254425175905794/AnsiballZ_getent.py
Feb 01 09:34:48 np0005604215.localdomain sudo[270146]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:34:48 np0005604215.localdomain python3.9[270148]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Feb 01 09:34:48 np0005604215.localdomain sudo[270146]: pam_unix(sudo:session): session closed for user root
Feb 01 09:34:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43889 DF PROTO=TCP SPT=55848 DPT=9102 SEQ=2030436982 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA667510D0000000001030307) 
Feb 01 09:34:50 np0005604215.localdomain sshd[270167]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:34:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:34:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:34:50 np0005604215.localdomain sshd[270167]: Accepted publickey for zuul from 192.168.122.30 port 34152 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:34:50 np0005604215.localdomain systemd-logind[761]: New session 60 of user zuul.
Feb 01 09:34:50 np0005604215.localdomain systemd[1]: Started Session 60 of User zuul.
Feb 01 09:34:50 np0005604215.localdomain sshd[270167]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:34:50 np0005604215.localdomain podman[270169]: 2026-02-01 09:34:50.598258281 +0000 UTC m=+0.095503150 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, architecture=x86_64, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.7, io.openshift.expose-services=, release=1769056855)
Feb 01 09:34:50 np0005604215.localdomain podman[270170]: 2026-02-01 09:34:50.640365977 +0000 UTC m=+0.135944545 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true)
Feb 01 09:34:50 np0005604215.localdomain podman[270170]: 2026-02-01 09:34:50.649770737 +0000 UTC m=+0.145349365 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 01 09:34:50 np0005604215.localdomain podman[270169]: 2026-02-01 09:34:50.660696073 +0000 UTC m=+0.157940962 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Feb 01 09:34:50 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:34:50 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:34:50 np0005604215.localdomain sshd[270192]: Received disconnect from 192.168.122.30 port 34152:11: disconnected by user
Feb 01 09:34:50 np0005604215.localdomain sshd[270192]: Disconnected from user zuul 192.168.122.30 port 34152
Feb 01 09:34:50 np0005604215.localdomain sshd[270167]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:34:50 np0005604215.localdomain systemd[1]: session-60.scope: Deactivated successfully.
Feb 01 09:34:50 np0005604215.localdomain systemd-logind[761]: Session 60 logged out. Waiting for processes to exit.
Feb 01 09:34:50 np0005604215.localdomain systemd-logind[761]: Removed session 60.
Feb 01 09:34:51 np0005604215.localdomain python3.9[270313]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:34:51 np0005604215.localdomain python3.9[270399]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938490.9162667-2338-38331407707067/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:52 np0005604215.localdomain python3.9[270507]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:34:52 np0005604215.localdomain python3.9[270562]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:53 np0005604215.localdomain python3.9[270670]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:34:54 np0005604215.localdomain python3.9[270756]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938493.128792-2338-81756465378164/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:54 np0005604215.localdomain python3.9[270864]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:34:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:34:56 np0005604215.localdomain podman[270914]: 2026-02-01 09:34:56.158093826 +0000 UTC m=+0.078961941 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:34:56 np0005604215.localdomain podman[270914]: 2026-02-01 09:34:56.19267586 +0000 UTC m=+0.113543925 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:34:56 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:34:57 np0005604215.localdomain python3.9[270969]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938494.229365-2338-84476054753385/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=f97201355591685d5a25f9693d35e9cd6d9ded96 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:58 np0005604215.localdomain python3.9[271077]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:34:58 np0005604215.localdomain python3.9[271163]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938497.680119-2338-102621600784515/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:34:59 np0005604215.localdomain python3.9[271271]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:34:59 np0005604215.localdomain python3.9[271357]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1769938498.8194146-2338-4323519888209/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:35:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:35:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:35:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:35:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:35:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:35:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16305 "" "Go-http-client/1.1"
Feb 01 09:35:00 np0005604215.localdomain sudo[271465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ucnfgxihefhaxeaxihyzjtgmpfbjueky ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938500.2185397-2588-278638682912365/AnsiballZ_file.py
Feb 01 09:35:00 np0005604215.localdomain sudo[271465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:00 np0005604215.localdomain python3.9[271467]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:35:00 np0005604215.localdomain sudo[271465]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:01 np0005604215.localdomain sudo[271575]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hllnrsgsonxmmmoepfzzcslgjzubemsc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938500.9526224-2612-206381645195773/AnsiballZ_copy.py
Feb 01 09:35:01 np0005604215.localdomain sudo[271575]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:01 np0005604215.localdomain python3.9[271577]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:35:01 np0005604215.localdomain sudo[271575]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:35:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:35:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:35:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:35:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:35:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:35:02 np0005604215.localdomain sudo[271685]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-auoxjiekkbemuowghajceatkjymknlzu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938502.1872108-2636-139020137950908/AnsiballZ_stat.py
Feb 01 09:35:02 np0005604215.localdomain sudo[271685]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:02 np0005604215.localdomain python3.9[271687]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:35:02 np0005604215.localdomain sudo[271685]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:03 np0005604215.localdomain sudo[271797]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xrkuwnxykesmjusaciswgarnafqvfsei ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938503.0633147-2663-75452949086225/AnsiballZ_file.py
Feb 01 09:35:03 np0005604215.localdomain sudo[271797]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.402 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:35:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:35:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29866 DF PROTO=TCP SPT=58798 DPT=9102 SEQ=3651054409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6678AA10000000001030307) 
Feb 01 09:35:03 np0005604215.localdomain python3.9[271799]: ansible-ansible.builtin.file Invoked with group=nova mode=0400 owner=nova path=/var/lib/nova/compute_id state=file recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:35:03 np0005604215.localdomain sudo[271797]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:04 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29867 DF PROTO=TCP SPT=58798 DPT=9102 SEQ=3651054409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6678E8E0000000001030307) 
Feb 01 09:35:04 np0005604215.localdomain python3.9[271907]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:35:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:05.014 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:35:05 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43890 DF PROTO=TCP SPT=55848 DPT=9102 SEQ=2030436982 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA667910D0000000001030307) 
Feb 01 09:35:05 np0005604215.localdomain python3.9[272017]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:35:05 np0005604215.localdomain sshd[272020]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:35:05 np0005604215.localdomain python3.9[272074]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute.json _original_basename=nova_compute.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:35:05 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:05.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:35:06 np0005604215.localdomain sshd[272020]: Invalid user ospite from 85.206.171.113 port 37992
Feb 01 09:35:06 np0005604215.localdomain sshd[272020]: Received disconnect from 85.206.171.113 port 37992:11: Bye Bye [preauth]
Feb 01 09:35:06 np0005604215.localdomain sshd[272020]: Disconnected from invalid user ospite 85.206.171.113 port 37992 [preauth]
Feb 01 09:35:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:35:06 np0005604215.localdomain podman[272153]: 2026-02-01 09:35:06.375514686 +0000 UTC m=+0.088892607 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:35:06 np0005604215.localdomain podman[272153]: 2026-02-01 09:35:06.385190674 +0000 UTC m=+0.098568575 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:35:06 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:35:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29868 DF PROTO=TCP SPT=58798 DPT=9102 SEQ=3651054409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA667968D0000000001030307) 
Feb 01 09:35:06 np0005604215.localdomain python3.9[272195]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Feb 01 09:35:06 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:06.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:35:07 np0005604215.localdomain python3.9[272260]: ansible-ansible.legacy.file Invoked with mode=0700 setype=container_file_t dest=/var/lib/openstack/config/containers/nova_compute_init.json _original_basename=nova_compute_init.json.j2 recurse=False state=file path=/var/lib/openstack/config/containers/nova_compute_init.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Feb 01 09:35:07 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51456 DF PROTO=TCP SPT=43886 DPT=9102 SEQ=3716749661 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6679B0D0000000001030307) 
Feb 01 09:35:07 np0005604215.localdomain sudo[272368]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fqvalnuhjfkqaacaniqxxxplbytnjcmg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938507.4763448-2791-42491233840269/AnsiballZ_container_config_data.py
Feb 01 09:35:07 np0005604215.localdomain sudo[272368]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:07.994 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:35:07 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:07.995 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:35:08 np0005604215.localdomain python3.9[272370]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Feb 01 09:35:08 np0005604215.localdomain sudo[272368]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:08 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:08.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.010 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.011 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.011 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.011 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.011 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:35:09 np0005604215.localdomain sudo[272479]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mdvfbiyrbecotnrbhsohyvvkfwjqykun ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938508.6490993-2824-162544239060109/AnsiballZ_container_config_hash.py
Feb 01 09:35:09 np0005604215.localdomain sudo[272479]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:09 np0005604215.localdomain python3.9[272481]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:35:09 np0005604215.localdomain sudo[272479]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.436 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.626 225589 WARNING nova.virt.libvirt.driver [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.628 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12895MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.629 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.629 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.695 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.695 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:35:09 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:09.709 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:35:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:10.159 225589 DEBUG oslo_concurrency.processutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:35:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:10.166 225589 DEBUG nova.compute.provider_tree [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:35:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:10.182 225589 DEBUG nova.scheduler.client.report [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:35:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:10.185 225589 DEBUG nova.compute.resource_tracker [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:35:10 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:10.185 225589 DEBUG oslo_concurrency.lockutils [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.556s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:35:10 np0005604215.localdomain sudo[272632]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-dgogbixuqindypqdsnevdjljhssvouie ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769938509.9203587-2855-165968060515419/AnsiballZ_edpm_container_manage.py
Feb 01 09:35:10 np0005604215.localdomain sudo[272632]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29869 DF PROTO=TCP SPT=58798 DPT=9102 SEQ=3651054409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA667A64D0000000001030307) 
Feb 01 09:35:10 np0005604215.localdomain python3[272634]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:35:10 np0005604215.localdomain python3[272634]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",
                                                                    "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:31:38.534497001Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1214548351,
                                                                    "VirtualSize": 1214548351,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",
                                                                              "sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",
                                                                              "sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:12:56.089921987Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:34.524252589Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:37.262239859Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:39.234075496Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:42.686286019Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:54.133364958Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:10.283411186Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:19.407054412Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:42.656365894Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:37.451289936Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.151652427Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532191009Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532298572Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:44.609081717Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 01 09:35:11 np0005604215.localdomain sudo[272632]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:11 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:11.186 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:35:11 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:11.187 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:35:11 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:11.187 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:35:11 np0005604215.localdomain sudo[272802]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xhdcmifvztfbjfvhwzigajsrsqalzpot ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938511.368663-2878-3498930787248/AnsiballZ_stat.py
Feb 01 09:35:11 np0005604215.localdomain sudo[272802]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:35:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:35:11 np0005604215.localdomain podman[272806]: 2026-02-01 09:35:11.775301664 +0000 UTC m=+0.072226994 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:35:11 np0005604215.localdomain podman[272806]: 2026-02-01 09:35:11.784614381 +0000 UTC m=+0.081539751 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:35:11 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:35:11 np0005604215.localdomain podman[272805]: 2026-02-01 09:35:11.836465317 +0000 UTC m=+0.135264485 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:35:11 np0005604215.localdomain podman[272805]: 2026-02-01 09:35:11.869838263 +0000 UTC m=+0.168637391 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller)
Feb 01 09:35:11 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:35:11 np0005604215.localdomain python3.9[272804]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:35:11 np0005604215.localdomain sudo[272802]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:11 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:11.995 225589 DEBUG oslo_service.periodic_task [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:35:11 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:11.996 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:35:11 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:11.996 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:35:12 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:12.031 225589 DEBUG nova.compute.manager [None req-f5d4d234-046d-40ce-a06c-dfcffd290e3b - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:35:12 np0005604215.localdomain sudo[272962]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-vcrolqrtarqbtwggqsyudbudjhqyifwy ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938512.540681-2914-278501015589530/AnsiballZ_container_config_data.py
Feb 01 09:35:12 np0005604215.localdomain sudo[272962]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:13 np0005604215.localdomain python3.9[272964]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Feb 01 09:35:13 np0005604215.localdomain sudo[272962]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:13 np0005604215.localdomain sudo[273072]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-kckjhutjtcuhrkrjamqpustlxdewjgsj ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938513.5063806-2948-58850187860457/AnsiballZ_container_config_hash.py
Feb 01 09:35:13 np0005604215.localdomain sudo[273072]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:13 np0005604215.localdomain python3.9[273074]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/openstack
Feb 01 09:35:14 np0005604215.localdomain sudo[273072]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:14 np0005604215.localdomain sudo[273182]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-pgbnymrfowmjviiuzcpnbfitekudjmya ; /usr/bin/python3 /home/zuul/.ansible/tmp/ansible-tmp-1769938514.4798014-2977-85428968576862/AnsiballZ_edpm_container_manage.py
Feb 01 09:35:14 np0005604215.localdomain sudo[273182]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:15 np0005604215.localdomain python3[273184]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json containers=[] log_base_path=/var/log/containers/stdouts debug=False
Feb 01 09:35:15 np0005604215.localdomain python3[273184]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: [
                                                               {
                                                                    "Id": "f4e0688689eb3c524117ae65df199eeb4e620e591d26898b5cb25b819a2d79fd",
                                                                    "Digest": "sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83",
                                                                    "RepoTags": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ],
                                                                    "RepoDigests": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute@sha256:f96bd21c79ae0d7e8e17010c5e2573637d6c0f47f03e63134c477edd8ad73d83"
                                                                    ],
                                                                    "Parent": "",
                                                                    "Comment": "",
                                                                    "Created": "2026-01-30T06:31:38.534497001Z",
                                                                    "Config": {
                                                                         "User": "nova",
                                                                         "Env": [
                                                                              "PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                                                                              "LANG=en_US.UTF-8",
                                                                              "TZ=UTC",
                                                                              "container=oci"
                                                                         ],
                                                                         "Entrypoint": [
                                                                              "dumb-init",
                                                                              "--single-child",
                                                                              "--"
                                                                         ],
                                                                         "Cmd": [
                                                                              "kolla_start"
                                                                         ],
                                                                         "Labels": {
                                                                              "io.buildah.version": "1.41.3",
                                                                              "maintainer": "OpenStack Kubernetes Operator team",
                                                                              "org.label-schema.build-date": "20260127",
                                                                              "org.label-schema.license": "GPLv2",
                                                                              "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                              "org.label-schema.schema-version": "1.0",
                                                                              "org.label-schema.vendor": "CentOS",
                                                                              "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                              "tcib_managed": "true"
                                                                         },
                                                                         "StopSignal": "SIGTERM"
                                                                    },
                                                                    "Version": "",
                                                                    "Author": "",
                                                                    "Architecture": "amd64",
                                                                    "Os": "linux",
                                                                    "Size": 1214548351,
                                                                    "VirtualSize": 1214548351,
                                                                    "GraphDriver": {
                                                                         "Name": "overlay",
                                                                         "Data": {
                                                                              "LowerDir": "/var/lib/containers/storage/overlay/f4838a4ef132546976a08c48bf55f89a91b54cc7f0728a84d5c77d24ba7a8992/diff:/var/lib/containers/storage/overlay/1d7b7d3208029afb8b179e48c365354efe7c39d41194e42a7d13168820ab51ad/diff:/var/lib/containers/storage/overlay/1ad843ea4b31b05bcf49ccd6faa74bd0d6976ffabe60466fd78caf7ec41bf4ac/diff:/var/lib/containers/storage/overlay/57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595/diff",
                                                                              "UpperDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/diff",
                                                                              "WorkDir": "/var/lib/containers/storage/overlay/426448257cd6d6837b598e532a79ac3a86475cfca86b72c882b04ab6e3f65424/work"
                                                                         }
                                                                    },
                                                                    "RootFS": {
                                                                         "Type": "layers",
                                                                         "Layers": [
                                                                              "sha256:57c9a356b8a6d9095c1e6bfd1bb5d3b87c9d1b944c2c5d8a1da6e61dd690c595",
                                                                              "sha256:315008a247098d7a6218ae8aaacc68c9c19036e3778f3bb6313e5d0200cfa613",
                                                                              "sha256:d3142d7a25f00adc375557623676c786baeb2b8fec29945db7fe79212198a495",
                                                                              "sha256:6cac2e473d63cf2a9b8ef2ea3f4fbc7fb780c57021c3588efd56da3aa8cf8843",
                                                                              "sha256:927dd86a09392106af537557be80232b7e8ca154daa00857c24fe20f9e550a50"
                                                                         ]
                                                                    },
                                                                    "Labels": {
                                                                         "io.buildah.version": "1.41.3",
                                                                         "maintainer": "OpenStack Kubernetes Operator team",
                                                                         "org.label-schema.build-date": "20260127",
                                                                         "org.label-schema.license": "GPLv2",
                                                                         "org.label-schema.name": "CentOS Stream 9 Base Image",
                                                                         "org.label-schema.schema-version": "1.0",
                                                                         "org.label-schema.vendor": "CentOS",
                                                                         "tcib_build_tag": "b85d0548925081ae8c6bdd697658cec4",
                                                                         "tcib_managed": "true"
                                                                    },
                                                                    "Annotations": {},
                                                                    "ManifestType": "application/vnd.docker.distribution.manifest.v2+json",
                                                                    "User": "nova",
                                                                    "History": [
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126388624Z",
                                                                              "created_by": "/bin/sh -c #(nop) ADD file:54935d5b0598cdb1451aeae3c8627aade8d55dcef2e876b35185c8e36be64256 in / ",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:51.126459235Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL org.label-schema.schema-version=\"1.0\"     org.label-schema.name=\"CentOS Stream 9 Base Image\"     org.label-schema.vendor=\"CentOS\"     org.label-schema.license=\"GPLv2\"     org.label-schema.build-date=\"20260127\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-28T05:56:53.726938221Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"/bin/bash\"]"
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890429494Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL maintainer=\"OpenStack Kubernetes Operator team\"",
                                                                              "comment": "FROM quay.io/centos/centos:stream9",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890534417Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL tcib_managed=true",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890553228Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV LANG=\"en_US.UTF-8\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890570688Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV TZ=\"UTC\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890616649Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENV container=\"oci\"",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:18.890659121Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:19.232761948Z",
                                                                              "created_by": "/bin/sh -c if [ -f \"/etc/yum.repos.d/ubi.repo\" ]; then rm -f /etc/yum.repos.d/ubi.repo && dnf clean all && rm -rf /var/cache/dnf; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:52.670543613Z",
                                                                              "created_by": "/bin/sh -c dnf install -y crudini && crudini --del /etc/dnf/dnf.conf main override_install_langs && crudini --set /etc/dnf/dnf.conf main clean_requirements_on_remove True && crudini --set /etc/dnf/dnf.conf main exactarch 1 && crudini --set /etc/dnf/dnf.conf main gpgcheck 1 && crudini --set /etc/dnf/dnf.conf main install_weak_deps False && if [ 'centos' == 'centos' ];then crudini --set /etc/dnf/dnf.conf main best False; fi && crudini --set /etc/dnf/dnf.conf main installonly_limit 0 && crudini --set /etc/dnf/dnf.conf main keepcache 0 && crudini --set /etc/dnf/dnf.conf main obsoletes 1 && crudini --set /etc/dnf/dnf.conf main plugins 1 && crudini --set /etc/dnf/dnf.conf main skip_missing_names_on_install False && crudini --set /etc/dnf/dnf.conf main tsflags nodocs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.650316471Z",
                                                                              "created_by": "/bin/sh -c dnf install -y ca-certificates dumb-init glibc-langpack-en procps-ng python3 sudo util-linux-user which python-tcib-containers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:55.970652058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/uid_gid_manage.sh /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.274301506Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/uid_gid_manage",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:56.82928237Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage kolla hugetlbfs libvirt qemu",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.134416869Z",
                                                                              "created_by": "/bin/sh -c touch /usr/local/bin/kolla_extend_start && chmod 755 /usr/local/bin/kolla_extend_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.444274899Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/set_configs.py /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:57.746599531Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_set_configs",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.041383545Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/start.sh /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.352119949Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_start",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.671042058Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/httpd_setup.sh /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:58.969834612Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_httpd_setup",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.264649297Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/copy_cacerts.sh /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.518696627Z",
                                                                              "created_by": "/bin/sh -c chmod 755 /usr/local/bin/kolla_copy_cacerts",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:10:59.800434902Z",
                                                                              "created_by": "/bin/sh -c cp /usr/share/tcib/container-images/kolla/base/sudoers /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.115933627Z",
                                                                              "created_by": "/bin/sh -c chmod 440 /etc/sudoers",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:00.41398479Z",
                                                                              "created_by": "/bin/sh -c sed -ri '/^(passwd:|group:)/ s/systemd//g' /etc/nsswitch.conf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.414738437Z",
                                                                              "created_by": "/bin/sh -c dnf -y reinstall which && rpm -e --nodeps tzdata && dnf -y install tzdata",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:03.709666444Z",
                                                                              "created_by": "/bin/sh -c if [ ! -f \"/etc/localtime\" ]; then ln -s /usr/share/zoneinfo/Etc/UTC /etc/localtime; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:04.019868523Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /openstack",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:05.41751141Z",
                                                                              "created_by": "/bin/sh -c if [ 'centos' == 'centos' ];then if [ -n \"$(rpm -qa redhat-release)\" ];then rpm -e --nodeps redhat-release; fi ; dnf -y install centos-stream-release; fi",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124324267Z",
                                                                              "created_by": "/bin/sh -c dnf update --excludepkgs redhat-release -y && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124384329Z",
                                                                              "created_by": "/bin/sh -c #(nop) STOPSIGNAL SIGTERM",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124399349Z",
                                                                              "created_by": "/bin/sh -c #(nop) ENTRYPOINT [\"dumb-init\", \"--single-child\", \"--\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:07.124410339Z",
                                                                              "created_by": "/bin/sh -c #(nop) CMD [\"kolla_start\"]",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:11:08.028503475Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:12:56.089921987Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:34.524252589Z",
                                                                              "created_by": "/bin/sh -c dnf install -y python3-barbicanclient python3-cinderclient python3-designateclient python3-glanceclient python3-ironicclient python3-keystoneclient python3-manilaclient python3-neutronclient python3-novaclient python3-observabilityclient python3-octaviaclient python3-openstackclient python3-swiftclient python3-pymemcache && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:13:37.262239859Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:39.234075496Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-os:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:42.686286019Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:18:54.133364958Z",
                                                                              "created_by": "/bin/sh -c mkdir -p /etc/ssh && touch /etc/ssh/ssh_known_host",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:10.283411186Z",
                                                                              "created_by": "/bin/sh -c dnf install -y openstack-nova-common && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:20:19.407054412Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:29:42.656365894Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER root",
                                                                              "comment": "FROM quay.rdoproject.org/podified-antelope-centos9/openstack-nova-base:b85d0548925081ae8c6bdd697658cec4",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:37.451289936Z",
                                                                              "created_by": "/bin/sh -c dnf -y install e2fsprogs xfsprogs xorriso iscsi-initiator-utils nfs-utils targetcli nvme-cli device-mapper-multipath ceph-common openssh-clients openstack-nova-compute openvswitch swtpm swtpm-tools && dnf clean all && rm -rf /var/cache/dnf",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.151652427Z",
                                                                              "created_by": "/bin/sh -c bash /usr/local/bin/uid_gid_manage nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532191009Z",
                                                                              "created_by": "/bin/sh -c rm -f /etc/machine-id",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:38.532298572Z",
                                                                              "created_by": "/bin/sh -c #(nop) USER nova",
                                                                              "empty_layer": true
                                                                         },
                                                                         {
                                                                              "created": "2026-01-30T06:31:44.609081717Z",
                                                                              "created_by": "/bin/sh -c #(nop) LABEL \"tcib_build_tag\"=\"b85d0548925081ae8c6bdd697658cec4\""
                                                                         }
                                                                    ],
                                                                    "NamesHistory": [
                                                                         "quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified"
                                                                    ]
                                                               }
                                                          ]
                                                          : quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Feb 01 09:35:15 np0005604215.localdomain sudo[273182]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:16 np0005604215.localdomain sudo[273353]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-eegwsicqvskzuvntwtrxeiaecluikmyh ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938515.716775-3001-34658042971289/AnsiballZ_stat.py
Feb 01 09:35:16 np0005604215.localdomain sudo[273353]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:16 np0005604215.localdomain python3.9[273355]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:35:16 np0005604215.localdomain sudo[273353]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:17 np0005604215.localdomain sudo[273465]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-hcxsxskzolaeyusminwwgjiavfyaurdg ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938516.9794145-3028-169808300894305/AnsiballZ_file.py
Feb 01 09:35:17 np0005604215.localdomain sudo[273465]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:17 np0005604215.localdomain python3.9[273467]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:35:17 np0005604215.localdomain sudo[273465]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:18 np0005604215.localdomain sudo[273574]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ayfkxoyatxplgzlqwpsmccrmwxhbzajc ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938517.5381455-3028-115428751625794/AnsiballZ_copy.py
Feb 01 09:35:18 np0005604215.localdomain sudo[273574]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29870 DF PROTO=TCP SPT=58798 DPT=9102 SEQ=3651054409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA667C7140000000001030307) 
Feb 01 09:35:18 np0005604215.localdomain python3.9[273576]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1769938517.5381455-3028-115428751625794/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:35:18 np0005604215.localdomain sudo[273574]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:19 np0005604215.localdomain sudo[273629]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ltxmszwfrpfkcjboutiaoisqeepwyiix ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938517.5381455-3028-115428751625794/AnsiballZ_systemd.py
Feb 01 09:35:19 np0005604215.localdomain sudo[273629]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:19 np0005604215.localdomain python3.9[273631]: ansible-systemd Invoked with state=started name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:35:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:35:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:35:20 np0005604215.localdomain systemd[1]: tmp-crun.VPj4L5.mount: Deactivated successfully.
Feb 01 09:35:20 np0005604215.localdomain podman[273635]: 2026-02-01 09:35:20.876914574 +0000 UTC m=+0.087059530 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:35:20 np0005604215.localdomain podman[273634]: 2026-02-01 09:35:20.924569511 +0000 UTC m=+0.135237664 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, release=1769056855, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, version=9.7, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64)
Feb 01 09:35:20 np0005604215.localdomain sudo[273629]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:20 np0005604215.localdomain podman[273635]: 2026-02-01 09:35:20.937651093 +0000 UTC m=+0.147796089 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 01 09:35:20 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:35:20 np0005604215.localdomain podman[273634]: 2026-02-01 09:35:20.987807566 +0000 UTC m=+0.198475769 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, container_name=openstack_network_exporter, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1769056855, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 01 09:35:20 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:35:22 np0005604215.localdomain python3.9[273778]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:35:23 np0005604215.localdomain python3.9[273886]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:35:23 np0005604215.localdomain python3.9[273994]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Feb 01 09:35:24 np0005604215.localdomain sudo[274102]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-filtwiwwjqwmccvgijhjrugccroljhpu ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938524.2520268-3197-36910393442074/AnsiballZ_podman_container.py
Feb 01 09:35:24 np0005604215.localdomain sudo[274102]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:24 np0005604215.localdomain python3.9[274104]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 01 09:35:25 np0005604215.localdomain sudo[274102]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:25 np0005604215.localdomain systemd-journald[47940]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 119.2 (397 of 333 items), suggesting rotation.
Feb 01 09:35:25 np0005604215.localdomain systemd-journald[47940]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 01 09:35:25 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 09:35:25 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 09:35:25 np0005604215.localdomain sudo[274235]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-fdioiaqtlkqpsfhjiiyyryrkidvkslxr ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938525.3603878-3221-50710474231240/AnsiballZ_systemd.py
Feb 01 09:35:25 np0005604215.localdomain sudo[274235]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:25 np0005604215.localdomain python3.9[274237]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Feb 01 09:35:26 np0005604215.localdomain systemd[1]: Stopping nova_compute container...
Feb 01 09:35:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:35:26 np0005604215.localdomain podman[274254]: 2026-02-01 09:35:26.859204529 +0000 UTC m=+0.077980361 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 09:35:26 np0005604215.localdomain podman[274254]: 2026-02-01 09:35:26.897801407 +0000 UTC m=+0.116577209 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 09:35:26 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:35:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:35:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:35:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:35:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146809 "" "Go-http-client/1.1"
Feb 01 09:35:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:35:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16189 "" "Go-http-client/1.1"
Feb 01 09:35:31 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:31.430 225589 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored
Feb 01 09:35:31 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:31.432 225589 DEBUG oslo_concurrency.lockutils [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:35:31 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:31.433 225589 DEBUG oslo_concurrency.lockutils [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:35:31 np0005604215.localdomain nova_compute[225585]: 2026-02-01 09:35:31.433 225589 DEBUG oslo_concurrency.lockutils [None req-a2410d21-6318-468a-bc6e-9c23e9caa3c1 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:35:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:35:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:35:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:35:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:35:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:35:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:35:31 np0005604215.localdomain virtqemud[224673]: End of file while reading data: Input/output error
Feb 01 09:35:31 np0005604215.localdomain systemd[1]: libpod-6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b.scope: Deactivated successfully.
Feb 01 09:35:31 np0005604215.localdomain systemd[1]: libpod-6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b.scope: Consumed 16.385s CPU time.
Feb 01 09:35:31 np0005604215.localdomain podman[274241]: 2026-02-01 09:35:31.779335304 +0000 UTC m=+5.761665395 container died 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0)
Feb 01 09:35:31 np0005604215.localdomain systemd[1]: tmp-crun.CHK18T.mount: Deactivated successfully.
Feb 01 09:35:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b-userdata-shm.mount: Deactivated successfully.
Feb 01 09:35:31 np0005604215.localdomain podman[274241]: 2026-02-01 09:35:31.9439394 +0000 UTC m=+5.926269461 container cleanup 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=nova_compute)
Feb 01 09:35:31 np0005604215.localdomain podman[274241]: nova_compute
Feb 01 09:35:32 np0005604215.localdomain podman[274300]: error opening file `/run/crun/6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b/status`: No such file or directory
Feb 01 09:35:32 np0005604215.localdomain podman[274288]: 2026-02-01 09:35:32.043468103 +0000 UTC m=+0.067004873 container cleanup 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 01 09:35:32 np0005604215.localdomain podman[274288]: nova_compute
Feb 01 09:35:32 np0005604215.localdomain systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Feb 01 09:35:32 np0005604215.localdomain systemd[1]: Stopped nova_compute container.
Feb 01 09:35:32 np0005604215.localdomain systemd[1]: Starting nova_compute container...
Feb 01 09:35:32 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:35:32 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Feb 01 09:35:32 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Feb 01 09:35:32 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 09:35:32 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Feb 01 09:35:32 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/54995506782eae95026b9f51b92ada7520fd2a9b50cd4bb5084f3bc717596538/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Feb 01 09:35:32 np0005604215.localdomain podman[274302]: 2026-02-01 09:35:32.188954401 +0000 UTC m=+0.114943838 container init 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:35:32 np0005604215.localdomain podman[274302]: 2026-02-01 09:35:32.198477794 +0000 UTC m=+0.124467251 container start 6376ec1aa7e36dfa9d6482e2e2c123bbd0b30c6c27b1932547ca6ecc8b480f1b (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath', '/etc/multipath.conf:/etc/multipath.conf:ro,Z', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:35:32 np0005604215.localdomain podman[274302]: nova_compute
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: + sudo -E kolla_set_configs
Feb 01 09:35:32 np0005604215.localdomain systemd[1]: Started nova_compute container.
Feb 01 09:35:32 np0005604215.localdomain sudo[274235]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Validating config file
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying service configuration files
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Deleting /etc/nova/nova.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Deleting /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying /var/lib/kolla/config_files/03-ceph-nova.conf to /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/03-ceph-nova.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Deleting /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying /var/lib/kolla/config_files/99-nova-compute-cells-workarounds.conf to /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/99-nova-compute-cells-workarounds.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Deleting /etc/ceph
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Creating directory /etc/ceph
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /etc/ceph
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.client.openstack.keyring to /etc/ceph/ceph.client.openstack.keyring
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying /var/lib/kolla/config_files/ceph/ceph.conf to /etc/ceph/ceph.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Writing out command to execute
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /etc/ceph/ceph.client.openstack.keyring
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /etc/ceph/ceph.conf
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: ++ cat /run_command
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: + CMD=nova-compute
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: + ARGS=
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: + sudo kolla_copy_cacerts
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: + [[ ! -n '' ]]
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: + . kolla_extend_start
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: Running command: 'nova-compute'
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: + echo 'Running command: '\''nova-compute'\'''
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: + umask 0022
Feb 01 09:35:32 np0005604215.localdomain nova_compute[274317]: + exec nova-compute
Feb 01 09:35:32 np0005604215.localdomain sudo[274346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:35:32 np0005604215.localdomain sudo[274346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:35:32 np0005604215.localdomain sudo[274346]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:32 np0005604215.localdomain sudo[274364]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:35:32 np0005604215.localdomain sudo[274364]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:35:33 np0005604215.localdomain sudo[274364]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65433 DF PROTO=TCP SPT=45760 DPT=9102 SEQ=4111040520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA667FFD10000000001030307) 
Feb 01 09:35:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:33.950 274321 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 01 09:35:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:33.950 274321 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 01 09:35:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:33.950 274321 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44
Feb 01 09:35:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:33.950 274321 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs
Feb 01 09:35:34 np0005604215.localdomain sudo[274417]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:35:34 np0005604215.localdomain sudo[274417]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:35:34 np0005604215.localdomain sudo[274417]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.067 274321 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.091 274321 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.024s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.091 274321 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473
Feb 01 09:35:34 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65434 DF PROTO=TCP SPT=45760 DPT=9102 SEQ=4111040520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66803CE0000000001030307) 
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.548 274321 INFO nova.virt.driver [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.660 274321 INFO nova.compute.provider_config [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.667 274321 DEBUG oslo_concurrency.lockutils [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.667 274321 DEBUG oslo_concurrency.lockutils [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.667 274321 DEBUG oslo_concurrency.lockutils [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.667 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.668 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.668 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.668 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.668 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.668 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.668 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.668 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.668 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.669 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.669 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.669 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.669 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.669 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.669 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.669 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.670 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.670 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.670 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.670 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] console_host                   = np0005604215.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.670 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.670 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.670 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.670 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.671 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.671 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.671 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.671 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.671 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.671 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.671 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.672 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.672 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.672 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.672 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.672 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.672 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.672 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.673 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] host                           = np0005604215.localdomain log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.673 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.673 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.673 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.673 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.673 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.673 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.674 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.674 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.674 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.674 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.674 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.674 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.674 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.674 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.675 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.675 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.675 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.675 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.675 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.675 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.675 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.676 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.676 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.676 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.676 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.676 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.676 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.676 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.677 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.677 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.677 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.677 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.677 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.678 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.678 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.679 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.679 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.679 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.679 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.680 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.680 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.680 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] my_block_storage_ip            = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.680 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] my_ip                          = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.680 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.681 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.681 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.681 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.681 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.681 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.681 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.681 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.681 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.682 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.682 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.682 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.682 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.682 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.682 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.682 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.682 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.683 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.683 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.683 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.683 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.683 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.683 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.683 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.683 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.684 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.684 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.684 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.684 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.684 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.684 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.684 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.684 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.685 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.685 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.685 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.685 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.685 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.685 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.685 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.686 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.686 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.686 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.686 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.686 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.686 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.686 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.686 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.687 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.687 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.687 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.687 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.687 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.687 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.687 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.687 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.688 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.688 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.688 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.688 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.688 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.688 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.689 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.689 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.689 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.689 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.689 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.689 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.689 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.690 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.690 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.690 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.690 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.690 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.690 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.690 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.691 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.691 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.691 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.691 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.691 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.691 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.691 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.692 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.692 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.692 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.692 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.692 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.692 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.692 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.693 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.693 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.693 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.693 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.693 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.693 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.693 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.694 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.694 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.694 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.694 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.694 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.694 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.694 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.695 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.695 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.695 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.695 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.695 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.695 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.695 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.695 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.696 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.696 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.696 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.696 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.696 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.696 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.696 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.697 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.697 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.697 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.697 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.697 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.697 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.697 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.698 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.698 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.698 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.698 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.698 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.698 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.698 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.699 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.699 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.699 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.699 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.699 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.699 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.699 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.700 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.700 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.700 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.700 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.700 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.700 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.700 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.701 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.701 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.701 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.701 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.701 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.701 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.701 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.702 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.702 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.702 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.702 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.702 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.702 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.702 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.703 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.703 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.703 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.703 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.703 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.703 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.703 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.703 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.704 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.704 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.704 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.704 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.704 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.704 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.704 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.705 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.705 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.705 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.705 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.705 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.705 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.705 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.706 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.706 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.706 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.706 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.706 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.706 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.706 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.706 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.707 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.707 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.707 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.707 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.707 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.707 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.707 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.708 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.708 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.708 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.708 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.708 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.708 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.708 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.709 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.709 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.709 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.709 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.709 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.709 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.709 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.709 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.710 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.710 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.710 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.710 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.710 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.710 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.710 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.711 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.711 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.711 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.711 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.711 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.711 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.711 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.712 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.712 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.712 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.712 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.712 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.712 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.712 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.713 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.713 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.713 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.713 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.713 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.713 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.713 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.713 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.714 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.714 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.714 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.714 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.714 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.714 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.714 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.715 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.715 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.715 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.715 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.715 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.715 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.715 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.715 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.716 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.716 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.716 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.716 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.716 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.716 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.716 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.717 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.717 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.717 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.717 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.717 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.717 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.717 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.718 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.718 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.718 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.718 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.718 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.718 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.718 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.719 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.719 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.719 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.719 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.719 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.719 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.719 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.720 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.720 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.720 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.720 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.720 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.720 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.720 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.721 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.721 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.721 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.721 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.721 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.721 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.721 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.721 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.722 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.722 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.722 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.722 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.722 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.722 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.722 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.723 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.723 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.723 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.723 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.723 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.723 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.723 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.724 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.724 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.724 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.724 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.724 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.724 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.724 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.724 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.725 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.725 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.725 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.725 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.725 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.725 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.725 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.726 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.726 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.726 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.726 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.726 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.726 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.726 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.726 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.727 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.727 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.727 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.727 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.727 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.727 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.727 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.728 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.728 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.728 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.728 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.728 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.728 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.728 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.728 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.729 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.729 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.729 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.729 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.729 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.729 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.729 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.730 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.730 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.730 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.730 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.730 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.730 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.730 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.731 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.731 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.731 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.731 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.731 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.cpu_mode               = host-model log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.731 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.731 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.cpu_models             = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.731 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.732 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.732 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.732 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.732 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.732 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.732 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.732 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.733 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.733 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.733 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.733 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.733 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.733 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.images_rbd_ceph_conf   = /etc/ceph/ceph.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.733 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.734 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.734 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.images_rbd_glance_store_name = default_backend log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.734 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.images_rbd_pool        = vms log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.734 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.images_type            = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.734 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.734 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.734 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.735 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.735 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.735 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.735 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.735 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.735 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.735 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.735 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.736 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.736 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.736 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.736 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.736 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.736 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.736 274321 WARNING oslo_config.cfg [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: live_migration_uri is deprecated for removal in favor of two other options that
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: and ``live_migration_inbound_addr`` respectively.
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: ).  Its value may be silently ignored in the future.
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.737 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_uri     = qemu+ssh://nova@%s/system?keyfile=/var/lib/nova/.ssh/ssh-privatekey log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.737 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.live_migration_with_native_tls = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.737 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.737 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.737 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.737 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.738 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.738 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.738 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.738 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.738 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.738 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.738 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.739 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.739 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.739 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.739 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.739 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.739 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.rbd_secret_uuid        = 33fac0b9-80c7-560f-918a-c92d3021ca1e log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.739 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.rbd_user               = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.739 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.740 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.740 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.740 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.740 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.740 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.740 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.740 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.741 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.741 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.741 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.741 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.741 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.741 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.742 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.742 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.742 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.742 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.742 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.742 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.742 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.743 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.743 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.743 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.743 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.743 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.743 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.743 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.743 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.744 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.744 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.744 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.744 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.744 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.744 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.744 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.745 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.745 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.745 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.745 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.745 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.745 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.745 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.746 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.746 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.746 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.746 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.746 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.746 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.746 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.747 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.747 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.747 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.747 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.747 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.747 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.747 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.747 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.748 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.748 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.748 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.748 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.748 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.748 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.748 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.749 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.749 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.749 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.749 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.749 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.749 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.auth_url             = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.750 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.750 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.750 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.750 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.750 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.750 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.750 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.750 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.751 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.751 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.751 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.751 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.751 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.751 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.751 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.752 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.752 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.752 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.752 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.752 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.752 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.752 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.753 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.753 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.753 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.753 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.753 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.753 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.753 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.754 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.754 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.754 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.754 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.754 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.754 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.754 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.755 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.755 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.755 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.755 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.755 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.755 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.755 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.756 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.756 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.756 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.756 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.756 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.756 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.757 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.757 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.757 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.757 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.757 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.757 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.757 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.758 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.758 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.758 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.758 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.758 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.758 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.758 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.758 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.759 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.759 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.759 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.759 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.759 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.759 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.759 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.760 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.760 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.760 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.760 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.760 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.760 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.760 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.761 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.761 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.761 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.761 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.761 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.761 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.761 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.762 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.762 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.762 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.762 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.762 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.762 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.762 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.763 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.763 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.763 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.763 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.763 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.763 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.763 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.764 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.764 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.764 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.764 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.764 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.764 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.764 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.765 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.765 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.765 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.765 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.765 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.765 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.766 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.766 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.766 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.766 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.766 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.766 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.766 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.766 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.767 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.767 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.767 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.767 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.767 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.767 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.767 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.767 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.768 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.768 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.768 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.768 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.768 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.768 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.768 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.768 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.769 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.769 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.769 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.769 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.769 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.769 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.769 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.770 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.770 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.770 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.770 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.770 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.770 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.770 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.770 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.771 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.771 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.771 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.771 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.771 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.771 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.771 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vnc.novncproxy_base_url        = http://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.772 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.772 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.772 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.772 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vnc.server_proxyclient_address = 192.168.122.108 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.772 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.772 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.772 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.773 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.773 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.773 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.773 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.773 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.773 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.773 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.773 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.774 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.774 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.774 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.774 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.774 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.774 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.774 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.775 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.775 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.775 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.775 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.775 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.775 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.775 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.775 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.776 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.776 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.776 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.776 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.776 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.776 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.776 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.777 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.777 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.777 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.777 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.777 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.777 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.777 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.778 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.778 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.778 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.778 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.778 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.778 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.778 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.778 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.779 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.779 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.779 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.779 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.779 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.779 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.780 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.780 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.780 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.780 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.780 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.780 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.780 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.780 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.781 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.781 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.781 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.781 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.781 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.781 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.781 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.782 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.782 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.782 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.782 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.782 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.782 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.782 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.782 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.783 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.783 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.783 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.783 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.783 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.783 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.783 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.784 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.784 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.784 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.784 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.784 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.784 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.784 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.auth_url            = http://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.785 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.785 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.785 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.785 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.785 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.785 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.785 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.786 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.786 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.786 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.786 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.786 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.786 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.786 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.786 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.787 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.787 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.787 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.787 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.787 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.787 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.787 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.788 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.788 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.788 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.788 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.788 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.788 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.789 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.789 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.789 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.789 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.789 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.789 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.789 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.789 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.790 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.790 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.790 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.790 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.790 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.790 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.790 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.791 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.791 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.791 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.791 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.791 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.791 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.791 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.792 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.792 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.792 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.792 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.792 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.792 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.792 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.792 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.793 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.793 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.793 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.793 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.793 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.793 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.793 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.794 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.794 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.794 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.794 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.794 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.794 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.794 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.795 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.795 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.795 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.795 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.795 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.795 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.795 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.795 274321 DEBUG oslo_service.service [None req-d404af50-9067-4cca-9774-28a606849fca - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.796 274321 INFO nova.service [-] Starting compute node (version 27.5.2-0.20260127144738.eaa65f0.el9)
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.809 274321 INFO nova.virt.node [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Determined node identity d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from /var/lib/nova/compute_id
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.809 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.810 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.810 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.810 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.820 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7fa30f6e9190> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.822 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7fa30f6e9190> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.823 274321 INFO nova.virt.libvirt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Connection event '1' reason 'None'
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.828 274321 INFO nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Libvirt host capabilities <capabilities>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <host>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <uuid>b72fb799-3472-4728-b6e2-ec98d2bbb61b</uuid>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <cpu>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <arch>x86_64</arch>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model>EPYC-Rome-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <vendor>AMD</vendor>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <microcode version='16777317'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <signature family='23' model='49' stepping='0'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <maxphysaddr mode='emulate' bits='40'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='x2apic'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='tsc-deadline'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='osxsave'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='hypervisor'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='tsc_adjust'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='spec-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='stibp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='arch-capabilities'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='ssbd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='cmp_legacy'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='topoext'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='virt-ssbd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='lbrv'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='tsc-scale'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='vmcb-clean'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='pause-filter'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='pfthreshold'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='svme-addr-chk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='rdctl-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='skip-l1dfl-vmentry'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='mds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature name='pschange-mc-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <pages unit='KiB' size='4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <pages unit='KiB' size='2048'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <pages unit='KiB' size='1048576'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </cpu>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <power_management>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <suspend_mem/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <suspend_disk/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <suspend_hybrid/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </power_management>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <iommu support='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <migration_features>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <live/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <uri_transports>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <uri_transport>tcp</uri_transport>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <uri_transport>rdma</uri_transport>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </uri_transports>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </migration_features>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <topology>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <cells num='1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <cell id='0'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:           <memory unit='KiB'>16116604</memory>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:           <pages unit='KiB' size='4'>4029151</pages>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:           <pages unit='KiB' size='2048'>0</pages>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:           <pages unit='KiB' size='1048576'>0</pages>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:           <distances>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:             <sibling id='0' value='10'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:           </distances>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:           <cpus num='8'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:             <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:             <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:             <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:             <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:             <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:             <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:             <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:             <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:           </cpus>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         </cell>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </cells>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </topology>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <cache>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </cache>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <secmodel>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model>selinux</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <doi>0</doi>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </secmodel>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <secmodel>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model>dac</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <doi>0</doi>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <baselabel type='kvm'>+107:+107</baselabel>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <baselabel type='qemu'>+107:+107</baselabel>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </secmodel>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </host>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <guest>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <os_type>hvm</os_type>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <arch name='i686'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <wordsize>32</wordsize>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <domain type='qemu'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <domain type='kvm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </arch>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <features>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <pae/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <nonpae/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <acpi default='on' toggle='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <apic default='on' toggle='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <cpuselection/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <deviceboot/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <disksnapshot default='on' toggle='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <externalSnapshot/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </features>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </guest>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <guest>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <os_type>hvm</os_type>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <arch name='x86_64'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <wordsize>64</wordsize>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <emulator>/usr/libexec/qemu-kvm</emulator>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <domain type='qemu'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <domain type='kvm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </arch>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <features>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <acpi default='on' toggle='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <apic default='on' toggle='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <cpuselection/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <deviceboot/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <disksnapshot default='on' toggle='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <externalSnapshot/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </features>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </guest>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: </capabilities>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.833 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.835 274321 DEBUG nova.virt.libvirt.volume.mount [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.838 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: <domainCapabilities>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <domain>kvm</domain>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <arch>i686</arch>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <vcpu max='240'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <iothreads supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <os supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <enum name='firmware'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <loader supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>rom</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>pflash</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='readonly'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>yes</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>no</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='secure'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>no</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </loader>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </os>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <cpu>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>on</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>off</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='maximum' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='maximumMigratable'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>on</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>off</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='host-model' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <vendor>AMD</vendor>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='x2apic'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='stibp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='ssbd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='succor'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='lbrv'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='custom' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='ClearwaterForest'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ddpd-u'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sha512'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm3'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ddpd-u'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sha512'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm3'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Dhyana-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Turin'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibpb-brtype'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbpb'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibpb-brtype'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbpb'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v5'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-128'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-256'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-512'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-128'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-256'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-512'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-noTSX'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='KnightsMill'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4fmaps'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4vnniw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512er'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512pf'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='KnightsMill-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4fmaps'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4vnniw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512er'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512pf'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G5'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tbm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tbm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='athlon'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='athlon-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='core2duo'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='core2duo-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='coreduo'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='coreduo-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='n270'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='n270-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='phenom'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='phenom-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </cpu>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <memoryBacking supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <enum name='sourceType'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <value>file</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <value>anonymous</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <value>memfd</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </memoryBacking>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <devices>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <disk supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='diskDevice'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>disk</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>cdrom</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>floppy</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>lun</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='bus'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>ide</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>fdc</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>scsi</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>sata</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio-transitional</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio-non-transitional</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </disk>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <graphics supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vnc</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>egl-headless</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>dbus</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </graphics>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <video supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='modelType'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vga</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>cirrus</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>none</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>bochs</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>ramfb</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </video>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <hostdev supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='mode'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>subsystem</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='startupPolicy'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>default</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>mandatory</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>requisite</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>optional</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='subsysType'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>pci</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>scsi</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='capsType'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='pciBackend'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </hostdev>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <rng supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio-transitional</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio-non-transitional</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>random</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>egd</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>builtin</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </rng>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <filesystem supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='driverType'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>path</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>handle</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtiofs</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </filesystem>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <tpm supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>tpm-tis</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>tpm-crb</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>emulator</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>external</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='backendVersion'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>2.0</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </tpm>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <redirdev supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='bus'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </redirdev>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <channel supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>pty</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>unix</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </channel>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <crypto supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='model'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>qemu</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>builtin</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </crypto>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <interface supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='backendType'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>default</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>passt</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </interface>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <panic supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>isa</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>hyperv</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </panic>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <console supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>null</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vc</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>pty</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>dev</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>file</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>pipe</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>stdio</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>udp</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>tcp</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>unix</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>qemu-vdagent</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>dbus</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </console>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </devices>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <features>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <gic supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <vmcoreinfo supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <genid supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <backingStoreInput supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <backup supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <async-teardown supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <s390-pv supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <ps2 supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <tdx supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <sev supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <sgx supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <hyperv supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='features'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>relaxed</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vapic</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>spinlocks</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vpindex</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>runtime</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>synic</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>stimer</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>reset</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vendor_id</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>frequencies</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>reenlightenment</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>tlbflush</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>ipi</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>avic</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>emsr_bitmap</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>xmm_input</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <defaults>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <spinlocks>4095</spinlocks>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <stimer_direct>on</stimer_direct>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </defaults>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </hyperv>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <launchSecurity supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </features>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: </domainCapabilities>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.844 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: <domainCapabilities>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <domain>kvm</domain>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <arch>i686</arch>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <vcpu max='1024'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <iothreads supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <os supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <enum name='firmware'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <loader supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>rom</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>pflash</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='readonly'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>yes</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>no</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='secure'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>no</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </loader>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </os>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <cpu>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>on</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>off</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='maximum' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='maximumMigratable'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>on</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>off</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='host-model' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <vendor>AMD</vendor>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='x2apic'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='stibp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='ssbd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='succor'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='lbrv'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='custom' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='ClearwaterForest'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ddpd-u'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sha512'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm3'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ddpd-u'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sha512'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm3'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Dhyana-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Turin'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibpb-brtype'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbpb'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibpb-brtype'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbpb'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v5'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-128'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-256'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-512'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-128'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-256'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-512'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-noTSX'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='KnightsMill'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4fmaps'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4vnniw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512er'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512pf'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='KnightsMill-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4fmaps'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4vnniw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512er'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512pf'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G5'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tbm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tbm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='athlon'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='athlon-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='core2duo'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='core2duo-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='coreduo'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='coreduo-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='n270'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='n270-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='phenom'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='phenom-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </cpu>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <memoryBacking supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <enum name='sourceType'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <value>file</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <value>anonymous</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <value>memfd</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </memoryBacking>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <devices>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <disk supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='diskDevice'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>disk</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>cdrom</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>floppy</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>lun</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='bus'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>fdc</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>scsi</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>sata</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio-transitional</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio-non-transitional</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </disk>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <graphics supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vnc</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>egl-headless</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>dbus</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </graphics>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <video supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='modelType'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vga</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>cirrus</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>none</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>bochs</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>ramfb</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </video>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <hostdev supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='mode'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>subsystem</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='startupPolicy'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>default</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>mandatory</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>requisite</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>optional</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='subsysType'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>pci</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>scsi</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='capsType'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='pciBackend'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </hostdev>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <rng supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio-transitional</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtio-non-transitional</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>random</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>egd</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>builtin</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </rng>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <filesystem supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='driverType'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>path</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>handle</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>virtiofs</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </filesystem>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <tpm supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>tpm-tis</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>tpm-crb</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>emulator</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>external</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='backendVersion'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>2.0</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </tpm>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <redirdev supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='bus'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </redirdev>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <channel supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>pty</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>unix</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </channel>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <crypto supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='model'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>qemu</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>builtin</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </crypto>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <interface supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='backendType'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>default</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>passt</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </interface>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <panic supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>isa</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>hyperv</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </panic>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <console supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>null</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vc</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>pty</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>dev</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>file</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>pipe</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>stdio</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>udp</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>tcp</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>unix</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>qemu-vdagent</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>dbus</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </console>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </devices>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <features>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <gic supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <vmcoreinfo supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <genid supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <backingStoreInput supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <backup supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <async-teardown supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <s390-pv supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <ps2 supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <tdx supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <sev supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <sgx supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <hyperv supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='features'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>relaxed</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vapic</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>spinlocks</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vpindex</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>runtime</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>synic</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>stimer</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>reset</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>vendor_id</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>frequencies</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>reenlightenment</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>tlbflush</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>ipi</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>avic</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>emsr_bitmap</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>xmm_input</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <defaults>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <spinlocks>4095</spinlocks>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <stimer_direct>on</stimer_direct>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </defaults>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </hyperv>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <launchSecurity supported='no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </features>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: </domainCapabilities>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.918 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.924 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]: <domainCapabilities>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <domain>kvm</domain>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <machine>pc-i440fx-rhel7.6.0</machine>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <arch>x86_64</arch>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <vcpu max='240'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <iothreads supported='yes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <os supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <enum name='firmware'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <loader supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>rom</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>pflash</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='readonly'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>yes</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>no</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='secure'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>no</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </loader>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   </os>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:   <cpu>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>on</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>off</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='maximum' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <enum name='maximumMigratable'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>on</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <value>off</value>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='host-model' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <vendor>AMD</vendor>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='x2apic'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='stibp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='ssbd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='succor'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='lbrv'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:     <mode name='custom' supported='yes'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='ClearwaterForest'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ddpd-u'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sha512'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm3'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ddpd-u'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sha512'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm3'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='sm4'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v3'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='Dhyana-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan'>
Feb 01 09:35:34 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Turin'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibpb-brtype'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbpb'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibpb-brtype'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbpb'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v5'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-128'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-256'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-512'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-128'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-256'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-512'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-noTSX'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='KnightsMill'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4fmaps'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4vnniw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512er'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512pf'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='KnightsMill-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4fmaps'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4vnniw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512er'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512pf'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G5'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tbm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tbm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='athlon'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='athlon-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='core2duo'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='core2duo-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='coreduo'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='coreduo-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='n270'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='n270-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='phenom'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='phenom-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   </cpu>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <memoryBacking supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <enum name='sourceType'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <value>file</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <value>anonymous</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <value>memfd</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   </memoryBacking>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <devices>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <disk supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='diskDevice'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>disk</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>cdrom</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>floppy</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>lun</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='bus'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>ide</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>fdc</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>scsi</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>sata</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio-transitional</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio-non-transitional</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </disk>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <graphics supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vnc</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>egl-headless</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>dbus</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </graphics>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <video supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='modelType'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vga</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>cirrus</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>none</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>bochs</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>ramfb</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </video>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <hostdev supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='mode'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>subsystem</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='startupPolicy'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>default</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>mandatory</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>requisite</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>optional</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='subsysType'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>pci</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>scsi</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='capsType'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='pciBackend'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </hostdev>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <rng supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio-transitional</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio-non-transitional</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>random</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>egd</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>builtin</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </rng>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <filesystem supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='driverType'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>path</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>handle</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtiofs</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </filesystem>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <tpm supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>tpm-tis</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>tpm-crb</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>emulator</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>external</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='backendVersion'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>2.0</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </tpm>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <redirdev supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='bus'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </redirdev>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <channel supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>pty</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>unix</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </channel>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <crypto supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='model'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>qemu</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>builtin</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </crypto>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <interface supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='backendType'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>default</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>passt</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </interface>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <panic supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>isa</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>hyperv</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </panic>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <console supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>null</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vc</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>pty</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>dev</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>file</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>pipe</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>stdio</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>udp</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>tcp</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>unix</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>qemu-vdagent</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>dbus</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </console>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   </devices>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <features>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <gic supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <vmcoreinfo supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <genid supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <backingStoreInput supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <backup supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <async-teardown supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <s390-pv supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <ps2 supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <tdx supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <sev supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <sgx supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <hyperv supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='features'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>relaxed</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vapic</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>spinlocks</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vpindex</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>runtime</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>synic</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>stimer</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>reset</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vendor_id</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>frequencies</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>reenlightenment</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>tlbflush</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>ipi</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>avic</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>emsr_bitmap</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>xmm_input</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <defaults>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <spinlocks>4095</spinlocks>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <stimer_direct>on</stimer_direct>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </defaults>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </hyperv>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <launchSecurity supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   </features>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: </domainCapabilities>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:34.991 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: <domainCapabilities>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <path>/usr/libexec/qemu-kvm</path>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <domain>kvm</domain>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <machine>pc-q35-rhel9.8.0</machine>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <arch>x86_64</arch>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <vcpu max='1024'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <iothreads supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <os supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <enum name='firmware'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <value>efi</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <loader supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>rom</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>pflash</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='readonly'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>yes</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>no</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='secure'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>yes</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>no</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </loader>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   </os>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <cpu>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <mode name='host-passthrough' supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='hostPassthroughMigratable'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>on</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>off</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <mode name='maximum' supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='maximumMigratable'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>on</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>off</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <mode name='host-model' supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model fallback='forbid'>EPYC-Rome</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <vendor>AMD</vendor>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <maxphysaddr mode='passthrough' limit='40'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='x2apic'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc-deadline'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='hypervisor'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc_adjust'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='spec-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='stibp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='ssbd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='cmp_legacy'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='overflow-recov'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='succor'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='ibrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='amd-ssbd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='virt-ssbd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='lbrv'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='tsc-scale'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='vmcb-clean'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='pause-filter'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='pfthreshold'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='svme-addr-chk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='require' name='lfence-always-serializing'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <feature policy='disable' name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <mode name='custom' supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-noTSX'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-noTSX-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Broadwell-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Broadwell-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-noTSX'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cascadelake-Server-v5'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='ClearwaterForest-v1'>ClearwaterForest</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='ClearwaterForest'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ddpd-u'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sha512'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sm3'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sm4'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>ClearwaterForest-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='ClearwaterForest-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ddpd-u'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sha512'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sm3'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sm4'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Cooperlake-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Denverton-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Denverton-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Dhyana-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Genoa-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Genoa-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Milan-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Milan-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Rome-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-Rome-v5</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='EPYC-Turin-v1'>EPYC-Turin</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Turin'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibpb-brtype'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbpb'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-Turin-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-Turin-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amd-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='auto-ibrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vp2intersect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fs-gs-base-ns'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibpb-brtype'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='no-nested-data-bp'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='null-sel-clr-base'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='perfmon-v2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbpb'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='srso-user-kernel-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='stibp-always-on'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='AMD'>EPYC-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>EPYC-v5</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='EPYC-v5'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-128'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-256'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-512'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>GraniteRapids-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='GraniteRapids-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-128'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-256'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx10-512'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='prefetchiti'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-noTSX'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-noTSX-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Haswell-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Haswell-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-noTSX'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v5'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v6'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Icelake-Server-v7'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='IvyBridge-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='KnightsMill'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4fmaps'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4vnniw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512er'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512pf'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='KnightsMill-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4fmaps'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-4vnniw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512er'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512pf'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G4-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G5'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tbm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Opteron_G5-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fma4'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tbm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xop'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SapphireRapids-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SapphireRapids-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='amx-tile'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-bf16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-fp16'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512-vpopcntdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bitalg'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vbmi2'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrc'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fzrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='la57'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='taa-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='tsx-ldtrk'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>SierraForest-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='SierraForest-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ifma'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-ne-convert'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx-vnni-int8'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bhi-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='bus-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cmpccxadd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fbsdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='fsrs'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ibrs-all'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='intel-psfd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ipred-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='lam'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mcdt-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pbrsb-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='psdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rfds-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rrsba-ctrl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='sbdr-ssdp-no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='serialize'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vaes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='vpclmulqdq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-noTSX-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Client-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-noTSX-IBRS'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='hle'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='rtm'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Skylake-Server-v5'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512bw'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512cd'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512dq'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512f'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='avx512vl'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='invpcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pcid'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='pku'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='mpx'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v2'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v3'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='core-capability'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='split-lock-detect'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' vendor='Intel'>Snowridge-v4</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='Snowridge-v4'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='cldemote'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='erms'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='gfni'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdir64b'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='movdiri'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='xsaves'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Westmere-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' vendor='Intel'>Westmere-v2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='athlon'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='athlon-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='core2duo'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='core2duo-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='coreduo'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='coreduo-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='n270'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='n270-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='ss'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='phenom'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <blockers model='phenom-v1'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnow'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <feature name='3dnowext'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </blockers>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </mode>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   </cpu>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <memoryBacking supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <enum name='sourceType'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <value>file</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <value>anonymous</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <value>memfd</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   </memoryBacking>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <devices>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <disk supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='diskDevice'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>disk</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>cdrom</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>floppy</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>lun</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='bus'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>fdc</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>scsi</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>sata</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio-transitional</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio-non-transitional</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </disk>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <graphics supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vnc</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>egl-headless</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>dbus</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </graphics>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <video supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='modelType'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vga</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>cirrus</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>none</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>bochs</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>ramfb</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </video>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <hostdev supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='mode'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>subsystem</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='startupPolicy'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>default</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>mandatory</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>requisite</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>optional</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='subsysType'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>pci</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>scsi</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='capsType'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='pciBackend'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </hostdev>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <rng supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio-transitional</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtio-non-transitional</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>random</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>egd</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>builtin</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </rng>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <filesystem supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='driverType'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>path</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>handle</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>virtiofs</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </filesystem>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <tpm supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>tpm-tis</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>tpm-crb</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>emulator</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>external</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='backendVersion'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>2.0</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </tpm>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <redirdev supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='bus'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>usb</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </redirdev>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <channel supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>pty</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>unix</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </channel>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <crypto supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='model'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>qemu</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='backendModel'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>builtin</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </crypto>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <interface supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='backendType'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>default</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>passt</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </interface>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <panic supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='model'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>isa</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>hyperv</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </panic>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <console supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='type'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>null</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vc</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>pty</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>dev</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>file</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>pipe</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>stdio</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>udp</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>tcp</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>unix</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>qemu-vdagent</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>dbus</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </console>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   </devices>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   <features>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <gic supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <vmcoreinfo supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <genid supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <backingStoreInput supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <backup supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <async-teardown supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <s390-pv supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <ps2 supported='yes'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <tdx supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <sev supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <sgx supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <hyperv supported='yes'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <enum name='features'>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>relaxed</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vapic</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>spinlocks</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vpindex</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>runtime</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>synic</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>stimer</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>reset</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>vendor_id</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>frequencies</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>reenlightenment</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>tlbflush</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>ipi</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>avic</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>emsr_bitmap</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <value>xmm_input</value>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </enum>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       <defaults>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <spinlocks>4095</spinlocks>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <stimer_direct>on</stimer_direct>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <tlbflush_direct>off</tlbflush_direct>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <tlbflush_extended>off</tlbflush_extended>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:         <vendor_id>Linux KVM Hv</vendor_id>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:       </defaults>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     </hyperv>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:     <launchSecurity supported='no'/>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:   </features>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: </domainCapabilities>
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]:  _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.053 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.053 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.058 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.058 274321 INFO nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Secure Boot support detected
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.060 274321 INFO nova.virt.libvirt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.060 274321 INFO nova.virt.libvirt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.069 274321 DEBUG nova.virt.libvirt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.087 274321 INFO nova.virt.node [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Determined node identity d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from /var/lib/nova/compute_id
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.102 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Verified node d5eeed9a-e4d0-4244-8d4e-39e5c8263590 matches my host np0005604215.localdomain _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.143 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.281 274321 DEBUG oslo_concurrency.lockutils [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.282 274321 DEBUG oslo_concurrency.lockutils [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.282 274321 DEBUG oslo_concurrency.lockutils [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.282 274321 DEBUG nova.compute.resource_tracker [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.282 274321 DEBUG oslo_concurrency.processutils [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:35:35 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29871 DF PROTO=TCP SPT=58798 DPT=9102 SEQ=3651054409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA668070E0000000001030307) 
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.774 274321 DEBUG oslo_concurrency.processutils [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.492s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:35:35 np0005604215.localdomain sudo[274570]:     zuul : TTY=pts/0 ; PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mnplwxtwcpjfxgeekxzhmlxgqghhrxea ; /usr/bin/python3.9 /home/zuul/.ansible/tmp/ansible-tmp-1769938535.5251293-3248-174577067705260/AnsiballZ_podman_container.py
Feb 01 09:35:35 np0005604215.localdomain sudo[274570]: pam_unix(sudo:session): session opened for user root(uid=0) by zuul(uid=1000)
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.955 274321 WARNING nova.virt.libvirt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.956 274321 DEBUG nova.compute.resource_tracker [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12906MB free_disk=41.83720779418945GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.956 274321 DEBUG oslo_concurrency.lockutils [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:35:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:35.956 274321 DEBUG oslo_concurrency.lockutils [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:35:36 np0005604215.localdomain python3.9[274572]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None 
preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.136 274321 DEBUG nova.compute.resource_tracker [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.138 274321 DEBUG nova.compute.resource_tracker [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.197 274321 DEBUG nova.scheduler.client.report [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.224 274321 DEBUG nova.scheduler.client.report [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.225 274321 DEBUG nova.compute.provider_tree [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.242 274321 DEBUG nova.scheduler.client.report [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.275 274321 DEBUG nova.scheduler.client.report [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AVX2,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AMD_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.299 274321 DEBUG oslo_concurrency.processutils [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:35:36 np0005604215.localdomain systemd[1]: Started libpod-conmon-01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d.scope.
Feb 01 09:35:36 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:35:36 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02df9e1e28dd9ec9663cf49666c784b8876dc545b7721fca6e88de98c1c0890/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Feb 01 09:35:36 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02df9e1e28dd9ec9663cf49666c784b8876dc545b7721fca6e88de98c1c0890/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Feb 01 09:35:36 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a02df9e1e28dd9ec9663cf49666c784b8876dc545b7721fca6e88de98c1c0890/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Feb 01 09:35:36 np0005604215.localdomain podman[274596]: 2026-02-01 09:35:36.370349181 +0000 UTC m=+0.132242071 container init 01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3)
Feb 01 09:35:36 np0005604215.localdomain podman[274596]: 2026-02-01 09:35:36.379897525 +0000 UTC m=+0.141790415 container start 01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_managed=true, container_name=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']})
Feb 01 09:35:36 np0005604215.localdomain python3.9[274572]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Applying nova statedir ownership
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Checking uid: 0 gid: 0 path: /var/lib/nova/delay-nova-compute
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache already 42436:42436
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache to system_u:object_r:container_file_t:s0
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Ownership of /var/lib/nova/.cache/python-entrypoints already 42436:42436
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.cache/python-entrypoints to system_u:object_r:container_file_t:s0
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/fc52238ffcbdcb325c6bf3fe6412477fc4bdb6cd9151f39289b74f25e08e0db9
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.cache/python-entrypoints/d301d14069645d8c23fee2987984776b3e88a570e1aa96d6cf3e31fa880385fd
Feb 01 09:35:36 np0005604215.localdomain nova_compute_init[274616]: INFO:nova_statedir:Nova statedir ownership complete
Feb 01 09:35:36 np0005604215.localdomain systemd[1]: libpod-01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d.scope: Deactivated successfully.
Feb 01 09:35:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:35:36 np0005604215.localdomain podman[274648]: 2026-02-01 09:35:36.523230416 +0000 UTC m=+0.059893865 container died 01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:35:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65435 DF PROTO=TCP SPT=45760 DPT=9102 SEQ=4111040520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6680BCD0000000001030307) 
Feb 01 09:35:36 np0005604215.localdomain sudo[274570]: pam_unix(sudo:session): session closed for user root
Feb 01 09:35:36 np0005604215.localdomain podman[274655]: 2026-02-01 09:35:36.616705103 +0000 UTC m=+0.132482668 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:35:36 np0005604215.localdomain podman[274655]: 2026-02-01 09:35:36.628556967 +0000 UTC m=+0.144334452 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:35:36 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:35:36 np0005604215.localdomain podman[274648]: 2026-02-01 09:35:36.661180892 +0000 UTC m=+0.197844281 container cleanup 01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, container_name=nova_compute_init, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:35:36 np0005604215.localdomain systemd[1]: libpod-conmon-01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d.scope: Deactivated successfully.
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.763 274321 DEBUG oslo_concurrency.processutils [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.769 274321 DEBUG nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.769 274321 INFO nova.virt.libvirt.host [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] kernel doesn't support AMD SEV
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.771 274321 DEBUG nova.compute.provider_tree [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.771 274321 DEBUG nova.virt.libvirt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.803 274321 DEBUG nova.scheduler.client.report [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.883 274321 DEBUG nova.compute.resource_tracker [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.883 274321 DEBUG oslo_concurrency.lockutils [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.884 274321 DEBUG nova.service [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.967 274321 DEBUG nova.service [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199
Feb 01 09:35:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:35:36.968 274321 DEBUG nova.servicegroup.drivers.db [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] DB_Driver: join new ServiceGroup member np0005604215.localdomain to the compute group, service = <Service: host=np0005604215.localdomain, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44
Feb 01 09:35:37 np0005604215.localdomain sshd[259602]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:35:37 np0005604215.localdomain systemd[1]: session-59.scope: Deactivated successfully.
Feb 01 09:35:37 np0005604215.localdomain systemd[1]: session-59.scope: Consumed 1min 15.889s CPU time.
Feb 01 09:35:37 np0005604215.localdomain systemd-logind[761]: Session 59 logged out. Waiting for processes to exit.
Feb 01 09:35:37 np0005604215.localdomain systemd-logind[761]: Removed session 59.
Feb 01 09:35:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-a02df9e1e28dd9ec9663cf49666c784b8876dc545b7721fca6e88de98c1c0890-merged.mount: Deactivated successfully.
Feb 01 09:35:37 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-01ee0565d74fab69009e84c8e8c677af0f5369f8891e268438df8736d6cfb27d-userdata-shm.mount: Deactivated successfully.
Feb 01 09:35:37 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=43891 DF PROTO=TCP SPT=55848 DPT=9102 SEQ=2030436982 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6680F0D0000000001030307) 
Feb 01 09:35:40 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65436 DF PROTO=TCP SPT=45760 DPT=9102 SEQ=4111040520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6681B8D0000000001030307) 
Feb 01 09:35:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:35:41.751 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:35:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:35:41.752 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:35:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:35:41.752 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:35:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:35:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:35:42 np0005604215.localdomain podman[274715]: 2026-02-01 09:35:42.863972384 +0000 UTC m=+0.078222909 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, container_name=ovn_controller)
Feb 01 09:35:42 np0005604215.localdomain podman[274715]: 2026-02-01 09:35:42.909599469 +0000 UTC m=+0.123849964 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 01 09:35:42 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:35:42 np0005604215.localdomain podman[274716]: 2026-02-01 09:35:42.926367495 +0000 UTC m=+0.135297856 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:35:42 np0005604215.localdomain podman[274716]: 2026-02-01 09:35:42.936883578 +0000 UTC m=+0.145813979 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:35:42 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:35:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65437 DF PROTO=TCP SPT=45760 DPT=9102 SEQ=4111040520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6683B0D0000000001030307) 
Feb 01 09:35:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:35:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:35:51 np0005604215.localdomain podman[274763]: 2026-02-01 09:35:51.86902212 +0000 UTC m=+0.083912014 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, name=ubi9/ubi-minimal, release=1769056855)
Feb 01 09:35:51 np0005604215.localdomain podman[274763]: 2026-02-01 09:35:51.883726532 +0000 UTC m=+0.098616446 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1769056855, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:35:51 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:35:51 np0005604215.localdomain podman[274764]: 2026-02-01 09:35:51.966890292 +0000 UTC m=+0.177923757 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:35:51 np0005604215.localdomain podman[274764]: 2026-02-01 09:35:51.974679601 +0000 UTC m=+0.185713086 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:35:51 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:35:54 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:35:54.651 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:35:54 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:35:54.653 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:35:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:35:57 np0005604215.localdomain podman[274798]: 2026-02-01 09:35:57.905143602 +0000 UTC m=+0.081730147 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:35:57 np0005604215.localdomain podman[274798]: 2026-02-01 09:35:57.913739207 +0000 UTC m=+0.090325732 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:35:57 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:36:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:36:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:36:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:36:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:36:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:36:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16316 "" "Go-http-client/1.1"
Feb 01 09:36:00 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:36:00.656 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:36:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:36:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:36:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:36:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:36:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:36:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:36:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50868 DF PROTO=TCP SPT=33804 DPT=9102 SEQ=429368433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66875020000000001030307) 
Feb 01 09:36:04 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50869 DF PROTO=TCP SPT=33804 DPT=9102 SEQ=429368433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA668790D0000000001030307) 
Feb 01 09:36:05 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65438 DF PROTO=TCP SPT=45760 DPT=9102 SEQ=4111040520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6687B0D0000000001030307) 
Feb 01 09:36:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50870 DF PROTO=TCP SPT=33804 DPT=9102 SEQ=429368433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA668810D0000000001030307) 
Feb 01 09:36:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:36:06 np0005604215.localdomain podman[274815]: 2026-02-01 09:36:06.830196186 +0000 UTC m=+0.081783488 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:36:06 np0005604215.localdomain podman[274815]: 2026-02-01 09:36:06.842809954 +0000 UTC m=+0.094397236 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:36:06 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:36:07 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=29872 DF PROTO=TCP SPT=58798 DPT=9102 SEQ=3651054409 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA668850E0000000001030307) 
Feb 01 09:36:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50871 DF PROTO=TCP SPT=33804 DPT=9102 SEQ=429368433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66890CD0000000001030307) 
Feb 01 09:36:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:36:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:36:13 np0005604215.localdomain podman[274840]: 2026-02-01 09:36:13.87167871 +0000 UTC m=+0.086584026 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:36:13 np0005604215.localdomain podman[274840]: 2026-02-01 09:36:13.881679608 +0000 UTC m=+0.096584954 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:36:13 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:36:13 np0005604215.localdomain podman[274839]: 2026-02-01 09:36:13.979586291 +0000 UTC m=+0.196561931 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_controller, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:36:14 np0005604215.localdomain podman[274839]: 2026-02-01 09:36:14.057736586 +0000 UTC m=+0.274712176 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3)
Feb 01 09:36:14 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:36:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50872 DF PROTO=TCP SPT=33804 DPT=9102 SEQ=429368433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA668B10D0000000001030307) 
Feb 01 09:36:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:36:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:36:22 np0005604215.localdomain podman[274889]: 2026-02-01 09:36:22.864159692 +0000 UTC m=+0.074217496 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.7, container_name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, architecture=x86_64, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 01 09:36:22 np0005604215.localdomain podman[274889]: 2026-02-01 09:36:22.876539062 +0000 UTC m=+0.086596936 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, architecture=x86_64, container_name=openstack_network_exporter, vcs-type=git, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855)
Feb 01 09:36:22 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:36:22 np0005604215.localdomain podman[274890]: 2026-02-01 09:36:22.930566276 +0000 UTC m=+0.137412670 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Feb 01 09:36:22 np0005604215.localdomain podman[274890]: 2026-02-01 09:36:22.959713202 +0000 UTC m=+0.166559626 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:36:22 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:36:25 np0005604215.localdomain sshd[274928]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:36:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:25.970 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:36:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:25.988 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:36:26 np0005604215.localdomain sshd[274928]: Invalid user qcp from 85.206.171.113 port 44752
Feb 01 09:36:26 np0005604215.localdomain sshd[274928]: Received disconnect from 85.206.171.113 port 44752:11: Bye Bye [preauth]
Feb 01 09:36:26 np0005604215.localdomain sshd[274928]: Disconnected from invalid user qcp 85.206.171.113 port 44752 [preauth]
Feb 01 09:36:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:36:28 np0005604215.localdomain podman[274930]: 2026-02-01 09:36:28.864183553 +0000 UTC m=+0.079290792 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 01 09:36:28 np0005604215.localdomain podman[274930]: 2026-02-01 09:36:28.876634556 +0000 UTC m=+0.091741795 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 01 09:36:28 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:36:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:36:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:36:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:36:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:36:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:36:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16304 "" "Go-http-client/1.1"
Feb 01 09:36:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:36:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:36:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:36:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:36:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:36:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:36:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51381 DF PROTO=TCP SPT=49496 DPT=9102 SEQ=1413387119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA668EA310000000001030307) 
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.102 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.102 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.103 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:36:34 np0005604215.localdomain sudo[274950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:36:34 np0005604215.localdomain sudo[274950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:36:34 np0005604215.localdomain sudo[274950]: pam_unix(sudo:session): session closed for user root
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.120 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.120 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.121 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.121 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.122 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.122 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.122 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.123 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.123 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.142 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.143 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.143 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.143 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.144 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:36:34 np0005604215.localdomain sudo[274968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:36:34 np0005604215.localdomain sudo[274968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:36:34 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51382 DF PROTO=TCP SPT=49496 DPT=9102 SEQ=1413387119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA668EE4E0000000001030307) 
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.587 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:36:34 np0005604215.localdomain sudo[274968]: pam_unix(sudo:session): session closed for user root
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.819 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.821 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12874MB free_disk=41.8370475769043GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.822 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.822 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.914 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.914 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:36:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:34.941 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:36:35 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50873 DF PROTO=TCP SPT=33804 DPT=9102 SEQ=429368433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA668F10D0000000001030307) 
Feb 01 09:36:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:35.392 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:36:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:35.398 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:36:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:35.422 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:36:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:35.425 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:36:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:36:35.425 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.603s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:36:35 np0005604215.localdomain sudo[275063]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:36:35 np0005604215.localdomain sudo[275063]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:36:35 np0005604215.localdomain sudo[275063]: pam_unix(sudo:session): session closed for user root
Feb 01 09:36:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51383 DF PROTO=TCP SPT=49496 DPT=9102 SEQ=1413387119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA668F64D0000000001030307) 
Feb 01 09:36:37 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=65439 DF PROTO=TCP SPT=45760 DPT=9102 SEQ=4111040520 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA668F90D0000000001030307) 
Feb 01 09:36:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:36:37 np0005604215.localdomain podman[275081]: 2026-02-01 09:36:37.871763897 +0000 UTC m=+0.082526011 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:36:37 np0005604215.localdomain podman[275081]: 2026-02-01 09:36:37.879094233 +0000 UTC m=+0.089856327 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:36:37 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:36:40 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51384 DF PROTO=TCP SPT=49496 DPT=9102 SEQ=1413387119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA669060D0000000001030307) 
Feb 01 09:36:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:36:41.752 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:36:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:36:41.752 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:36:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:36:41.752 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:36:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:36:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:36:44 np0005604215.localdomain systemd[1]: tmp-crun.44i2Q3.mount: Deactivated successfully.
Feb 01 09:36:44 np0005604215.localdomain podman[275105]: 2026-02-01 09:36:44.87193157 +0000 UTC m=+0.086417121 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:36:44 np0005604215.localdomain podman[275106]: 2026-02-01 09:36:44.917579654 +0000 UTC m=+0.129804045 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:36:44 np0005604215.localdomain podman[275106]: 2026-02-01 09:36:44.927654385 +0000 UTC m=+0.139878766 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:36:44 np0005604215.localdomain podman[275105]: 2026-02-01 09:36:44.937723904 +0000 UTC m=+0.152209485 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:36:44 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:36:44 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:36:49 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51385 DF PROTO=TCP SPT=49496 DPT=9102 SEQ=1413387119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA669270D0000000001030307) 
Feb 01 09:36:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:36:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:36:53 np0005604215.localdomain systemd[1]: tmp-crun.Y9AB3S.mount: Deactivated successfully.
Feb 01 09:36:53 np0005604215.localdomain podman[275156]: 2026-02-01 09:36:53.881403354 +0000 UTC m=+0.090429384 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 01 09:36:53 np0005604215.localdomain podman[275156]: 2026-02-01 09:36:53.915623927 +0000 UTC m=+0.124649917 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:36:53 np0005604215.localdomain systemd[1]: tmp-crun.HtWLGq.mount: Deactivated successfully.
Feb 01 09:36:53 np0005604215.localdomain podman[275155]: 2026-02-01 09:36:53.930877827 +0000 UTC m=+0.142468606 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, version=9.7, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 01 09:36:53 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:36:53 np0005604215.localdomain podman[275155]: 2026-02-01 09:36:53.943524685 +0000 UTC m=+0.155115454 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, architecture=x86_64, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, version=9.7)
Feb 01 09:36:53 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:36:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:36:59 np0005604215.localdomain podman[275193]: 2026-02-01 09:36:59.906875288 +0000 UTC m=+0.123336746 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:36:59 np0005604215.localdomain podman[275193]: 2026-02-01 09:36:59.918712483 +0000 UTC m=+0.135173901 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 09:36:59 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:37:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:37:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:37:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:37:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:37:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:37:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16316 "" "Go-http-client/1.1"
Feb 01 09:37:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:37:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:37:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:37:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:37:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:37:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.403 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:37:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:37:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28220 DF PROTO=TCP SPT=46326 DPT=9102 SEQ=2536956989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6695F610000000001030307) 
Feb 01 09:37:04 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28221 DF PROTO=TCP SPT=46326 DPT=9102 SEQ=2536956989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA669634D0000000001030307) 
Feb 01 09:37:05 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51386 DF PROTO=TCP SPT=49496 DPT=9102 SEQ=1413387119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA669670D0000000001030307) 
Feb 01 09:37:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28222 DF PROTO=TCP SPT=46326 DPT=9102 SEQ=2536956989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6696B4D0000000001030307) 
Feb 01 09:37:06 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb 01 09:37:07 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=50874 DF PROTO=TCP SPT=33804 DPT=9102 SEQ=429368433 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6696F0E0000000001030307) 
Feb 01 09:37:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:37:08 np0005604215.localdomain podman[275213]: 2026-02-01 09:37:08.862914991 +0000 UTC m=+0.078334959 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:37:08 np0005604215.localdomain podman[275213]: 2026-02-01 09:37:08.870237107 +0000 UTC m=+0.085657105 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:37:08 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:37:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28223 DF PROTO=TCP SPT=46326 DPT=9102 SEQ=2536956989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6697B0D0000000001030307) 
Feb 01 09:37:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:37:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:37:15 np0005604215.localdomain podman[275236]: 2026-02-01 09:37:15.867601241 +0000 UTC m=+0.080956190 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:37:15 np0005604215.localdomain systemd[1]: tmp-crun.up0L2I.mount: Deactivated successfully.
Feb 01 09:37:15 np0005604215.localdomain podman[275237]: 2026-02-01 09:37:15.921477785 +0000 UTC m=+0.132942655 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:37:15 np0005604215.localdomain podman[275236]: 2026-02-01 09:37:15.928264404 +0000 UTC m=+0.141619353 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 01 09:37:15 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:37:15 np0005604215.localdomain podman[275237]: 2026-02-01 09:37:15.97934351 +0000 UTC m=+0.190808390 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:37:15 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:37:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28224 DF PROTO=TCP SPT=46326 DPT=9102 SEQ=2536956989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA6699B0D0000000001030307) 
Feb 01 09:37:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:37:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:37:24 np0005604215.localdomain podman[275284]: 2026-02-01 09:37:24.865717462 +0000 UTC m=+0.076971257 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 01 09:37:24 np0005604215.localdomain podman[275284]: 2026-02-01 09:37:24.874575925 +0000 UTC m=+0.085829700 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Feb 01 09:37:24 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:37:24 np0005604215.localdomain podman[275283]: 2026-02-01 09:37:24.91748835 +0000 UTC m=+0.131366516 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., release=1769056855, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z)
Feb 01 09:37:24 np0005604215.localdomain podman[275283]: 2026-02-01 09:37:24.958713783 +0000 UTC m=+0.172591959 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 01 09:37:24 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:37:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:37:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:37:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:37:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:37:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:37:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16318 "" "Go-http-client/1.1"
Feb 01 09:37:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:37:30 np0005604215.localdomain podman[275321]: 2026-02-01 09:37:30.865283918 +0000 UTC m=+0.077139993 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute)
Feb 01 09:37:30 np0005604215.localdomain podman[275321]: 2026-02-01 09:37:30.899164894 +0000 UTC m=+0.111020999 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 01 09:37:30 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:37:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:37:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:37:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:37:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:37:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:37:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:37:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32298 DF PROTO=TCP SPT=56726 DPT=9102 SEQ=3542751615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA669D4910000000001030307) 
Feb 01 09:37:34 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32299 DF PROTO=TCP SPT=56726 DPT=9102 SEQ=3542751615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA669D88D0000000001030307) 
Feb 01 09:37:35 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28225 DF PROTO=TCP SPT=46326 DPT=9102 SEQ=2536956989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA669DB0D0000000001030307) 
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.418 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.440 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.441 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.441 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.457 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.457 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.457 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.458 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.458 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.459 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.480 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.481 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.481 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.482 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.482 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:37:35 np0005604215.localdomain sudo[275360]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:37:35 np0005604215.localdomain sudo[275360]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:37:35 np0005604215.localdomain sudo[275360]: pam_unix(sudo:session): session closed for user root
Feb 01 09:37:35 np0005604215.localdomain sudo[275378]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:37:35 np0005604215.localdomain sudo[275378]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:37:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:35.923 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.096 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.097 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12889MB free_disk=41.83699035644531GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.098 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.098 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.190 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.190 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.212 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:37:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32300 DF PROTO=TCP SPT=56726 DPT=9102 SEQ=3542751615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA669E08E0000000001030307) 
Feb 01 09:37:36 np0005604215.localdomain sudo[275378]: pam_unix(sudo:session): session closed for user root
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.667 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.673 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.695 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.698 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:37:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:36.698 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.600s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:37:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:37.339 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:37:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:37.340 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:37:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:37.340 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:37:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:37:37.340 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:37:37 np0005604215.localdomain sudo[275452]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:37:37 np0005604215.localdomain sudo[275452]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:37:37 np0005604215.localdomain sudo[275452]: pam_unix(sudo:session): session closed for user root
Feb 01 09:37:37 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=51387 DF PROTO=TCP SPT=49496 DPT=9102 SEQ=1413387119 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA669E50D0000000001030307) 
Feb 01 09:37:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:37:39 np0005604215.localdomain podman[275470]: 2026-02-01 09:37:39.8656687 +0000 UTC m=+0.080104925 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:37:39 np0005604215.localdomain podman[275470]: 2026-02-01 09:37:39.876177594 +0000 UTC m=+0.090613829 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:37:39 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:37:40 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32301 DF PROTO=TCP SPT=56726 DPT=9102 SEQ=3542751615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA669F04E0000000001030307) 
Feb 01 09:37:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:37:41.752 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:37:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:37:41.753 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:37:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:37:41.753 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:37:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:37:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:37:46 np0005604215.localdomain systemd[1]: tmp-crun.pnIEZP.mount: Deactivated successfully.
Feb 01 09:37:46 np0005604215.localdomain podman[275493]: 2026-02-01 09:37:46.917596118 +0000 UTC m=+0.126075502 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_managed=true)
Feb 01 09:37:46 np0005604215.localdomain systemd[1]: tmp-crun.SfGiBk.mount: Deactivated successfully.
Feb 01 09:37:46 np0005604215.localdomain podman[275494]: 2026-02-01 09:37:46.932333293 +0000 UTC m=+0.137806175 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:37:46 np0005604215.localdomain podman[275494]: 2026-02-01 09:37:46.94810399 +0000 UTC m=+0.153576882 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:37:46 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:37:46 np0005604215.localdomain podman[275493]: 2026-02-01 09:37:46.990381545 +0000 UTC m=+0.198860919 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:37:47 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:37:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32302 DF PROTO=TCP SPT=56726 DPT=9102 SEQ=3542751615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66A110D0000000001030307) 
Feb 01 09:37:49 np0005604215.localdomain sshd[275542]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:37:50 np0005604215.localdomain sshd[275542]: Invalid user patrick from 85.206.171.113 port 45604
Feb 01 09:37:50 np0005604215.localdomain sshd[275542]: Received disconnect from 85.206.171.113 port 45604:11: Bye Bye [preauth]
Feb 01 09:37:50 np0005604215.localdomain sshd[275542]: Disconnected from invalid user patrick 85.206.171.113 port 45604 [preauth]
Feb 01 09:37:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:37:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:37:56 np0005604215.localdomain podman[275544]: 2026-02-01 09:37:56.389920637 +0000 UTC m=+0.082230529 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:37:56 np0005604215.localdomain podman[275544]: 2026-02-01 09:37:56.405690924 +0000 UTC m=+0.098000786 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-type=git, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, release=1769056855, io.openshift.tags=minimal rhel9, architecture=x86_64, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter)
Feb 01 09:37:56 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:37:56 np0005604215.localdomain systemd[1]: tmp-crun.m3mhsb.mount: Deactivated successfully.
Feb 01 09:37:56 np0005604215.localdomain podman[275545]: 2026-02-01 09:37:56.490212902 +0000 UTC m=+0.180123930 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:37:56 np0005604215.localdomain podman[275545]: 2026-02-01 09:37:56.520737385 +0000 UTC m=+0.210648413 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 01 09:37:56 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:38:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:38:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:38:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:38:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:38:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:38:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16313 "" "Go-http-client/1.1"
Feb 01 09:38:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:38:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:38:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:38:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:38:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:38:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:38:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:38:01 np0005604215.localdomain podman[275582]: 2026-02-01 09:38:01.869092719 +0000 UTC m=+0.084114968 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:38:01 np0005604215.localdomain podman[275582]: 2026-02-01 09:38:01.904813112 +0000 UTC m=+0.119835291 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:38:01 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:38:03 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20001 DF PROTO=TCP SPT=56740 DPT=9102 SEQ=1505691368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66A49C10000000001030307) 
Feb 01 09:38:04 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20002 DF PROTO=TCP SPT=56740 DPT=9102 SEQ=1505691368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66A4DCE0000000001030307) 
Feb 01 09:38:05 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32303 DF PROTO=TCP SPT=56726 DPT=9102 SEQ=3542751615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66A510D0000000001030307) 
Feb 01 09:38:06 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20003 DF PROTO=TCP SPT=56740 DPT=9102 SEQ=1505691368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66A55CD0000000001030307) 
Feb 01 09:38:07 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=28226 DF PROTO=TCP SPT=46326 DPT=9102 SEQ=2536956989 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66A590D0000000001030307) 
Feb 01 09:38:10 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20004 DF PROTO=TCP SPT=56740 DPT=9102 SEQ=1505691368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66A658D0000000001030307) 
Feb 01 09:38:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:38:10 np0005604215.localdomain podman[275600]: 2026-02-01 09:38:10.863975142 +0000 UTC m=+0.079777923 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:38:10 np0005604215.localdomain podman[275600]: 2026-02-01 09:38:10.871604488 +0000 UTC m=+0.087407259 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:38:10 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:38:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:38:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:38:17 np0005604215.localdomain podman[275622]: 2026-02-01 09:38:17.86277902 +0000 UTC m=+0.080285359 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 01 09:38:17 np0005604215.localdomain podman[275622]: 2026-02-01 09:38:17.919798051 +0000 UTC m=+0.137304390 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Feb 01 09:38:17 np0005604215.localdomain podman[275623]: 2026-02-01 09:38:17.931379909 +0000 UTC m=+0.145003759 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:38:17 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:38:17 np0005604215.localdomain podman[275623]: 2026-02-01 09:38:17.942739059 +0000 UTC m=+0.156362919 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:38:17 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:38:18 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20005 DF PROTO=TCP SPT=56740 DPT=9102 SEQ=1505691368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66A850D0000000001030307) 
Feb 01 09:38:20 np0005604215.localdomain sshd[275669]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:38:20 np0005604215.localdomain sshd[275669]: Accepted publickey for zuul from 38.102.83.114 port 60422 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:38:21 np0005604215.localdomain systemd-logind[761]: New session 61 of user zuul.
Feb 01 09:38:21 np0005604215.localdomain systemd[1]: Started Session 61 of User zuul.
Feb 01 09:38:21 np0005604215.localdomain sshd[275669]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 09:38:21 np0005604215.localdomain sudo[275689]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ipyqoijrwjgoicsaemrbqmpkohguzmjq ; /usr/bin/python3
Feb 01 09:38:21 np0005604215.localdomain sudo[275689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 09:38:21 np0005604215.localdomain python3[275691]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:38:21 np0005604215.localdomain subscription-manager[275692]: Unregistered machine with identity: 228c691b-7b73-45e5-afcd-2aea3d003268
Feb 01 09:38:21 np0005604215.localdomain sudo[275689]: pam_unix(sudo:session): session closed for user root
Feb 01 09:38:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:38:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:38:26 np0005604215.localdomain podman[275695]: 2026-02-01 09:38:26.875177164 +0000 UTC m=+0.084982386 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true)
Feb 01 09:38:26 np0005604215.localdomain podman[275695]: 2026-02-01 09:38:26.910696499 +0000 UTC m=+0.120501731 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:38:26 np0005604215.localdomain podman[275694]: 2026-02-01 09:38:26.922116752 +0000 UTC m=+0.133916525 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, release=1769056855, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Feb 01 09:38:26 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:38:26 np0005604215.localdomain podman[275694]: 2026-02-01 09:38:26.935192946 +0000 UTC m=+0.146992749 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.buildah.version=1.33.7, distribution-scope=public, vcs-type=git, release=1769056855, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9)
Feb 01 09:38:26 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:38:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:38:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:38:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:38:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:38:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:38:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16321 "" "Go-http-client/1.1"
Feb 01 09:38:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:38:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:38:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:38:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:38:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:38:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:38:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:38:32 np0005604215.localdomain systemd[1]: tmp-crun.17CGPm.mount: Deactivated successfully.
Feb 01 09:38:32 np0005604215.localdomain podman[275734]: 2026-02-01 09:38:32.901458444 +0000 UTC m=+0.117072355 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 01 09:38:32 np0005604215.localdomain podman[275734]: 2026-02-01 09:38:32.939039644 +0000 UTC m=+0.154653515 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 01 09:38:32 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:38:33 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22243 DF PROTO=TCP SPT=42806 DPT=9102 SEQ=3675944437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66ABEF10000000001030307) 
Feb 01 09:38:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:34.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:38:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:34.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:38:34 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22244 DF PROTO=TCP SPT=42806 DPT=9102 SEQ=3675944437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66AC30D0000000001030307) 
Feb 01 09:38:35 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=20006 DF PROTO=TCP SPT=56740 DPT=9102 SEQ=1505691368 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66AC50E0000000001030307) 
Feb 01 09:38:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:35.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:38:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:35.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:38:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:35.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:38:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:35.121 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:38:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:36.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:38:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:36.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:38:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:36.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:38:36 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22245 DF PROTO=TCP SPT=42806 DPT=9102 SEQ=3675944437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66ACB0D0000000001030307) 
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.134 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.134 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.135 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.135 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.136 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:38:37 np0005604215.localdomain sudo[275771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:38:37 np0005604215.localdomain sudo[275771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:38:37 np0005604215.localdomain sudo[275771]: pam_unix(sudo:session): session closed for user root
Feb 01 09:38:37 np0005604215.localdomain sudo[275789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:38:37 np0005604215.localdomain sudo[275789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.594 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:38:37 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=32304 DF PROTO=TCP SPT=56726 DPT=9102 SEQ=3542751615 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66ACF0E0000000001030307) 
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.802 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.804 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12894MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.805 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.805 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.888 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.888 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:38:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:37.914 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:38:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:38.421 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.506s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:38:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:38.428 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:38:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:38.447 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:38:38 np0005604215.localdomain systemd[1]: tmp-crun.B2HWan.mount: Deactivated successfully.
Feb 01 09:38:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:38.450 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:38:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:38.450 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.645s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:38:38 np0005604215.localdomain podman[275903]: 2026-02-01 09:38:38.456571862 +0000 UTC m=+0.097696547 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, version=7, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.openshift.tags=rhceph ceph, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=rhceph, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:38:38 np0005604215.localdomain podman[275903]: 2026-02-01 09:38:38.590714172 +0000 UTC m=+0.231838817 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, ceph=True, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, io.openshift.expose-services=, RELEASE=main, vcs-type=git)
Feb 01 09:38:38 np0005604215.localdomain sudo[275789]: pam_unix(sudo:session): session closed for user root
Feb 01 09:38:38 np0005604215.localdomain sudo[275969]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:38:39 np0005604215.localdomain sudo[275969]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:38:39 np0005604215.localdomain sudo[275969]: pam_unix(sudo:session): session closed for user root
Feb 01 09:38:39 np0005604215.localdomain sudo[275987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:38:39 np0005604215.localdomain sudo[275987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:38:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:38:39.451 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:38:39 np0005604215.localdomain sudo[275987]: pam_unix(sudo:session): session closed for user root
Feb 01 09:38:40 np0005604215.localdomain sudo[276038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:38:40 np0005604215.localdomain sudo[276038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:38:40 np0005604215.localdomain sudo[276038]: pam_unix(sudo:session): session closed for user root
Feb 01 09:38:40 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22246 DF PROTO=TCP SPT=42806 DPT=9102 SEQ=3675944437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66ADACD0000000001030307) 
Feb 01 09:38:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:38:41.753 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:38:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:38:41.754 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:38:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:38:41.754 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:38:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:38:41 np0005604215.localdomain podman[276056]: 2026-02-01 09:38:41.877020209 +0000 UTC m=+0.083749036 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:38:41 np0005604215.localdomain podman[276056]: 2026-02-01 09:38:41.885001935 +0000 UTC m=+0.091730792 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:38:41 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:38:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:38:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:38:48 np0005604215.localdomain podman[276081]: 2026-02-01 09:38:48.881940986 +0000 UTC m=+0.093805536 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Feb 01 09:38:48 np0005604215.localdomain kernel: DROPPING: IN=br-ex OUT= MACSRC=fa:16:3e:39:49:6a MACDST=fa:16:3e:17:45:d1 MACPROTO=0800 SRC=192.168.122.10 DST=192.168.122.108 LEN=60 TOS=0x00 PREC=0x00 TTL=62 ID=22247 DF PROTO=TCP SPT=42806 DPT=9102 SEQ=3675944437 ACK=0 WINDOW=32640 RES=0x00 SYN URGP=0 OPT (020405500402080AA66AFB0D0000000001030307) 
Feb 01 09:38:48 np0005604215.localdomain podman[276081]: 2026-02-01 09:38:48.922699295 +0000 UTC m=+0.134563615 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3)
Feb 01 09:38:48 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:38:48 np0005604215.localdomain podman[276082]: 2026-02-01 09:38:48.928616307 +0000 UTC m=+0.137493515 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:38:49 np0005604215.localdomain podman[276082]: 2026-02-01 09:38:49.01165337 +0000 UTC m=+0.220530528 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:38:49 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:38:50 np0005604215.localdomain sudo[276127]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:38:50 np0005604215.localdomain sudo[276127]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:38:50 np0005604215.localdomain sudo[276127]: pam_unix(sudo:session): session closed for user root
Feb 01 09:38:51 np0005604215.localdomain sudo[276145]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:38:51 np0005604215.localdomain sudo[276145]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:38:51 np0005604215.localdomain sudo[276145]: pam_unix(sudo:session): session closed for user root
Feb 01 09:38:53 np0005604215.localdomain sudo[276163]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:38:53 np0005604215.localdomain sudo[276163]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:38:53 np0005604215.localdomain sudo[276163]: pam_unix(sudo:session): session closed for user root
Feb 01 09:38:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:38:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:38:57 np0005604215.localdomain podman[276181]: 2026-02-01 09:38:57.869924306 +0000 UTC m=+0.085745028 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=9.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, release=1769056855, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc.)
Feb 01 09:38:57 np0005604215.localdomain podman[276181]: 2026-02-01 09:38:57.883512925 +0000 UTC m=+0.099333617 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container)
Feb 01 09:38:57 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:38:57 np0005604215.localdomain podman[276182]: 2026-02-01 09:38:57.97050761 +0000 UTC m=+0.181712279 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 01 09:38:58 np0005604215.localdomain podman[276182]: 2026-02-01 09:38:58.005640895 +0000 UTC m=+0.216845554 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 01 09:38:58 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:38:59 np0005604215.localdomain sshd[276219]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:38:59 np0005604215.localdomain sshd[276219]: Accepted publickey for tripleo-admin from 192.168.122.11 port 43160 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:38:59 np0005604215.localdomain systemd[1]: Created slice User Slice of UID 1003.
Feb 01 09:38:59 np0005604215.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Feb 01 09:38:59 np0005604215.localdomain systemd-logind[761]: New session 62 of user tripleo-admin.
Feb 01 09:38:59 np0005604215.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Feb 01 09:38:59 np0005604215.localdomain systemd[1]: Starting User Manager for UID 1003...
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Queued start job for default target Main User Target.
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Created slice User Application Slice.
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Started Daily Cleanup of User's Temporary Directories.
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Reached target Paths.
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Reached target Timers.
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Starting D-Bus User Message Bus Socket...
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Starting Create User's Volatile Files and Directories...
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Listening on D-Bus User Message Bus Socket.
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Reached target Sockets.
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Finished Create User's Volatile Files and Directories.
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Reached target Basic System.
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Reached target Main User Target.
Feb 01 09:38:59 np0005604215.localdomain systemd[276223]: Startup finished in 149ms.
Feb 01 09:38:59 np0005604215.localdomain systemd[1]: Started User Manager for UID 1003.
Feb 01 09:38:59 np0005604215.localdomain systemd[1]: Started Session 62 of User tripleo-admin.
Feb 01 09:38:59 np0005604215.localdomain sshd[276219]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 01 09:39:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:39:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:39:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:39:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 146808 "" "Go-http-client/1.1"
Feb 01 09:39:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:39:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16321 "" "Go-http-client/1.1"
Feb 01 09:39:00 np0005604215.localdomain sudo[276364]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-mhaohmkryjfbzaginaapalmrpdygifba ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769938740.074101-60270-239655706492526/AnsiballZ_blockinfile.py
Feb 01 09:39:00 np0005604215.localdomain sudo[276364]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 01 09:39:00 np0005604215.localdomain python3[276366]: ansible-ansible.builtin.blockinfile Invoked with marker_begin=BEGIN ceph firewall rules marker_end=END ceph firewall rules path=/etc/nftables/edpm-rules.nft mode=0644 block=# 100 ceph_alertmanager (9093)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9093 } ct state new counter accept comment "100 ceph_alertmanager"
                                                          # 100 ceph_dashboard (8443)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8443 } ct state new counter accept comment "100 ceph_dashboard"
                                                          # 100 ceph_grafana (3100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 3100 } ct state new counter accept comment "100 ceph_grafana"
                                                          # 100 ceph_prometheus (9092)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9092 } ct state new counter accept comment "100 ceph_prometheus"
                                                          # 100 ceph_rgw (8080)
                                                          add rule inet filter EDPM_INPUT tcp dport { 8080 } ct state new counter accept comment "100 ceph_rgw"
                                                          # 110 ceph_mon (6789, 3300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6789,3300,9100 } ct state new counter accept comment "110 ceph_mon"
                                                          # 112 ceph_mds (6800-7300, 9100)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,9100 } ct state new counter accept comment "112 ceph_mds"
                                                          # 113 ceph_mgr (6800-7300, 8444)
                                                          add rule inet filter EDPM_INPUT tcp dport { 6800-7300,8444 } ct state new counter accept comment "113 ceph_mgr"
                                                          # 120 ceph_nfs (2049, 12049)
                                                          add rule inet filter EDPM_INPUT tcp dport { 2049,12049 } ct state new counter accept comment "120 ceph_nfs"
                                                          # 123 ceph_dashboard (9090, 9094, 9283)
                                                          add rule inet filter EDPM_INPUT tcp dport { 9090,9094,9283 } ct state new counter accept comment "123 ceph_dashboard"
                                                           insertbefore=^# Lock down INPUT chains state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False unsafe_writes=False insertafter=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:39:00 np0005604215.localdomain sudo[276364]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:00 np0005604215.localdomain systemd-journald[47940]: Field hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 80.5 (268 of 333 items), suggesting rotation.
Feb 01 09:39:00 np0005604215.localdomain systemd-journald[47940]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 01 09:39:00 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 09:39:00 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 09:39:01 np0005604215.localdomain sudo[276509]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-tyidrqzbtzwzpwvhalovfshznboskdxb ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769938740.8052518-60284-249907877640772/AnsiballZ_systemd.py
Feb 01 09:39:01 np0005604215.localdomain sudo[276509]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 01 09:39:01 np0005604215.localdomain python3[276511]: ansible-ansible.builtin.systemd Invoked with name=nftables state=restarted enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Feb 01 09:39:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:39:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:39:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:39:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:39:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:39:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:39:01 np0005604215.localdomain systemd[1]: Stopping Netfilter Tables...
Feb 01 09:39:01 np0005604215.localdomain systemd[1]: nftables.service: Deactivated successfully.
Feb 01 09:39:01 np0005604215.localdomain systemd[1]: Stopped Netfilter Tables.
Feb 01 09:39:01 np0005604215.localdomain systemd[1]: Starting Netfilter Tables...
Feb 01 09:39:01 np0005604215.localdomain systemd[1]: Finished Netfilter Tables.
Feb 01 09:39:01 np0005604215.localdomain sudo[276509]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:39:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:39:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:39:03 np0005604215.localdomain podman[276535]: 2026-02-01 09:39:03.884515065 +0000 UTC m=+0.091348821 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 01 09:39:03 np0005604215.localdomain podman[276535]: 2026-02-01 09:39:03.920385962 +0000 UTC m=+0.127219718 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, 
managed_by=edpm_ansible)
Feb 01 09:39:03 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:39:04 np0005604215.localdomain sudo[276555]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:39:04 np0005604215.localdomain sudo[276555]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:04 np0005604215.localdomain sudo[276555]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:05 np0005604215.localdomain sudo[276573]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:39:05 np0005604215.localdomain sudo[276573]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:05 np0005604215.localdomain sudo[276573]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:07 np0005604215.localdomain sudo[276591]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:39:07 np0005604215.localdomain sudo[276591]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:07 np0005604215.localdomain sudo[276591]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:08 np0005604215.localdomain sudo[276609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:39:08 np0005604215.localdomain sudo[276609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:08 np0005604215.localdomain sudo[276609]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:09 np0005604215.localdomain sudo[276627]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:39:09 np0005604215.localdomain sudo[276627]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:09 np0005604215.localdomain sudo[276627]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:11 np0005604215.localdomain sudo[276645]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:39:11 np0005604215.localdomain sudo[276645]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:11 np0005604215.localdomain sudo[276645]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:12 np0005604215.localdomain sudo[276663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:39:12 np0005604215.localdomain sudo[276663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:39:12 np0005604215.localdomain sudo[276663]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:12 np0005604215.localdomain sudo[276682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:39:12 np0005604215.localdomain sudo[276682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:12 np0005604215.localdomain podman[276681]: 2026-02-01 09:39:12.329560913 +0000 UTC m=+0.108790672 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:39:12 np0005604215.localdomain podman[276681]: 2026-02-01 09:39:12.365960288 +0000 UTC m=+0.145190087 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:39:12 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:39:12 np0005604215.localdomain podman[276768]: 
Feb 01 09:39:12 np0005604215.localdomain podman[276768]: 2026-02-01 09:39:12.955609222 +0000 UTC m=+0.087021243 container create 0b5bd4915eb063cd625a2e0f2b0ef572f82700514af29d46155b41174cefc25f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_cohen, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, distribution-scope=public, version=7, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, architecture=x86_64, CEPH_POINT_RELEASE=, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: Started libpod-conmon-0b5bd4915eb063cd625a2e0f2b0ef572f82700514af29d46155b41174cefc25f.scope.
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:39:13 np0005604215.localdomain podman[276768]: 2026-02-01 09:39:12.919370893 +0000 UTC m=+0.050782944 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:39:13 np0005604215.localdomain podman[276768]: 2026-02-01 09:39:13.031795467 +0000 UTC m=+0.163207498 container init 0b5bd4915eb063cd625a2e0f2b0ef572f82700514af29d46155b41174cefc25f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_cohen, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-12-08T17:28:53Z, release=1764794109, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vcs-type=git, GIT_BRANCH=main)
Feb 01 09:39:13 np0005604215.localdomain podman[276768]: 2026-02-01 09:39:13.043114581 +0000 UTC m=+0.174526612 container start 0b5bd4915eb063cd625a2e0f2b0ef572f82700514af29d46155b41174cefc25f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_cohen, GIT_BRANCH=main, ceph=True, io.buildah.version=1.41.4, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:39:13 np0005604215.localdomain podman[276768]: 2026-02-01 09:39:13.043510254 +0000 UTC m=+0.174922275 container attach 0b5bd4915eb063cd625a2e0f2b0ef572f82700514af29d46155b41174cefc25f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_cohen, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, distribution-scope=public, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, architecture=x86_64, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, name=rhceph, vcs-type=git, version=7)
Feb 01 09:39:13 np0005604215.localdomain blissful_cohen[276783]: 167 167
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: libpod-0b5bd4915eb063cd625a2e0f2b0ef572f82700514af29d46155b41174cefc25f.scope: Deactivated successfully.
Feb 01 09:39:13 np0005604215.localdomain podman[276768]: 2026-02-01 09:39:13.046787005 +0000 UTC m=+0.178199066 container died 0b5bd4915eb063cd625a2e0f2b0ef572f82700514af29d46155b41174cefc25f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_cohen, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, release=1764794109, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:39:13 np0005604215.localdomain podman[276788]: 2026-02-01 09:39:13.160005847 +0000 UTC m=+0.104976043 container remove 0b5bd4915eb063cd625a2e0f2b0ef572f82700514af29d46155b41174cefc25f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_cohen, CEPH_POINT_RELEASE=, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, io.openshift.tags=rhceph ceph)
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: libpod-conmon-0b5bd4915eb063cd625a2e0f2b0ef572f82700514af29d46155b41174cefc25f.scope: Deactivated successfully.
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:39:13 np0005604215.localdomain systemd-rc-local-generator[276827]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:39:13 np0005604215.localdomain systemd-sysv-generator[276831]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-56125ce038aae7582b72251e3f01def7b65b146c40eacb0a4e07adc2a2360160-merged.mount: Deactivated successfully.
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:39:13 np0005604215.localdomain systemd-rc-local-generator[276872]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:39:13 np0005604215.localdomain systemd-sysv-generator[276875]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:13 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:39:14 np0005604215.localdomain systemd[1]: Starting Ceph mds.mds.np0005604215.rwvxvg for 33fac0b9-80c7-560f-918a-c92d3021ca1e...
Feb 01 09:39:14 np0005604215.localdomain podman[276934]: 
Feb 01 09:39:14 np0005604215.localdomain podman[276934]: 2026-02-01 09:39:14.362128918 +0000 UTC m=+0.083522924 container create 80bcdc70795a926b997388022a5fe5b4e291e6a23c353a1e18858d4b05d1df54 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mds-mds-np0005604215-rwvxvg, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, vendor=Red Hat, Inc., RELEASE=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public)
Feb 01 09:39:14 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aaa67d8cb79d39b31bd82202d5bf1b420176af306b56745e9b5840ff65ee52/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 09:39:14 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aaa67d8cb79d39b31bd82202d5bf1b420176af306b56745e9b5840ff65ee52/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 09:39:14 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aaa67d8cb79d39b31bd82202d5bf1b420176af306b56745e9b5840ff65ee52/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 09:39:14 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/81aaa67d8cb79d39b31bd82202d5bf1b420176af306b56745e9b5840ff65ee52/merged/var/lib/ceph/mds/ceph-mds.np0005604215.rwvxvg supports timestamps until 2038 (0x7fffffff)
Feb 01 09:39:14 np0005604215.localdomain podman[276934]: 2026-02-01 09:39:14.327200953 +0000 UTC m=+0.048594969 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:39:14 np0005604215.localdomain podman[276934]: 2026-02-01 09:39:14.430109414 +0000 UTC m=+0.151503420 container init 80bcdc70795a926b997388022a5fe5b4e291e6a23c353a1e18858d4b05d1df54 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mds-mds-np0005604215-rwvxvg, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, version=7, build-date=2025-12-08T17:28:53Z, release=1764794109, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, RELEASE=main, vcs-type=git, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, ceph=True, GIT_CLEAN=True, GIT_BRANCH=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container)
Feb 01 09:39:14 np0005604215.localdomain podman[276934]: 2026-02-01 09:39:14.441946936 +0000 UTC m=+0.163340942 container start 80bcdc70795a926b997388022a5fe5b4e291e6a23c353a1e18858d4b05d1df54 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mds-mds-np0005604215-rwvxvg, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, version=7, release=1764794109, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z)
Feb 01 09:39:14 np0005604215.localdomain bash[276934]: 80bcdc70795a926b997388022a5fe5b4e291e6a23c353a1e18858d4b05d1df54
Feb 01 09:39:14 np0005604215.localdomain systemd[1]: Started Ceph mds.mds.np0005604215.rwvxvg for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 01 09:39:14 np0005604215.localdomain ceph-mds[276952]: set uid:gid to 167:167 (ceph:ceph)
Feb 01 09:39:14 np0005604215.localdomain ceph-mds[276952]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mds, pid 2
Feb 01 09:39:14 np0005604215.localdomain ceph-mds[276952]: main not setting numa affinity
Feb 01 09:39:14 np0005604215.localdomain ceph-mds[276952]: pidfile_write: ignore empty --pid-file
Feb 01 09:39:14 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mds-mds-np0005604215-rwvxvg[276948]: starting mds.mds.np0005604215.rwvxvg at 
Feb 01 09:39:14 np0005604215.localdomain sudo[276682]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:14 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Updating MDS map to version 6 from mon.0
Feb 01 09:39:15 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Updating MDS map to version 7 from mon.0
Feb 01 09:39:15 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Monitors have assigned me to become a standby.
Feb 01 09:39:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:39:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:39:19 np0005604215.localdomain podman[276973]: 2026-02-01 09:39:19.875336924 +0000 UTC m=+0.086382302 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:39:19 np0005604215.localdomain podman[276973]: 2026-02-01 09:39:19.908682965 +0000 UTC m=+0.119728353 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:39:19 np0005604215.localdomain sudo[276995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:39:19 np0005604215.localdomain sudo[276995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:19 np0005604215.localdomain podman[276972]: 2026-02-01 09:39:19.923697984 +0000 UTC m=+0.134616497 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:39:19 np0005604215.localdomain sudo[276995]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:19 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:39:19 np0005604215.localdomain podman[276972]: 2026-02-01 09:39:19.960682699 +0000 UTC m=+0.171601232 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:39:19 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:39:20 np0005604215.localdomain sudo[277037]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:39:20 np0005604215.localdomain sudo[277037]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:20 np0005604215.localdomain sudo[277037]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:20 np0005604215.localdomain sudo[277056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:39:20 np0005604215.localdomain sudo[277056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:20 np0005604215.localdomain systemd[1]: tmp-crun.5IFeOy.mount: Deactivated successfully.
Feb 01 09:39:20 np0005604215.localdomain podman[277145]: 2026-02-01 09:39:20.843221239 +0000 UTC m=+0.090704158 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z)
Feb 01 09:39:20 np0005604215.localdomain podman[277145]: 2026-02-01 09:39:20.947999554 +0000 UTC m=+0.195482523 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, architecture=x86_64, RELEASE=main, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_BRANCH=main, ceph=True, CEPH_POINT_RELEASE=, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public)
Feb 01 09:39:21 np0005604215.localdomain sudo[277056]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:21 np0005604215.localdomain sshd[275672]: Received disconnect from 38.102.83.114 port 60422:11: disconnected by user
Feb 01 09:39:21 np0005604215.localdomain sshd[275672]: Disconnected from user zuul 38.102.83.114 port 60422
Feb 01 09:39:21 np0005604215.localdomain sshd[275669]: pam_unix(sshd:session): session closed for user zuul
Feb 01 09:39:21 np0005604215.localdomain systemd-logind[761]: Session 61 logged out. Waiting for processes to exit.
Feb 01 09:39:21 np0005604215.localdomain systemd[1]: session-61.scope: Deactivated successfully.
Feb 01 09:39:21 np0005604215.localdomain systemd-logind[761]: Removed session 61.
Feb 01 09:39:22 np0005604215.localdomain sudo[277225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:39:22 np0005604215.localdomain sudo[277225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:22 np0005604215.localdomain sudo[277225]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:22 np0005604215.localdomain sudo[277243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:39:22 np0005604215.localdomain sudo[277243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:22 np0005604215.localdomain sudo[277243]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:39:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:39:28 np0005604215.localdomain systemd[1]: tmp-crun.iGqWnf.mount: Deactivated successfully.
Feb 01 09:39:28 np0005604215.localdomain podman[277261]: 2026-02-01 09:39:28.873788536 +0000 UTC m=+0.080641957 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, container_name=openstack_network_exporter, vendor=Red Hat, Inc.)
Feb 01 09:39:28 np0005604215.localdomain podman[277261]: 2026-02-01 09:39:28.915339665 +0000 UTC m=+0.122193036 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, release=1769056855)
Feb 01 09:39:28 np0005604215.localdomain podman[277262]: 2026-02-01 09:39:28.927023182 +0000 UTC m=+0.134112711 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127)
Feb 01 09:39:28 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:39:28 np0005604215.localdomain podman[277262]: 2026-02-01 09:39:28.95970317 +0000 UTC m=+0.166792749 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 01 09:39:28 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:39:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:39:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:39:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:39:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149013 "" "Go-http-client/1.1"
Feb 01 09:39:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:39:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16801 "" "Go-http-client/1.1"
Feb 01 09:39:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:39:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:39:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:39:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:39:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:39:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:39:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:39:34 np0005604215.localdomain systemd[1]: tmp-crun.7MDCDr.mount: Deactivated successfully.
Feb 01 09:39:34 np0005604215.localdomain podman[277299]: 2026-02-01 09:39:34.86509388 +0000 UTC m=+0.081197045 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:39:34 np0005604215.localdomain podman[277299]: 2026-02-01 09:39:34.87484371 +0000 UTC m=+0.090946855 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 09:39:34 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:39:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:35.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:39:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:35.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:39:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:35.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:39:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:35.119 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:39:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:35.120 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:39:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:36.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:39:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:36.116 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:39:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:36.116 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.129 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.129 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.130 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.130 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.130 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.572 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.711 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.712 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12867MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.712 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.712 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.786 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.787 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:39:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:38.812 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:39:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:39.222 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:39:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:39.228 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:39:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:39.242 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:39:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:39.244 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:39:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:39.245 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.532s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:39:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:39:41.245 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:39:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:39:41.755 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:39:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:39:41.755 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:39:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:39:41.755 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:39:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:39:42 np0005604215.localdomain systemd[1]: tmp-crun.gZndlo.mount: Deactivated successfully.
Feb 01 09:39:42 np0005604215.localdomain podman[277361]: 2026-02-01 09:39:42.859412376 +0000 UTC m=+0.074936884 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:39:42 np0005604215.localdomain podman[277361]: 2026-02-01 09:39:42.896568346 +0000 UTC m=+0.112092884 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:39:42 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:39:46 np0005604215.localdomain sudo[277384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:39:46 np0005604215.localdomain sudo[277384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:46 np0005604215.localdomain sudo[277384]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:46 np0005604215.localdomain sudo[277402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:39:46 np0005604215.localdomain sudo[277402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:39:47 np0005604215.localdomain sudo[277402]: pam_unix(sudo:session): session closed for user root
Feb 01 09:39:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:39:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:39:50 np0005604215.localdomain podman[277451]: 2026-02-01 09:39:50.869159615 +0000 UTC m=+0.082929924 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 01 09:39:50 np0005604215.localdomain podman[277452]: 2026-02-01 09:39:50.921079567 +0000 UTC m=+0.131072818 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:39:50 np0005604215.localdomain podman[277451]: 2026-02-01 09:39:50.95153725 +0000 UTC m=+0.165307609 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:39:50 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:39:51 np0005604215.localdomain podman[277452]: 2026-02-01 09:39:51.004841909 +0000 UTC m=+0.214834940 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:39:51 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:39:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:39:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:39:59 np0005604215.localdomain systemd[1]: tmp-crun.nvqiz1.mount: Deactivated successfully.
Feb 01 09:39:59 np0005604215.localdomain podman[277501]: 2026-02-01 09:39:59.864818883 +0000 UTC m=+0.073218426 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:39:59 np0005604215.localdomain podman[277501]: 2026-02-01 09:39:59.896547189 +0000 UTC m=+0.104946742 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 01 09:39:59 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:39:59 np0005604215.localdomain systemd[1]: tmp-crun.6dLjqS.mount: Deactivated successfully.
Feb 01 09:39:59 np0005604215.localdomain podman[277500]: 2026-02-01 09:39:59.968115776 +0000 UTC m=+0.180049049 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, architecture=x86_64, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:40:00 np0005604215.localdomain podman[277500]: 2026-02-01 09:40:00.005586438 +0000 UTC m=+0.217519741 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=openstack_network_exporter, maintainer=Red Hat, Inc.)
Feb 01 09:40:00 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:40:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:40:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:40:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:40:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149013 "" "Go-http-client/1.1"
Feb 01 09:40:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:40:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16804 "" "Go-http-client/1.1"
Feb 01 09:40:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:40:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:40:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:40:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:40:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:40:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:40:01 np0005604215.localdomain sshd[276239]: Received disconnect from 192.168.122.11 port 43160:11: disconnected by user
Feb 01 09:40:01 np0005604215.localdomain sshd[276239]: Disconnected from user tripleo-admin 192.168.122.11 port 43160
Feb 01 09:40:01 np0005604215.localdomain sshd[276219]: pam_unix(sshd:session): session closed for user tripleo-admin
Feb 01 09:40:01 np0005604215.localdomain systemd[1]: session-62.scope: Deactivated successfully.
Feb 01 09:40:01 np0005604215.localdomain systemd[1]: session-62.scope: Consumed 1.345s CPU time.
Feb 01 09:40:01 np0005604215.localdomain systemd-logind[761]: Session 62 logged out. Waiting for processes to exit.
Feb 01 09:40:01 np0005604215.localdomain systemd-logind[761]: Removed session 62.
Feb 01 09:40:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:40:05 np0005604215.localdomain podman[277539]: 2026-02-01 09:40:05.879407946 +0000 UTC m=+0.090734889 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:40:05 np0005604215.localdomain podman[277539]: 2026-02-01 09:40:05.946021176 +0000 UTC m=+0.157348119 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 01 09:40:05 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:40:11 np0005604215.localdomain systemd[1]: Stopping User Manager for UID 1003...
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Activating special unit Exit the Session...
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Stopped target Main User Target.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Stopped target Basic System.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Stopped target Paths.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Stopped target Sockets.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Stopped target Timers.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Closed D-Bus User Message Bus Socket.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Stopped Create User's Volatile Files and Directories.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Removed slice User Application Slice.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Reached target Shutdown.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Finished Exit the Session.
Feb 01 09:40:11 np0005604215.localdomain systemd[276223]: Reached target Exit the Session.
Feb 01 09:40:11 np0005604215.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Feb 01 09:40:11 np0005604215.localdomain systemd[1]: Stopped User Manager for UID 1003.
Feb 01 09:40:11 np0005604215.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Feb 01 09:40:11 np0005604215.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Feb 01 09:40:11 np0005604215.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Feb 01 09:40:11 np0005604215.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Feb 01 09:40:11 np0005604215.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Feb 01 09:40:11 np0005604215.localdomain systemd[1]: user-1003.slice: Consumed 1.717s CPU time.
Feb 01 09:40:13 np0005604215.localdomain sudo[277557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:40:13 np0005604215.localdomain sudo[277557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:40:13 np0005604215.localdomain sudo[277557]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:13 np0005604215.localdomain podman[277575]: 2026-02-01 09:40:13.653239153 +0000 UTC m=+0.083023127 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:40:13 np0005604215.localdomain podman[277575]: 2026-02-01 09:40:13.686722939 +0000 UTC m=+0.116506923 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:40:13 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:40:15 np0005604215.localdomain sudo[277598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:40:15 np0005604215.localdomain sudo[277598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:15 np0005604215.localdomain sudo[277598]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:16 np0005604215.localdomain sudo[277616]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:40:16 np0005604215.localdomain sudo[277616]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:16 np0005604215.localdomain sudo[277616]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:40:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:40:21 np0005604215.localdomain podman[277635]: 2026-02-01 09:40:21.876408004 +0000 UTC m=+0.086859928 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:40:21 np0005604215.localdomain podman[277635]: 2026-02-01 09:40:21.883968251 +0000 UTC m=+0.094420215 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:40:21 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:40:21 np0005604215.localdomain podman[277634]: 2026-02-01 09:40:21.925120376 +0000 UTC m=+0.138885962 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 01 09:40:21 np0005604215.localdomain podman[277634]: 2026-02-01 09:40:21.991276051 +0000 UTC m=+0.205041627 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:40:22 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:40:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:40:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:40:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:40:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 149013 "" "Go-http-client/1.1"
Feb 01 09:40:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:40:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 16804 "" "Go-http-client/1.1"
Feb 01 09:40:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:40:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:40:30 np0005604215.localdomain podman[277684]: 2026-02-01 09:40:30.876315135 +0000 UTC m=+0.082194979 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 01 09:40:30 np0005604215.localdomain podman[277684]: 2026-02-01 09:40:30.906780069 +0000 UTC m=+0.112659923 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 01 09:40:30 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:40:30 np0005604215.localdomain podman[277683]: 2026-02-01 09:40:30.984214816 +0000 UTC m=+0.194556591 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, version=9.7, vcs-type=git, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7)
Feb 01 09:40:31 np0005604215.localdomain podman[277683]: 2026-02-01 09:40:31.025817148 +0000 UTC m=+0.236158913 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, release=1769056855, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible)
Feb 01 09:40:31 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:40:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:40:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:40:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:40:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:40:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:40:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:40:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:34.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:34.102 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 01 09:40:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:34.123 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 01 09:40:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:34.124 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:34.125 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 01 09:40:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:34.136 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:36.145 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:40:36 np0005604215.localdomain podman[277722]: 2026-02-01 09:40:36.836518626 +0000 UTC m=+0.088468232 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:40:36 np0005604215.localdomain podman[277722]: 2026-02-01 09:40:36.845842703 +0000 UTC m=+0.097792359 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:40:36 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:40:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:37.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:37.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:40:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:37.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:40:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:37.121 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:40:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:37.121 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:37 np0005604215.localdomain sudo[277741]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:40:38 np0005604215.localdomain sudo[277741]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:38 np0005604215.localdomain sudo[277741]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:38.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:38.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:38.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.124 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.124 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.124 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.125 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.125 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:40:39 np0005604215.localdomain sudo[277770]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:40:39 np0005604215.localdomain sudo[277770]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:39 np0005604215.localdomain sudo[277770]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.567 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.769 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.771 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12872MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.771 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.771 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.989 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:40:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:39.989 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:40:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:40.130 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 09:40:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:40.225 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 09:40:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:40.226 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:40:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:40.252 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 09:40:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:40.271 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AVX2,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AMD_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 09:40:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:40.288 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:40:40 np0005604215.localdomain sudo[277819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:40:40 np0005604215.localdomain sudo[277819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:40 np0005604215.localdomain sudo[277819]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:40.749 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.461s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:40:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:40.757 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:40:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:40.781 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:40:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:40.784 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:40:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:40.784 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.013s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:40:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:40:41.756 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:40:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:40:41.757 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:40:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:40:41.758 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:40:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:40:42.785 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:40:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:40:43 np0005604215.localdomain podman[277839]: 2026-02-01 09:40:43.91457424 +0000 UTC m=+0.089951812 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:40:43 np0005604215.localdomain podman[277839]: 2026-02-01 09:40:43.92341014 +0000 UTC m=+0.098787702 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:40:43 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:40:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 5505 writes, 24K keys, 5505 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5505 writes, 787 syncs, 6.99 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 72 writes, 206 keys, 72 commit groups, 1.0 writes per commit group, ingest: 0.36 MB, 0.00 MB/s
                                                          Interval WAL: 72 writes, 36 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 09:40:46 np0005604215.localdomain sudo[277864]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:40:46 np0005604215.localdomain sudo[277864]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:46 np0005604215.localdomain sudo[277864]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:46 np0005604215.localdomain sudo[277882]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:40:46 np0005604215.localdomain sudo[277882]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:46 np0005604215.localdomain podman[277940]: 
Feb 01 09:40:46 np0005604215.localdomain podman[277940]: 2026-02-01 09:40:46.949439139 +0000 UTC m=+0.082091776 container create 5d054bf9791526d04c802ba00d7f85b01b870783cbea9284629f2489e3a32037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_grothendieck, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z)
Feb 01 09:40:46 np0005604215.localdomain systemd[1]: Started libpod-conmon-5d054bf9791526d04c802ba00d7f85b01b870783cbea9284629f2489e3a32037.scope.
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:40:47 np0005604215.localdomain podman[277940]: 2026-02-01 09:40:46.913717936 +0000 UTC m=+0.046370573 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:40:47 np0005604215.localdomain podman[277940]: 2026-02-01 09:40:47.020054614 +0000 UTC m=+0.152707261 container init 5d054bf9791526d04c802ba00d7f85b01b870783cbea9284629f2489e3a32037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_grothendieck, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, RELEASE=main, GIT_BRANCH=main, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, distribution-scope=public, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, release=1764794109, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Feb 01 09:40:47 np0005604215.localdomain podman[277940]: 2026-02-01 09:40:47.032217867 +0000 UTC m=+0.164870514 container start 5d054bf9791526d04c802ba00d7f85b01b870783cbea9284629f2489e3a32037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_grothendieck, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, ceph=True, description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:40:47 np0005604215.localdomain podman[277940]: 2026-02-01 09:40:47.03260621 +0000 UTC m=+0.165258887 container attach 5d054bf9791526d04c802ba00d7f85b01b870783cbea9284629f2489e3a32037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_grothendieck, vcs-type=git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1764794109, name=rhceph, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:40:47 np0005604215.localdomain reverent_grothendieck[277955]: 167 167
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: libpod-5d054bf9791526d04c802ba00d7f85b01b870783cbea9284629f2489e3a32037.scope: Deactivated successfully.
Feb 01 09:40:47 np0005604215.localdomain podman[277940]: 2026-02-01 09:40:47.036334526 +0000 UTC m=+0.168987193 container died 5d054bf9791526d04c802ba00d7f85b01b870783cbea9284629f2489e3a32037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_grothendieck, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, vcs-type=git, ceph=True, release=1764794109, io.openshift.expose-services=)
Feb 01 09:40:47 np0005604215.localdomain podman[277960]: 2026-02-01 09:40:47.137130646 +0000 UTC m=+0.088057358 container remove 5d054bf9791526d04c802ba00d7f85b01b870783cbea9284629f2489e3a32037 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_grothendieck, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.openshift.expose-services=, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vcs-type=git, CEPH_POINT_RELEASE=)
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: libpod-conmon-5d054bf9791526d04c802ba00d7f85b01b870783cbea9284629f2489e3a32037.scope: Deactivated successfully.
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:40:47 np0005604215.localdomain systemd-rc-local-generator[278000]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:40:47 np0005604215.localdomain systemd-sysv-generator[278005]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-9f75573cc0ffefa021332aff24a0c053100173a63aaee9f562d403af5a323898-merged.mount: Deactivated successfully.
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:40:47 np0005604215.localdomain systemd-rc-local-generator[278040]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:40:47 np0005604215.localdomain systemd-sysv-generator[278046]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:40:47 np0005604215.localdomain systemd[1]: Starting Ceph mgr.np0005604215.uhhqtv for 33fac0b9-80c7-560f-918a-c92d3021ca1e...
Feb 01 09:40:48 np0005604215.localdomain podman[278107]: 
Feb 01 09:40:48 np0005604215.localdomain podman[278107]: 2026-02-01 09:40:48.324602861 +0000 UTC m=+0.077812901 container create 3e1e2afd626fdcaa281a17632f441dfc56dec59adcb1f539c6182d51b14f5b79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, RELEASE=main, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:40:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/127ade48dc529dd3f486f353811e1b3227dc30b60c3f77c1d2176b946810a8e6/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 09:40:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/127ade48dc529dd3f486f353811e1b3227dc30b60c3f77c1d2176b946810a8e6/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 09:40:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/127ade48dc529dd3f486f353811e1b3227dc30b60c3f77c1d2176b946810a8e6/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 09:40:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/127ade48dc529dd3f486f353811e1b3227dc30b60c3f77c1d2176b946810a8e6/merged/var/lib/ceph/mgr/ceph-np0005604215.uhhqtv supports timestamps until 2038 (0x7fffffff)
Feb 01 09:40:48 np0005604215.localdomain podman[278107]: 2026-02-01 09:40:48.386575253 +0000 UTC m=+0.139785293 container init 3e1e2afd626fdcaa281a17632f441dfc56dec59adcb1f539c6182d51b14f5b79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_CLEAN=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, release=1764794109, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=)
Feb 01 09:40:48 np0005604215.localdomain podman[278107]: 2026-02-01 09:40:48.292766991 +0000 UTC m=+0.045977061 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:40:48 np0005604215.localdomain podman[278107]: 2026-02-01 09:40:48.39562628 +0000 UTC m=+0.148836330 container start 3e1e2afd626fdcaa281a17632f441dfc56dec59adcb1f539c6182d51b14f5b79 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, version=7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public)
Feb 01 09:40:48 np0005604215.localdomain bash[278107]: 3e1e2afd626fdcaa281a17632f441dfc56dec59adcb1f539c6182d51b14f5b79
Feb 01 09:40:48 np0005604215.localdomain systemd[1]: Started Ceph mgr.np0005604215.uhhqtv for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 01 09:40:48 np0005604215.localdomain sudo[277882]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:48 np0005604215.localdomain ceph-mgr[278126]: set uid:gid to 167:167 (ceph:ceph)
Feb 01 09:40:48 np0005604215.localdomain ceph-mgr[278126]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Feb 01 09:40:48 np0005604215.localdomain ceph-mgr[278126]: pidfile_write: ignore empty --pid-file
Feb 01 09:40:48 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'alerts'
Feb 01 09:40:48 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 01 09:40:48 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'balancer'
Feb 01 09:40:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:48.569+0000 7fcf09996140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 01 09:40:48 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 01 09:40:48 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'cephadm'
Feb 01 09:40:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:48.635+0000 7fcf09996140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 01 09:40:49 np0005604215.localdomain sudo[278151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:40:49 np0005604215.localdomain sudo[278151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:49 np0005604215.localdomain sudo[278151]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:49 np0005604215.localdomain sudo[278169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:40:49 np0005604215.localdomain sudo[278169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:49 np0005604215.localdomain sudo[278169]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:49 np0005604215.localdomain sudo[278192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:40:49 np0005604215.localdomain sudo[278192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:49 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'crash'
Feb 01 09:40:49 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 01 09:40:49 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'dashboard'
Feb 01 09:40:49 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:49.272+0000 7fcf09996140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 01 09:40:49 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'devicehealth'
Feb 01 09:40:49 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 01 09:40:49 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'diskprediction_local'
Feb 01 09:40:49 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:49.811+0000 7fcf09996140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 01 09:40:49 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 01 09:40:49 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 01 09:40:49 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]:   from numpy import show_config as show_numpy_config
Feb 01 09:40:49 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:49.945+0000 7fcf09996140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 01 09:40:49 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 01 09:40:49 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'influx'
Feb 01 09:40:50 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 01 09:40:50 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'insights'
Feb 01 09:40:50 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:50.002+0000 7fcf09996140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 01 09:40:50 np0005604215.localdomain systemd[1]: tmp-crun.t2YFWX.mount: Deactivated successfully.
Feb 01 09:40:50 np0005604215.localdomain podman[278280]: 2026-02-01 09:40:50.027281075 +0000 UTC m=+0.112441486 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_CLEAN=True, name=rhceph, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, distribution-scope=public, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=)
Feb 01 09:40:50 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'iostat'
Feb 01 09:40:50 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 01 09:40:50 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'k8sevents'
Feb 01 09:40:50 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:50.118+0000 7fcf09996140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 01 09:40:50 np0005604215.localdomain podman[278280]: 2026-02-01 09:40:50.15921529 +0000 UTC m=+0.244375691 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, distribution-scope=public, architecture=x86_64, RELEASE=main, com.redhat.component=rhceph-container, name=rhceph, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vcs-type=git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 09:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:40:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7200.1 total, 600.0 interval
                                                          Cumulative writes: 5317 writes, 23K keys, 5317 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5317 writes, 693 syncs, 7.67 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 94 writes, 314 keys, 94 commit groups, 1.0 writes per commit group, ingest: 0.36 MB, 0.00 MB/s
                                                          Interval WAL: 94 writes, 35 syncs, 2.69 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 09:40:50 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'localpool'
Feb 01 09:40:50 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'mds_autoscaler'
Feb 01 09:40:50 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'mirroring'
Feb 01 09:40:50 np0005604215.localdomain sudo[278192]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:50 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'nfs'
Feb 01 09:40:50 np0005604215.localdomain sudo[278382]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:40:50 np0005604215.localdomain sudo[278382]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:50 np0005604215.localdomain sudo[278382]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:50 np0005604215.localdomain sudo[278400]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:40:50 np0005604215.localdomain sudo[278400]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:50 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 01 09:40:50 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'orchestrator'
Feb 01 09:40:50 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:50.887+0000 7fcf09996140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'osd_perf_query'
Feb 01 09:40:51 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:51.034+0000 7fcf09996140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'osd_support'
Feb 01 09:40:51 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:51.099+0000 7fcf09996140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'pg_autoscaler'
Feb 01 09:40:51 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:51.156+0000 7fcf09996140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:51.224+0000 7fcf09996140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'progress'
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'prometheus'
Feb 01 09:40:51 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:51.284+0000 7fcf09996140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain sudo[278400]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:51 np0005604215.localdomain sudo[278450]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:40:51 np0005604215.localdomain sudo[278450]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:51 np0005604215.localdomain sudo[278450]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'rbd_support'
Feb 01 09:40:51 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:51.588+0000 7fcf09996140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'restful'
Feb 01 09:40:51 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:51.670+0000 7fcf09996140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 01 09:40:51 np0005604215.localdomain sudo[278468]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:40:51 np0005604215.localdomain sudo[278468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:51 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'rgw'
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:52.043+0000 7fcf09996140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'rook'
Feb 01 09:40:52 np0005604215.localdomain sudo[278468]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'selftest'
Feb 01 09:40:52 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:52.491+0000 7fcf09996140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'snap_schedule'
Feb 01 09:40:52 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:52.554+0000 7fcf09996140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'stats'
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'status'
Feb 01 09:40:52 np0005604215.localdomain sudo[278507]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:40:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:40:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'telegraf'
Feb 01 09:40:52 np0005604215.localdomain sudo[278507]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:52 np0005604215.localdomain sudo[278507]: pam_unix(sudo:session): session closed for user root
Feb 01 09:40:52 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:52.745+0000 7fcf09996140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:52.807+0000 7fcf09996140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'telemetry'
Feb 01 09:40:52 np0005604215.localdomain systemd[1]: tmp-crun.UyNTar.mount: Deactivated successfully.
Feb 01 09:40:52 np0005604215.localdomain podman[278524]: 2026-02-01 09:40:52.859618252 +0000 UTC m=+0.097402746 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127)
Feb 01 09:40:52 np0005604215.localdomain systemd[1]: tmp-crun.174CZk.mount: Deactivated successfully.
Feb 01 09:40:52 np0005604215.localdomain podman[278524]: 2026-02-01 09:40:52.897806596 +0000 UTC m=+0.135591110 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:40:52 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'test_orchestrator'
Feb 01 09:40:52 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:52.943+0000 7fcf09996140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 01 09:40:52 np0005604215.localdomain podman[278525]: 2026-02-01 09:40:52.909700151 +0000 UTC m=+0.147384982 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:40:52 np0005604215.localdomain podman[278525]: 2026-02-01 09:40:52.992662855 +0000 UTC m=+0.230347696 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:40:53 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:40:53 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 01 09:40:53 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'volumes'
Feb 01 09:40:53 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:53.090+0000 7fcf09996140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 01 09:40:53 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 01 09:40:53 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'zabbix'
Feb 01 09:40:53 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:53.278+0000 7fcf09996140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 01 09:40:53 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 01 09:40:53 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:40:53.337+0000 7fcf09996140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 01 09:40:53 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55c2f70f91e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Feb 01 09:40:53 np0005604215.localdomain ceph-mgr[278126]: client.0 ms_handle_reset on v2:172.18.0.103:6800/1614340691
Feb 01 09:40:58 np0005604215.localdomain sudo[278571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:40:58 np0005604215.localdomain sudo[278571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:40:58 np0005604215.localdomain sudo[278571]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:41:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:41:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:41:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 151207 "" "Go-http-client/1.1"
Feb 01 09:41:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:41:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17288 "" "Go-http-client/1.1"
Feb 01 09:41:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:41:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:41:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:41:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:41:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:41:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:41:01 np0005604215.localdomain sudo[278589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:41:01 np0005604215.localdomain sudo[278589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:01 np0005604215.localdomain sudo[278589]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:41:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:41:01 np0005604215.localdomain sudo[278609]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:01 np0005604215.localdomain sudo[278609]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:01 np0005604215.localdomain podman[278607]: 2026-02-01 09:41:01.747646806 +0000 UTC m=+0.085204232 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, release=1769056855, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Feb 01 09:41:01 np0005604215.localdomain podman[278607]: 2026-02-01 09:41:01.75454825 +0000 UTC m=+0.092105676 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, vcs-type=git, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64)
Feb 01 09:41:01 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:41:01 np0005604215.localdomain podman[278608]: 2026-02-01 09:41:01.791242858 +0000 UTC m=+0.127893866 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:41:01 np0005604215.localdomain podman[278608]: 2026-02-01 09:41:01.825815689 +0000 UTC m=+0.162466727 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:41:01 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:41:02 np0005604215.localdomain podman[278705]: 
Feb 01 09:41:02 np0005604215.localdomain podman[278705]: 2026-02-01 09:41:02.313643532 +0000 UTC m=+0.076641477 container create da202e30075422f929b294226b6db783a1ccde3b106e2aacd84eb9f560d669c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_faraday, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, com.redhat.component=rhceph-container, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, release=1764794109, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, version=7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: Started libpod-conmon-da202e30075422f929b294226b6db783a1ccde3b106e2aacd84eb9f560d669c5.scope.
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:41:02 np0005604215.localdomain podman[278705]: 2026-02-01 09:41:02.281261458 +0000 UTC m=+0.044259433 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:41:02 np0005604215.localdomain podman[278705]: 2026-02-01 09:41:02.390716041 +0000 UTC m=+0.153713976 container init da202e30075422f929b294226b6db783a1ccde3b106e2aacd84eb9f560d669c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_faraday, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:41:02 np0005604215.localdomain podman[278705]: 2026-02-01 09:41:02.401040642 +0000 UTC m=+0.164038587 container start da202e30075422f929b294226b6db783a1ccde3b106e2aacd84eb9f560d669c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_faraday, vcs-type=git, ceph=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, architecture=x86_64, io.buildah.version=1.41.4, release=1764794109, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:41:02 np0005604215.localdomain podman[278705]: 2026-02-01 09:41:02.401669201 +0000 UTC m=+0.164667216 container attach da202e30075422f929b294226b6db783a1ccde3b106e2aacd84eb9f560d669c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_faraday, com.redhat.component=rhceph-container, name=rhceph, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, GIT_BRANCH=main, CEPH_POINT_RELEASE=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:41:02 np0005604215.localdomain kind_faraday[278720]: 167 167
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: libpod-da202e30075422f929b294226b6db783a1ccde3b106e2aacd84eb9f560d669c5.scope: Deactivated successfully.
Feb 01 09:41:02 np0005604215.localdomain podman[278705]: 2026-02-01 09:41:02.404332124 +0000 UTC m=+0.167330079 container died da202e30075422f929b294226b6db783a1ccde3b106e2aacd84eb9f560d669c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_faraday, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.buildah.version=1.41.4, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, RELEASE=main, GIT_CLEAN=True)
Feb 01 09:41:02 np0005604215.localdomain podman[278725]: 2026-02-01 09:41:02.472066693 +0000 UTC m=+0.060042942 container remove da202e30075422f929b294226b6db783a1ccde3b106e2aacd84eb9f560d669c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_faraday, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, release=1764794109, RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, name=rhceph)
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: libpod-conmon-da202e30075422f929b294226b6db783a1ccde3b106e2aacd84eb9f560d669c5.scope: Deactivated successfully.
Feb 01 09:41:02 np0005604215.localdomain podman[278742]: 
Feb 01 09:41:02 np0005604215.localdomain podman[278742]: 2026-02-01 09:41:02.574186358 +0000 UTC m=+0.067069879 container create fb238d7504b1c940a89bbda9a453f78d5df44d9139964b6bff2f80bac05a8a14 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, build-date=2025-12-08T17:28:53Z, release=1764794109, GIT_CLEAN=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, description=Red Hat Ceph Storage 7, RELEASE=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: Started libpod-conmon-fb238d7504b1c940a89bbda9a453f78d5df44d9139964b6bff2f80bac05a8a14.scope.
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:41:02 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c702e3ad5d22d14a762824bbbd34de918998d569142807e785cde7a03f4abe1b/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Feb 01 09:41:02 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c702e3ad5d22d14a762824bbbd34de918998d569142807e785cde7a03f4abe1b/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Feb 01 09:41:02 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c702e3ad5d22d14a762824bbbd34de918998d569142807e785cde7a03f4abe1b/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 09:41:02 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c702e3ad5d22d14a762824bbbd34de918998d569142807e785cde7a03f4abe1b/merged/var/lib/ceph/mon/ceph-np0005604215 supports timestamps until 2038 (0x7fffffff)
Feb 01 09:41:02 np0005604215.localdomain podman[278742]: 2026-02-01 09:41:02.632704713 +0000 UTC m=+0.125588264 container init fb238d7504b1c940a89bbda9a453f78d5df44d9139964b6bff2f80bac05a8a14 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, release=1764794109, ceph=True, com.redhat.component=rhceph-container, vcs-type=git, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, version=7, name=rhceph, io.buildah.version=1.41.4, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:41:02 np0005604215.localdomain podman[278742]: 2026-02-01 09:41:02.643064204 +0000 UTC m=+0.135947745 container start fb238d7504b1c940a89bbda9a453f78d5df44d9139964b6bff2f80bac05a8a14 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, vendor=Red Hat, Inc., architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:41:02 np0005604215.localdomain podman[278742]: 2026-02-01 09:41:02.643329122 +0000 UTC m=+0.136212673 container attach fb238d7504b1c940a89bbda9a453f78d5df44d9139964b6bff2f80bac05a8a14 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, release=1764794109, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:41:02 np0005604215.localdomain podman[278742]: 2026-02-01 09:41:02.550417862 +0000 UTC m=+0.043301443 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-674c251a4adb6769dffcdf1ab7f1f6b0328d62740216215cf5b16aa3ecaf8094-merged.mount: Deactivated successfully.
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: libpod-fb238d7504b1c940a89bbda9a453f78d5df44d9139964b6bff2f80bac05a8a14.scope: Deactivated successfully.
Feb 01 09:41:02 np0005604215.localdomain podman[278742]: 2026-02-01 09:41:02.770572597 +0000 UTC m=+0.263456158 container died fb238d7504b1c940a89bbda9a453f78d5df44d9139964b6bff2f80bac05a8a14 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.buildah.version=1.41.4, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, RELEASE=main, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: tmp-crun.bv3AiM.mount: Deactivated successfully.
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c702e3ad5d22d14a762824bbbd34de918998d569142807e785cde7a03f4abe1b-merged.mount: Deactivated successfully.
Feb 01 09:41:02 np0005604215.localdomain podman[278783]: 2026-02-01 09:41:02.87162573 +0000 UTC m=+0.087885006 container remove fb238d7504b1c940a89bbda9a453f78d5df44d9139964b6bff2f80bac05a8a14 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=musing_wu, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, version=7, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: libpod-conmon-fb238d7504b1c940a89bbda9a453f78d5df44d9139964b6bff2f80bac05a8a14.scope: Deactivated successfully.
Feb 01 09:41:02 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:41:03 np0005604215.localdomain systemd-rc-local-generator[278821]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:41:03 np0005604215.localdomain systemd-sysv-generator[278824]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:41:03 np0005604215.localdomain systemd-rc-local-generator[278863]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:41:03 np0005604215.localdomain systemd-sysv-generator[278869]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:41:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: Starting Ceph mon.np0005604215 for 33fac0b9-80c7-560f-918a-c92d3021ca1e...
Feb 01 09:41:03 np0005604215.localdomain podman[278931]: 
Feb 01 09:41:03 np0005604215.localdomain podman[278931]: 2026-02-01 09:41:03.920654519 +0000 UTC m=+0.067560725 container create e5584900e40475bfb0e0992a38ca26dd007e21b74ac9ad70262abebad82b75d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604215, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, version=7, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., name=rhceph)
Feb 01 09:41:03 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e83968a87c9b2ae83e102a25fe5279ef93ea6c64bdb6aa577d60da58f4409de1/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 09:41:03 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e83968a87c9b2ae83e102a25fe5279ef93ea6c64bdb6aa577d60da58f4409de1/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 09:41:03 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e83968a87c9b2ae83e102a25fe5279ef93ea6c64bdb6aa577d60da58f4409de1/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 09:41:03 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e83968a87c9b2ae83e102a25fe5279ef93ea6c64bdb6aa577d60da58f4409de1/merged/var/lib/ceph/mon/ceph-np0005604215 supports timestamps until 2038 (0x7fffffff)
Feb 01 09:41:03 np0005604215.localdomain podman[278931]: 2026-02-01 09:41:03.971116103 +0000 UTC m=+0.118022319 container init e5584900e40475bfb0e0992a38ca26dd007e21b74ac9ad70262abebad82b75d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604215, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, version=7, io.openshift.tags=rhceph ceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Feb 01 09:41:03 np0005604215.localdomain systemd[1]: tmp-crun.7ODqRs.mount: Deactivated successfully.
Feb 01 09:41:03 np0005604215.localdomain podman[278931]: 2026-02-01 09:41:03.889264186 +0000 UTC m=+0.036170432 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: set uid:gid to 167:167 (ceph:ceph)
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Feb 01 09:41:04 np0005604215.localdomain bash[278931]: e5584900e40475bfb0e0992a38ca26dd007e21b74ac9ad70262abebad82b75d8
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pidfile_write: ignore empty --pid-file
Feb 01 09:41:04 np0005604215.localdomain podman[278931]: 2026-02-01 09:41:04.024920441 +0000 UTC m=+0.171826647 container start e5584900e40475bfb0e0992a38ca26dd007e21b74ac9ad70262abebad82b75d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604215, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.expose-services=, ceph=True, version=7, distribution-scope=public, GIT_CLEAN=True, architecture=x86_64, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., name=rhceph, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:41:04 np0005604215.localdomain systemd[1]: Started Ceph mon.np0005604215 for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: load: jerasure load: lrc 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: RocksDB version: 7.9.2
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Git sha 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Compile date 2025-09-23 00:00:00
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: DB SUMMARY
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: DB Session ID:  7PKSWXLLH9M8NB5FULPW
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: CURRENT file:  CURRENT
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: IDENTITY file:  IDENTITY
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005604215/store.db dir, Total Num: 0, files: 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005604215/store.db: 000004.log size: 761 ; 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                         Options.error_if_exists: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                       Options.create_if_missing: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                         Options.paranoid_checks: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                                     Options.env: 0x560754d4f9e0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                                      Options.fs: PosixFileSystem
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                                Options.info_log: 0x560755f86d20
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                Options.max_file_opening_threads: 16
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                              Options.statistics: (nil)
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                               Options.use_fsync: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                       Options.max_log_file_size: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                         Options.allow_fallocate: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                        Options.use_direct_reads: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:          Options.create_missing_column_families: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                              Options.db_log_dir: 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                                 Options.wal_dir: 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                   Options.advise_random_on_open: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                    Options.write_buffer_manager: 0x560755f97540
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                            Options.rate_limiter: (nil)
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                  Options.unordered_write: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                               Options.row_cache: None
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                              Options.wal_filter: None
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.allow_ingest_behind: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.two_write_queues: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.manual_wal_flush: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.wal_compression: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.atomic_flush: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                 Options.log_readahead_size: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.allow_data_in_errors: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.db_host_id: __hostname__
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.max_background_jobs: 2
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.max_background_compactions: -1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.max_subcompactions: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.max_total_wal_size: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                          Options.max_open_files: -1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                          Options.bytes_per_sync: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:       Options.compaction_readahead_size: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                  Options.max_background_flushes: -1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Compression algorithms supported:
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         kZSTD supported: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         kXpressCompression supported: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         kBZip2Compression supported: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         kLZ4Compression supported: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         kZlibCompression supported: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         kLZ4HCCompression supported: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         kSnappyCompression supported: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005604215/store.db/MANIFEST-000005
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:           Options.merge_operator: 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:        Options.compaction_filter: None
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x560755f86980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x560755f83350
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:        Options.write_buffer_size: 33554432
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:  Options.max_write_buffer_number: 2
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:          Options.compression: NoCompression
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.num_levels: 7
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                   Options.table_properties_collectors: 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                           Options.bloom_locality: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                               Options.ttl: 2592000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                       Options.enable_blob_files: false
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                           Options.min_blob_size: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005604215/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: c9a40fa3-7e53-4325-8a76-a86e4a0fff5d
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938864045657, "job": 1, "event": "recovery_started", "wal_files": [4]}
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938864048779, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1887, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 773, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 651, "raw_average_value_size": 130, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938864, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938864049164, "job": 1, "event": "recovery_finished"}
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x560755faae00
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: DB pointer 0x5607560a0000
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.84 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Sum      1/0    1.84 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.6      0.00              0.00         1    0.003       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.09 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.09 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x560755f83350#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 1.6e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 does not exist in monmap, will attempt to join an existing cluster
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: using public_addr v2:172.18.0.108:0/0 -> [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0]
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: starting mon.np0005604215 rank -1 at public addrs [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] at bind addrs [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005604215 fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:04 np0005604215.localdomain sudo[278609]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@-1(???) e0 preinit fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@-1(synchronizing) e3 sync_obtain_latest_monmap
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@-1(synchronizing) e3 sync_obtain_latest_monmap obtained monmap e3
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@-1(synchronizing).mds e16 new map
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@-1(synchronizing).mds e16 print_map
                                                           e16
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        14
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2026-02-01T07:59:04.480309+0000
                                                           modified        2026-02-01T09:39:55.510678+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        79
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26329}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26329 members: 26329
                                                           [mds.mds.np0005604212.tkdkxt{0:26329} state up:active seq 12 addr [v2:172.18.0.106:6808/1133321306,v1:172.18.0.106:6809/1133321306] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005604215.rwvxvg{-1:16872} state up:standby seq 1 addr [v2:172.18.0.108:6808/2262553558,v1:172.18.0.108:6809/2262553558] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005604213.jdbvyh{-1:16878} state up:standby seq 1 addr [v2:172.18.0.107:6808/3323601884,v1:172.18.0.107:6809/3323601884] compat {c=[1],r=[1],i=[17ff]}]
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@-1(synchronizing).osd e81 crush map has features 3314933000852226048, adjusting msgr requires
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@-1(synchronizing).osd e81 crush map has features 288514051259236352, adjusting msgr requires
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: osdmap e80: 6 total, 6 up, 6 in
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 3 up:standby
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3724: 177 pgs: 177 active+clean; 104 MiB data, 530 MiB used, 41 GiB / 42 GiB avail; 8.2 KiB/s rd, 0 B/s wr, 7 op/s
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604210.yulljq"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3725: 177 pgs: 177 active+clean; 104 MiB data, 530 MiB used, 41 GiB / 42 GiB avail; 7.8 KiB/s rd, 0 B/s wr, 7 op/s
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Removing key for mds.mds.np0005604210.yulljq
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth rm", "entity": "mds.mds.np0005604210.yulljq"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd='[{"prefix": "auth rm", "entity": "mds.mds.np0005604210.yulljq"}]': finished
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Removing daemon mds.mds.np0005604211.ggsxcc from np0005604211.localdomain -- ports []
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3726: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail; 1.9 KiB/s rd, 0 B/s wr, 1 op/s
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: osdmap e81: 6 total, 6 up, 6 in
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3728: 177 pgs: 177 active+clean; 104 MiB data, 548 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604211.ggsxcc"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3729: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Removing key for mds.mds.np0005604211.ggsxcc
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth rm", "entity": "mds.mds.np0005604211.ggsxcc"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd='[{"prefix": "auth rm", "entity": "mds.mds.np0005604211.ggsxcc"}]': finished
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3730: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3731: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3732: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3733: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3734: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3735: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3736: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3737: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3738: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3739: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3740: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/2080126051' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/2080126051' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3741: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.16986 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604212.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label mgr to host np0005604212.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3742: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.16992 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604213.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label mgr to host np0005604213.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/2547260514' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/695635790' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3743: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17010 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604215.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label mgr to host np0005604215.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/3420980041' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/448651037' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17028 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Saving service mgr spec with placement label:mgr
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Deploying daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3744: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17034 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mgr", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/2970418528' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Deploying daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3745: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/2289517100' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17058 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604209.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label mon to host np0005604209.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3746: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17064 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604209.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label _admin to host np0005604209.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd='[{"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]}]': finished
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Deploying daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17076 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604210.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label mon to host np0005604210.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3747: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Standby manager daemon np0005604212.oynhpm started
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17082 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604210.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label _admin to host np0005604210.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mgrmap e12: np0005604209.isqrps(active, since 2h), standbys: np0005604211.cuflqz, np0005604210.rirrtk, np0005604212.oynhpm
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3748: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17094 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604211.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label mon to host np0005604211.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Standby manager daemon np0005604213.caiaeh started
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17100 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604211.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label _admin to host np0005604211.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mgrmap e13: np0005604209.isqrps(active, since 2h), standbys: np0005604211.cuflqz, np0005604213.caiaeh, np0005604210.rirrtk, np0005604212.oynhpm
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3749: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17106 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604212.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label mon to host np0005604212.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Standby manager daemon np0005604215.uhhqtv started
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3750: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mgrmap e14: np0005604209.isqrps(active, since 2h), standbys: np0005604211.cuflqz, np0005604213.caiaeh, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17112 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604212.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label _admin to host np0005604212.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17118 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604213.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label mon to host np0005604213.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3751: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17124 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604213.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label _admin to host np0005604213.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3752: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17130 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604215.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label mon to host np0005604215.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17136 -' entity='client.admin' cmd=[{"prefix": "orch host label add", "hostname": "np0005604215.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Added label _admin to host np0005604215.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3753: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.17142 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Saving service mon spec with placement label:mon
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: Deploying daemon mon.np0005604215 on np0005604215.localdomain
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: pgmap v3754: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: from='client.26678 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604212", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:41:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@-1(synchronizing).paxosservice(auth 1..34) refresh upgraded, format 0 -> 3
Feb 01 09:41:04 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55c2f70f91e0 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Feb 01 09:41:06 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@-1(probing) e4  my rank is now 3 (was -1)
Feb 01 09:41:06 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:41:06 np0005604215.localdomain ceph-mon[278949]: paxos.3).electionLogic(0) init, first boot, initializing epoch at 1 
Feb 01 09:41:06 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e4 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:06 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Feb 01 09:41:06 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Feb 01 09:41:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:41:07 np0005604215.localdomain systemd[1]: tmp-crun.xkU4JE.mount: Deactivated successfully.
Feb 01 09:41:07 np0005604215.localdomain podman[278988]: 2026-02-01 09:41:07.902410992 +0000 UTC m=+0.086801212 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 01 09:41:07 np0005604215.localdomain podman[278988]: 2026-02-01 09:41:07.915695134 +0000 UTC m=+0.100085314 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 01 09:41:07 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:41:08 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e4 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e4 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e4 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e4 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mgrc update_daemon_metadata mon.np0005604215 metadata {addrs=[v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005604215.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005604215.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116604,os=Linux}
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: pgmap v3755: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: Deploying daemon mon.np0005604213 on np0005604213.localdomain
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604209"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604209 calling monitor election
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604210 calling monitor election
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: pgmap v3756: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: pgmap v3757: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604209 is new leader, mons np0005604209,np0005604211,np0005604210,np0005604215 in quorum (ranks 0,1,2,3)
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: monmap epoch 4
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:41:04.288930+0000
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604209
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604210
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: osdmap e81: 6 total, 6 up, 6 in
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mgrmap e14: np0005604209.isqrps(active, since 2h), standbys: np0005604211.cuflqz, np0005604213.caiaeh, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e4 handle_auth_request failed to assign global_id
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:41:10.464392) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938870464502, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 9165, "num_deletes": 254, "total_data_size": 10211253, "memory_usage": 10523112, "flush_reason": "Manual Compaction"}
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938870520622, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 9045802, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 9170, "table_properties": {"data_size": 8995028, "index_size": 27564, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 22149, "raw_key_size": 233345, "raw_average_key_size": 26, "raw_value_size": 8843278, "raw_average_value_size": 1000, "num_data_blocks": 1059, "num_entries": 8841, "num_filter_entries": 8841, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938864, "oldest_key_time": 1769938864, "file_creation_time": 1769938870, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 56292 microseconds, and 20668 cpu microseconds.
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:41:10.520690) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 9045802 bytes OK
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:41:10.520717) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:41:10.522711) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:41:10.522734) EVENT_LOG_v1 {"time_micros": 1769938870522728, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:41:10.522751) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 10147324, prev total WAL file size 10147324, number of live WAL files 2.
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:41:10.524478) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F730039353338' seq:72057594037927935, type:22 .. '7061786F730039373930' seq:0, type:0; will stop at (end)
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(8833KB) 8(1887B)]
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938870524578, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 9047689, "oldest_snapshot_seqno": -1}
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 8591 keys, 9042432 bytes, temperature: kUnknown
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938870574091, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 9042432, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 8992313, "index_size": 27554, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 21509, "raw_key_size": 228827, "raw_average_key_size": 26, "raw_value_size": 8843808, "raw_average_value_size": 1029, "num_data_blocks": 1059, "num_entries": 8591, "num_filter_entries": 8591, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938864, "oldest_key_time": 0, "file_creation_time": 1769938870, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:41:10.574395) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 9042432 bytes
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:41:10.576091) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 182.5 rd, 182.4 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(8.6, 0.0 +0.0 blob) out(8.6 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 8846, records dropped: 255 output_compression: NoCompression
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:41:10.576120) EVENT_LOG_v1 {"time_micros": 1769938870576107, "job": 4, "event": "compaction_finished", "compaction_time_micros": 49576, "compaction_time_cpu_micros": 27734, "output_level": 6, "num_output_files": 1, "total_output_size": 9042432, "num_input_records": 8846, "num_output_records": 8591, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938870577431, "job": 4, "event": "table_file_deletion", "file_number": 14}
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938870577483, "job": 4, "event": "table_file_deletion", "file_number": 8}
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:41:10.524325) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:41:10 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e4  adding peer [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] to list of hints
Feb 01 09:41:10 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55c2f70f8f20 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Feb 01 09:41:11 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:41:11 np0005604215.localdomain ceph-mon[278949]: paxos.3).electionLogic(18) init, last seen epoch 18
Feb 01 09:41:11 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:11 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:12 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Feb 01 09:41:12 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Feb 01 09:41:12 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Feb 01 09:41:14 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Feb 01 09:41:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:41:14 np0005604215.localdomain podman[279008]: 2026-02-01 09:41:14.864621949 +0000 UTC m=+0.078820494 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:41:14 np0005604215.localdomain podman[279008]: 2026-02-01 09:41:14.876635011 +0000 UTC m=+0.090833546 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:41:14 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e5 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604209"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604209 calling monitor election
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604210 calling monitor election
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: pgmap v3759: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604213 calling monitor election
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: pgmap v3760: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604209 is new leader, mons np0005604209,np0005604211,np0005604210,np0005604215,np0005604213 in quorum (ranks 0,1,2,3,4)
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: monmap epoch 5
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:41:10.979996+0000
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604209
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604210
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005604213
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: osdmap e81: 6 total, 6 up, 6 in
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mgrmap e14: np0005604209.isqrps(active, since 2h), standbys: np0005604211.cuflqz, np0005604213.caiaeh, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e5 handle_auth_request failed to assign global_id
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e5  adding peer [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] to list of hints
Feb 01 09:41:16 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55c2f70f9600 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: paxos.3).electionLogic(22) init, last seen epoch 22
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:16 np0005604215.localdomain sudo[279033]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:41:16 np0005604215.localdomain sudo[279033]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:16 np0005604215.localdomain sudo[279033]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:16 np0005604215.localdomain sudo[279051]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:41:16 np0005604215.localdomain sudo[279051]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:16 np0005604215.localdomain sudo[279051]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:16 np0005604215.localdomain sudo[279069]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:41:16 np0005604215.localdomain sudo[279069]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:17 np0005604215.localdomain podman[279159]: 2026-02-01 09:41:17.741282825 +0000 UTC m=+0.092458938 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., version=7, ceph=True, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.41.4, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z)
Feb 01 09:41:17 np0005604215.localdomain podman[279159]: 2026-02-01 09:41:17.849734436 +0000 UTC m=+0.200910589 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, release=1764794109, io.openshift.tags=rhceph ceph, architecture=x86_64, distribution-scope=public, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., ceph=True, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, name=rhceph, GIT_CLEAN=True)
Feb 01 09:41:18 np0005604215.localdomain sudo[279069]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:21 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:41:21 np0005604215.localdomain ceph-mon[278949]: paxos.3).electionLogic(25) init, last seen epoch 25, mid-election, bumping
Feb 01 09:41:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(electing) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e6 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:21 np0005604215.localdomain sudo[279277]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:41:21 np0005604215.localdomain sudo[279277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:21 np0005604215.localdomain sudo[279277]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain sudo[279295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:41:22 np0005604215.localdomain sudo[279295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279295]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain sudo[279313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:41:22 np0005604215.localdomain sudo[279313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279313]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain sudo[279331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:22 np0005604215.localdomain sudo[279331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279331]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain sudo[279349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:41:22 np0005604215.localdomain sudo[279349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279349]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain sudo[279383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:41:22 np0005604215.localdomain sudo[279383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279383]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain sudo[279401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:41:22 np0005604215.localdomain sudo[279401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279401]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain sudo[279419]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:41:22 np0005604215.localdomain sudo[279419]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279419]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain sudo[279437]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:41:22 np0005604215.localdomain sudo[279437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279437]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain sudo[279455]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:41:22 np0005604215.localdomain sudo[279455]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279455]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain sudo[279473]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:41:22 np0005604215.localdomain sudo[279473]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279473]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604210 calling monitor election
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604213 calling monitor election
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604212 calling monitor election
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: monmap epoch 6
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:41:16.652287+0000
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604209
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604210
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005604213
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 5: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005604212
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: osdmap e81: 6 total, 6 up, 6 in
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604210 calling monitor election
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mgrmap e14: np0005604209.isqrps(active, since 2h), standbys: np0005604211.cuflqz, np0005604213.caiaeh, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: Health check failed: 2/6 mons down, quorum np0005604209,np0005604211,np0005604210,np0005604215 (MON_DOWN)
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604209 calling monitor election
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604209 is new leader, mons np0005604209,np0005604211,np0005604210,np0005604215,np0005604213,np0005604212 in quorum (ranks 0,1,2,3,4,5)
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: monmap epoch 6
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:41:16.652287+0000
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604209
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604210
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 3: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 4: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005604213
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: 5: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005604212
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: osdmap e81: 6 total, 6 up, 6 in
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: mgrmap e14: np0005604209.isqrps(active, since 2h), standbys: np0005604211.cuflqz, np0005604213.caiaeh, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: Health check cleared: MON_DOWN (was: 2/6 mons down, quorum np0005604209,np0005604211,np0005604210,np0005604215)
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: Cluster is now healthy
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: Updating np0005604209.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: pgmap v3764: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:41:22 np0005604215.localdomain sudo[279491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:22 np0005604215.localdomain sudo[279491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279491]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:22 np0005604215.localdomain sudo[279509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:41:22 np0005604215.localdomain sudo[279509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:22 np0005604215.localdomain sudo[279509]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain sudo[279543]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:41:23 np0005604215.localdomain sudo[279543]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:41:23 np0005604215.localdomain sudo[279543]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:41:23 np0005604215.localdomain sudo[279563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:41:23 np0005604215.localdomain sudo[279563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279563]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain podman[279562]: 2026-02-01 09:41:23.109435293 +0000 UTC m=+0.069693346 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:41:23 np0005604215.localdomain podman[279562]: 2026-02-01 09:41:23.12269548 +0000 UTC m=+0.082953493 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:41:23 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:41:23 np0005604215.localdomain sudo[279601]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:23 np0005604215.localdomain sudo[279601]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279601]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain podman[279561]: 2026-02-01 09:41:23.158668357 +0000 UTC m=+0.118453966 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 01 09:41:23 np0005604215.localdomain podman[279561]: 2026-02-01 09:41:23.182083972 +0000 UTC m=+0.141869641 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:41:23 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:41:23 np0005604215.localdomain sudo[279635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:41:23 np0005604215.localdomain sudo[279635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279635]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain sudo[279663]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:41:23 np0005604215.localdomain sudo[279663]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279663]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain sudo[279681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:41:23 np0005604215.localdomain sudo[279681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279681]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain sudo[279699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:23 np0005604215.localdomain sudo[279699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279699]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain sudo[279717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:41:23 np0005604215.localdomain sudo[279717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279717]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain sudo[279751]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:41:23 np0005604215.localdomain sudo[279751]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279751]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain sudo[279769]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:41:23 np0005604215.localdomain sudo[279769]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279769]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain sudo[279787]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 01 09:41:23 np0005604215.localdomain sudo[279787]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279787]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain sudo[279805]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:41:23 np0005604215.localdomain sudo[279805]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279805]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:23 np0005604215.localdomain sudo[279823]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:41:23 np0005604215.localdomain sudo[279823]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:23 np0005604215.localdomain sudo[279823]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:24 np0005604215.localdomain sudo[279841]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:41:24 np0005604215.localdomain sudo[279841]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:24 np0005604215.localdomain sudo[279841]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:24 np0005604215.localdomain sudo[279859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:24 np0005604215.localdomain sudo[279859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:24 np0005604215.localdomain sudo[279859]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:24 np0005604215.localdomain sudo[279877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:41:24 np0005604215.localdomain sudo[279877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:24 np0005604215.localdomain sudo[279877]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: Updating np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: from='client.34101 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604212", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:24 np0005604215.localdomain sudo[279911]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:41:24 np0005604215.localdomain sudo[279911]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:24 np0005604215.localdomain sudo[279911]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:24 np0005604215.localdomain sudo[279929]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:41:24 np0005604215.localdomain sudo[279929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:24 np0005604215.localdomain sudo[279929]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:24 np0005604215.localdomain sudo[279947]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:41:24 np0005604215.localdomain sudo[279947]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:24 np0005604215.localdomain sudo[279947]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:24 np0005604215.localdomain sudo[279965]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:41:24 np0005604215.localdomain sudo[279965]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:24 np0005604215.localdomain sudo[279965]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:25 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:41:25 np0005604215.localdomain ceph-mon[278949]: from='client.17172 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604213", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:41:25 np0005604215.localdomain ceph-mon[278949]: pgmap v3765: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:41:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:41:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:41:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:26 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604209 (monmap changed)...
Feb 01 09:41:26 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604209 on np0005604209.localdomain
Feb 01 09:41:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604209.isqrps", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:41:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e6 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.103:0/3951772484' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: from='client.34113 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604215", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604209.isqrps (monmap changed)...
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604209.isqrps on np0005604209.localdomain
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: pgmap v3766: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604209.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:27 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.103:0/3951772484' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 01 09:41:28 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604209 (monmap changed)...
Feb 01 09:41:28 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604209 on np0005604209.localdomain
Feb 01 09:41:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:41:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:29 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604210 (monmap changed)...
Feb 01 09:41:29 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604210 on np0005604210.localdomain
Feb 01 09:41:29 np0005604215.localdomain ceph-mon[278949]: pgmap v3767: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:29 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.103:0/1146543056' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Feb 01 09:41:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:41:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:41:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:41:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:41:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:41:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:41:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:41:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17772 "" "Go-http-client/1.1"
Feb 01 09:41:30 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604210 (monmap changed)...
Feb 01 09:41:30 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604210 on np0005604210.localdomain
Feb 01 09:41:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' 
Feb 01 09:41:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:41:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.14120 172.18.0.103:0/1843309307' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:30 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon).osd e81 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Feb 01 09:41:30 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon).osd e81 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Feb 01 09:41:30 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon).osd e82 e82: 6 total, 6 up, 6 in
Feb 01 09:41:30 np0005604215.localdomain sshd[26130]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain sshd[26320]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain sshd[26148]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain sshd[26170]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain sshd[26227]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain sshd[26208]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain sshd[26284]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-14.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain sshd[26246]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-16.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain sshd[26339]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-21.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain sshd[26303]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-22.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain sshd[26265]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain sshd[26189]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-23.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-20.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-18.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-17.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-24.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-19.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-25.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-26.scope: Deactivated successfully.
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: session-26.scope: Consumed 3min 16.410s CPU time.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 14 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 16 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 26 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 19 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 17 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 18 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 23 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 20 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 22 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 21 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 24 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Session 25 logged out. Waiting for processes to exit.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 14.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 16.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 21.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 22.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 23.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 20.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 18.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 17.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 24.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 19.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 25.
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: Removed session 26.
Feb 01 09:41:30 np0005604215.localdomain sshd[279983]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:41:30 np0005604215.localdomain sshd[279983]: Accepted publickey for ceph-admin from 192.168.122.105 port 59384 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 09:41:30 np0005604215.localdomain systemd-logind[761]: New session 64 of user ceph-admin.
Feb 01 09:41:30 np0005604215.localdomain systemd[1]: Started Session 64 of User ceph-admin.
Feb 01 09:41:30 np0005604215.localdomain sshd[279983]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 09:41:31 np0005604215.localdomain sudo[279987]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:41:31 np0005604215.localdomain sudo[279987]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:31 np0005604215.localdomain sudo[279987]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:31 np0005604215.localdomain sudo[280005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:41:31 np0005604215.localdomain sudo[280005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)...
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: pgmap v3768: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.103:0/3887042624' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: Activating manager daemon np0005604211.cuflqz
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: osdmap e82: 6 total, 6 up, 6 in
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604209"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.103:0/3887042624' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: mgrmap e15: np0005604211.cuflqz(active, starting, since 0.0714218s), standbys: np0005604213.caiaeh, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr metadata", "who": "np0005604211.cuflqz", "id": "np0005604211.cuflqz"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr metadata", "who": "np0005604210.rirrtk", "id": "np0005604210.rirrtk"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: Manager daemon np0005604211.cuflqz is now available
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/mirror_snapshot_schedule"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/mirror_snapshot_schedule"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/trash_purge_schedule"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/trash_purge_schedule"} : dispatch
Feb 01 09:41:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:41:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:41:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:41:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:41:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:41:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:41:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:41:31 np0005604215.localdomain systemd[1]: tmp-crun.xUrXqz.mount: Deactivated successfully.
Feb 01 09:41:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:41:31 np0005604215.localdomain podman[280068]: 2026-02-01 09:41:31.895596108 +0000 UTC m=+0.094253867 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.7, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., distribution-scope=public)
Feb 01 09:41:31 np0005604215.localdomain podman[280068]: 2026-02-01 09:41:31.912659853 +0000 UTC m=+0.111317602 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, maintainer=Red Hat, Inc.)
Feb 01 09:41:31 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:41:32 np0005604215.localdomain podman[280094]: 2026-02-01 09:41:32.037407946 +0000 UTC m=+0.132383833 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Feb 01 09:41:32 np0005604215.localdomain podman[280094]: 2026-02-01 09:41:32.043109925 +0000 UTC m=+0.138085852 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 01 09:41:32 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:41:32 np0005604215.localdomain podman[280131]: 2026-02-01 09:41:32.159274867 +0000 UTC m=+0.089049573 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.41.4, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, RELEASE=main)
Feb 01 09:41:32 np0005604215.localdomain podman[280131]: 2026-02-01 09:41:32.28910823 +0000 UTC m=+0.218882966 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_CLEAN=True, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:41:32 np0005604215.localdomain ceph-mon[278949]: mgrmap e16: np0005604211.cuflqz(active, since 1.1111s), standbys: np0005604213.caiaeh, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:41:32 np0005604215.localdomain ceph-mon[278949]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:32 np0005604215.localdomain sudo[280005]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:33 np0005604215.localdomain sudo[280250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:41:33 np0005604215.localdomain sudo[280250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:33 np0005604215.localdomain sudo[280250]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:33 np0005604215.localdomain sudo[280268]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:41:33 np0005604215.localdomain sudo[280268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:41:32] ENGINE Bus STARTING
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:41:32] ENGINE Serving on https://172.18.0.105:7150
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:41:32] ENGINE Client ('172.18.0.105', 36928) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:41:32] ENGINE Serving on http://172.18.0.105:8765
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:41:32] ENGINE Bus STARTED
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: mgrmap e17: np0005604211.cuflqz(active, since 2s), standbys: np0005604213.caiaeh, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:33 np0005604215.localdomain sudo[280268]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:33 np0005604215.localdomain sudo[280317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:41:33 np0005604215.localdomain sudo[280317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:33 np0005604215.localdomain sudo[280317]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:33 np0005604215.localdomain sudo[280335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:41:33 np0005604215.localdomain sudo[280335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon).osd e82 _set_new_cache_sizes cache_size:1019840633 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:41:34 np0005604215.localdomain sudo[280335]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:34 np0005604215.localdomain sudo[280373]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:41:34 np0005604215.localdomain sudo[280373]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:34 np0005604215.localdomain sudo[280373]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:34 np0005604215.localdomain sudo[280391]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:41:34 np0005604215.localdomain sudo[280391]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:34 np0005604215.localdomain sudo[280391]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:34 np0005604215.localdomain sudo[280409]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:41:34 np0005604215.localdomain sudo[280409]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:34 np0005604215.localdomain sudo[280409]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:34 np0005604215.localdomain sudo[280427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:34 np0005604215.localdomain sudo[280427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:34 np0005604215.localdomain sudo[280427]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:34 np0005604215.localdomain sudo[280445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:41:34 np0005604215.localdomain sudo[280445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:34 np0005604215.localdomain sudo[280445]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:41:35 np0005604215.localdomain sudo[280479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280479]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:41:35 np0005604215.localdomain sudo[280497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280497]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:41:35 np0005604215.localdomain sudo[280515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280515]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604209", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604209", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Updating np0005604209.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: mgrmap e18: np0005604211.cuflqz(active, since 4s), standbys: np0005604213.caiaeh, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/3527258170' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/3527258170' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:41:35 np0005604215.localdomain sudo[280533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:41:35 np0005604215.localdomain sudo[280533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280533]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:41:35 np0005604215.localdomain sudo[280551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280551]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:41:35 np0005604215.localdomain sudo[280569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280569]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:35 np0005604215.localdomain sudo[280587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280587]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:41:35 np0005604215.localdomain sudo[280605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280605]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:41:35 np0005604215.localdomain sudo[280639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280639]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280657]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:41:35 np0005604215.localdomain sudo[280657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280657]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:35 np0005604215.localdomain sudo[280675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280675]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:41:35 np0005604215.localdomain sudo[280693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280693]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:35 np0005604215.localdomain sudo[280711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:41:35 np0005604215.localdomain sudo[280711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:35 np0005604215.localdomain sudo[280711]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:41:36 np0005604215.localdomain sudo[280729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280729]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:36 np0005604215.localdomain sudo[280747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280747]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280765]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:41:36 np0005604215.localdomain sudo[280765]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280765]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:41:36 np0005604215.localdomain sudo[280799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280799]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280817]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:41:36 np0005604215.localdomain sudo[280817]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280817]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280835]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 01 09:41:36 np0005604215.localdomain sudo[280835]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280835]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280853]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:41:36 np0005604215.localdomain sudo[280853]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280853]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain ceph-mon[278949]: Updating np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:36 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:36 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:36 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:36 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:36 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:41:36 np0005604215.localdomain ceph-mon[278949]: Standby manager daemon np0005604209.isqrps started
Feb 01 09:41:36 np0005604215.localdomain sudo[280871]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:41:36 np0005604215.localdomain sudo[280871]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280871]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280889]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:41:36 np0005604215.localdomain sudo[280889]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280889]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280907]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:36 np0005604215.localdomain sudo[280907]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280907]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280925]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:41:36 np0005604215.localdomain sudo[280925]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280925]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280959]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:41:36 np0005604215.localdomain sudo[280959]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280959]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:36 np0005604215.localdomain sudo[280977]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:41:36 np0005604215.localdomain sudo[280977]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:36 np0005604215.localdomain sudo[280977]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:37 np0005604215.localdomain sudo[280995]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain sudo[280995]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:37 np0005604215.localdomain sudo[280995]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:37.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:41:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:37.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:41:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:37.102 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:41:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:37.309 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604209.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: mgrmap e19: np0005604211.cuflqz(active, since 6s), standbys: np0005604213.caiaeh, np0005604209.isqrps, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:41:37 np0005604215.localdomain sudo[281013]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:41:37 np0005604215.localdomain sudo[281013]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:37 np0005604215.localdomain sudo[281013]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:38.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:41:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:38.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:41:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:38.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:41:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:38.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:41:38 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:38 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:38 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:41:38 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:41:38 np0005604215.localdomain podman[281031]: 2026-02-01 09:41:38.866715306 +0000 UTC m=+0.079432943 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:41:38 np0005604215.localdomain podman[281031]: 2026-02-01 09:41:38.904926263 +0000 UTC m=+0.117643830 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127)
Feb 01 09:41:38 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:41:39 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon).osd e82 _set_new_cache_sizes cache_size:1020050934 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.234 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.234 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.235 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.235 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.235 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:41:39 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)...
Feb 01 09:41:39 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain
Feb 01 09:41:39 np0005604215.localdomain ceph-mon[278949]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 01 09:41:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:41:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:41:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:39 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:41:39 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3704204462' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.690 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.906 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.908 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12432MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.909 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:41:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:39.909 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:41:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:40.000 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:41:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:40.001 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:41:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:40.028 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:41:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:40.474 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:41:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:40.480 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:41:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:40.504 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:41:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:40.506 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:41:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:40.507 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.597s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:41:40 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604211 (monmap changed)...
Feb 01 09:41:40 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain
Feb 01 09:41:40 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/3704204462' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:41:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:40 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/664833400' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 12 op/s
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:41:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:41:41.758 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:41:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:41:41.758 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:41:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:41:41.758 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:41:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:42.502 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:41:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:42.502 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:41:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:42.522 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:41:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:41:42.523 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:41:42 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 01 09:41:42 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 01 09:41:42 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 01 09:41:42 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 01 09:41:42 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/4252264274' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:42 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:42 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:42 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:41:42 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:43 np0005604215.localdomain ceph-mon[278949]: from='client.34139 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:41:43 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.1 (monmap changed)...
Feb 01 09:41:43 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:41:43 np0005604215.localdomain ceph-mon[278949]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:41:43 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/429959371' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:41:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:44 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon).osd e82 _set_new_cache_sizes cache_size:1020054664 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:41:44 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.4 (monmap changed)...
Feb 01 09:41:44 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:41:44 np0005604215.localdomain ceph-mon[278949]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:41:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:44 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 01 09:41:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:41:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:41:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:44 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 01 09:41:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:41:45 np0005604215.localdomain podman[281094]: 2026-02-01 09:41:45.857541409 +0000 UTC m=+0.071469092 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:41:45 np0005604215.localdomain podman[281094]: 2026-02-01 09:41:45.866450848 +0000 UTC m=+0.080378532 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:41:45 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:41:46 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e6 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:41:46 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1423703' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:46 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55c2f70f9080 mon_map magic: 0 from mon.2 v2:172.18.0.104:3300/0
Feb 01 09:41:46 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@3(peon) e7  my rank is now 2 (was 3)
Feb 01 09:41:46 np0005604215.localdomain ceph-mgr[278126]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Feb 01 09:41:46 np0005604215.localdomain ceph-mgr[278126]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Feb 01 09:41:46 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55c2f70f9600 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0
Feb 01 09:41:46 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:41:46 np0005604215.localdomain ceph-mon[278949]: paxos.2).electionLogic(28) init, last seen epoch 28
Feb 01 09:41:46 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:46 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e7 handle_auth_request failed to assign global_id
Feb 01 09:41:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e7 handle_auth_request failed to assign global_id
Feb 01 09:41:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e7 handle_auth_request failed to assign global_id
Feb 01 09:41:50 np0005604215.localdomain ceph-mds[276952]: mds.beacon.mds.np0005604215.rwvxvg missed beacon ack from the monitors
Feb 01 09:41:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e7 handle_auth_request failed to assign global_id
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: paxos.2).electionLogic(31) init, last seen epoch 31, mid-election, bumping
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e7 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/3584391546' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/1423703' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='client.34167 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005604209"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: Remove daemons mon.np0005604209
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "quorum_status"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604212 (monmap changed)...
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: Safe to remove mon.np0005604209: new quorum should be ['np0005604211', 'np0005604210', 'np0005604215', 'np0005604213', 'np0005604212'] (from ['np0005604211', 'np0005604210', 'np0005604215', 'np0005604213', 'np0005604212'])
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: Removing monitor np0005604209 from monmap...
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon rm", "name": "np0005604209"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: Removing daemon mon.np0005604209 from np0005604209.localdomain -- ports []
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604212 calling monitor election
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604210 calling monitor election
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604213 calling monitor election
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 is new leader, mons np0005604211,np0005604210,np0005604215,np0005604213 in quorum (ranks 0,1,2,3)
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604210 calling monitor election
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604213 calling monitor election
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 is new leader, mons np0005604211,np0005604210,np0005604215,np0005604213,np0005604212 in quorum (ranks 0,1,2,3,4)
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: monmap epoch 7
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:41:46.450126+0000
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604210
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005604213
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: 4: [v2:172.18.0.106:3300/0,v1:172.18.0.106:6789/0] mon.np0005604212
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: osdmap e82: 6 total, 6 up, 6 in
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: mgrmap e19: np0005604211.cuflqz(active, since 21s), standbys: np0005604213.caiaeh, np0005604209.isqrps, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:41:51 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:41:52 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain
Feb 01 09:41:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:41:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:41:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:53 np0005604215.localdomain ceph-mon[278949]: from='client.34198 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604209.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:53 np0005604215.localdomain ceph-mon[278949]: Removed label mon from host np0005604209.localdomain
Feb 01 09:41:53 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 01 09:41:53 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 01 09:41:53 np0005604215.localdomain ceph-mon[278949]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:53 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:53 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:53 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 01 09:41:53 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:41:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:41:53 np0005604215.localdomain podman[281118]: 2026-02-01 09:41:53.873380571 +0000 UTC m=+0.088014781 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 01 09:41:53 np0005604215.localdomain systemd[1]: tmp-crun.2ZAZDq.mount: Deactivated successfully.
Feb 01 09:41:53 np0005604215.localdomain podman[281118]: 2026-02-01 09:41:53.93232183 +0000 UTC m=+0.146956060 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 01 09:41:53 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:41:53 np0005604215.localdomain podman[281119]: 2026-02-01 09:41:53.934754156 +0000 UTC m=+0.144414850 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:41:54 np0005604215.localdomain podman[281119]: 2026-02-01 09:41:54.015015573 +0000 UTC m=+0.224676287 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:41:54 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:41:54 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e82 _set_new_cache_sizes cache_size:1020054730 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:41:54 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.0 (monmap changed)...
Feb 01 09:41:54 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:41:54 np0005604215.localdomain ceph-mon[278949]: from='client.34172 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604209.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 01 09:41:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: Removed label mgr from host np0005604209.localdomain
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.3 (monmap changed)...
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: from='client.26801 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604209.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: Removed label _admin from host np0005604209.localdomain
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:55 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:41:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:58 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604213 (monmap changed)...
Feb 01 09:41:58 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 01 09:41:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:41:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:41:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:58 np0005604215.localdomain sudo[281165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:41:58 np0005604215.localdomain sudo[281165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:58 np0005604215.localdomain sudo[281165]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:58 np0005604215.localdomain sudo[281183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:58 np0005604215.localdomain sudo[281183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:58 np0005604215.localdomain podman[281219]: 
Feb 01 09:41:59 np0005604215.localdomain podman[281219]: 2026-02-01 09:41:58.999957819 +0000 UTC m=+0.077076959 container create b3b9e8e5daa73087f0d73108a5e6374b4903cd8b5e28492b78f720f845b44ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_banach, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, RELEASE=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, release=1764794109, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:41:59 np0005604215.localdomain systemd[1]: Started libpod-conmon-b3b9e8e5daa73087f0d73108a5e6374b4903cd8b5e28492b78f720f845b44ce5.scope.
Feb 01 09:41:59 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:41:59 np0005604215.localdomain podman[281219]: 2026-02-01 09:41:58.968161921 +0000 UTC m=+0.045281091 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:41:59 np0005604215.localdomain podman[281219]: 2026-02-01 09:41:59.071353537 +0000 UTC m=+0.148472677 container init b3b9e8e5daa73087f0d73108a5e6374b4903cd8b5e28492b78f720f845b44ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_banach, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, release=1764794109, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, build-date=2025-12-08T17:28:53Z, vcs-type=git, io.openshift.expose-services=, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc.)
Feb 01 09:41:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:41:59 np0005604215.localdomain podman[281219]: 2026-02-01 09:41:59.082122605 +0000 UTC m=+0.159241745 container start b3b9e8e5daa73087f0d73108a5e6374b4903cd8b5e28492b78f720f845b44ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_banach, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, RELEASE=main, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.buildah.version=1.41.4, version=7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, release=1764794109)
Feb 01 09:41:59 np0005604215.localdomain podman[281219]: 2026-02-01 09:41:59.082372873 +0000 UTC m=+0.159492083 container attach b3b9e8e5daa73087f0d73108a5e6374b4903cd8b5e28492b78f720f845b44ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_banach, RELEASE=main, vendor=Red Hat, Inc., io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, ceph=True, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, distribution-scope=public, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container)
Feb 01 09:41:59 np0005604215.localdomain distracted_banach[281234]: 167 167
Feb 01 09:41:59 np0005604215.localdomain systemd[1]: libpod-b3b9e8e5daa73087f0d73108a5e6374b4903cd8b5e28492b78f720f845b44ce5.scope: Deactivated successfully.
Feb 01 09:41:59 np0005604215.localdomain podman[281219]: 2026-02-01 09:41:59.088221386 +0000 UTC m=+0.165340556 container died b3b9e8e5daa73087f0d73108a5e6374b4903cd8b5e28492b78f720f845b44ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_banach, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, name=rhceph, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, distribution-scope=public, RELEASE=main, vcs-type=git, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:41:59 np0005604215.localdomain podman[281239]: 2026-02-01 09:41:59.183869846 +0000 UTC m=+0.082993133 container remove b3b9e8e5daa73087f0d73108a5e6374b4903cd8b5e28492b78f720f845b44ce5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=distracted_banach, ceph=True, CEPH_POINT_RELEASE=, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, name=rhceph, distribution-scope=public, release=1764794109, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_BRANCH=main, version=7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:41:59 np0005604215.localdomain systemd[1]: libpod-conmon-b3b9e8e5daa73087f0d73108a5e6374b4903cd8b5e28492b78f720f845b44ce5.scope: Deactivated successfully.
Feb 01 09:41:59 np0005604215.localdomain sudo[281183]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:59 np0005604215.localdomain sudo[281255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:41:59 np0005604215.localdomain sudo[281255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:59 np0005604215.localdomain sudo[281255]: pam_unix(sudo:session): session closed for user root
Feb 01 09:41:59 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 01 09:41:59 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 01 09:41:59 np0005604215.localdomain ceph-mon[278949]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:41:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:41:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 01 09:41:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:41:59 np0005604215.localdomain sudo[281273]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:41:59 np0005604215.localdomain sudo[281273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:41:59 np0005604215.localdomain podman[281309]: 
Feb 01 09:41:59 np0005604215.localdomain podman[281309]: 2026-02-01 09:41:59.923268376 +0000 UTC m=+0.073503936 container create 2d8b18ccf3d06487dc33a3343896bb27f193e5e719609cccb6ad1e9537ab8508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_grothendieck, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-type=git, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., version=7, io.openshift.expose-services=, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True)
Feb 01 09:41:59 np0005604215.localdomain systemd[1]: Started libpod-conmon-2d8b18ccf3d06487dc33a3343896bb27f193e5e719609cccb6ad1e9537ab8508.scope.
Feb 01 09:41:59 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:41:59 np0005604215.localdomain podman[281309]: 2026-02-01 09:41:59.982676399 +0000 UTC m=+0.132911949 container init 2d8b18ccf3d06487dc33a3343896bb27f193e5e719609cccb6ad1e9537ab8508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_grothendieck, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, distribution-scope=public, name=rhceph, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:41:59 np0005604215.localdomain podman[281309]: 2026-02-01 09:41:59.991592818 +0000 UTC m=+0.141828398 container start 2d8b18ccf3d06487dc33a3343896bb27f193e5e719609cccb6ad1e9537ab8508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_grothendieck, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, architecture=x86_64, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z)
Feb 01 09:41:59 np0005604215.localdomain podman[281309]: 2026-02-01 09:41:59.991854587 +0000 UTC m=+0.142090187 container attach 2d8b18ccf3d06487dc33a3343896bb27f193e5e719609cccb6ad1e9537ab8508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_grothendieck, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vendor=Red Hat, Inc., RELEASE=main, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, version=7, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, release=1764794109)
Feb 01 09:41:59 np0005604215.localdomain sleepy_grothendieck[281324]: 167 167
Feb 01 09:41:59 np0005604215.localdomain podman[281309]: 2026-02-01 09:41:59.894155993 +0000 UTC m=+0.044391573 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:41:59 np0005604215.localdomain systemd[1]: libpod-2d8b18ccf3d06487dc33a3343896bb27f193e5e719609cccb6ad1e9537ab8508.scope: Deactivated successfully.
Feb 01 09:41:59 np0005604215.localdomain podman[281309]: 2026-02-01 09:41:59.995230583 +0000 UTC m=+0.145466163 container died 2d8b18ccf3d06487dc33a3343896bb27f193e5e719609cccb6ad1e9537ab8508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_grothendieck, build-date=2025-12-08T17:28:53Z, distribution-scope=public, version=7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container)
Feb 01 09:42:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-46a77e1510c16e43830f36acb27cc3f318464f3519063eb1bd7052dd15448178-merged.mount: Deactivated successfully.
Feb 01 09:42:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:42:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:42:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:42:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155373 "" "Go-http-client/1.1"
Feb 01 09:42:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:42:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18100 "" "Go-http-client/1.1"
Feb 01 09:42:00 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-6674a12f290f9201b921bd86a590aff441f2e681786a5a1d3a45e45e7982e135-merged.mount: Deactivated successfully.
Feb 01 09:42:00 np0005604215.localdomain podman[281329]: 2026-02-01 09:42:00.152120724 +0000 UTC m=+0.144228075 container remove 2d8b18ccf3d06487dc33a3343896bb27f193e5e719609cccb6ad1e9537ab8508 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_grothendieck, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, com.redhat.component=rhceph-container, name=rhceph, distribution-scope=public, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7)
Feb 01 09:42:00 np0005604215.localdomain systemd[1]: libpod-conmon-2d8b18ccf3d06487dc33a3343896bb27f193e5e719609cccb6ad1e9537ab8508.scope: Deactivated successfully.
Feb 01 09:42:00 np0005604215.localdomain sudo[281273]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:00 np0005604215.localdomain sudo[281351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:00 np0005604215.localdomain sudo[281351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:00 np0005604215.localdomain sudo[281351]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:00 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.2 (monmap changed)...
Feb 01 09:42:00 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 01 09:42:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 01 09:42:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:00 np0005604215.localdomain sudo[281369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:00 np0005604215.localdomain sudo[281369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:00 np0005604215.localdomain podman[281404]: 
Feb 01 09:42:00 np0005604215.localdomain podman[281404]: 2026-02-01 09:42:00.9676216 +0000 UTC m=+0.068845270 container create 71e58449e65df3eb97ebfaf25bf91b2176b9a2c7800d81b31e7b41ef229e86c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_edison, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, name=rhceph, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:42:01 np0005604215.localdomain systemd[1]: Started libpod-conmon-71e58449e65df3eb97ebfaf25bf91b2176b9a2c7800d81b31e7b41ef229e86c3.scope.
Feb 01 09:42:01 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:42:01 np0005604215.localdomain podman[281404]: 2026-02-01 09:42:01.032535016 +0000 UTC m=+0.133758676 container init 71e58449e65df3eb97ebfaf25bf91b2176b9a2c7800d81b31e7b41ef229e86c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_edison, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, architecture=x86_64, vendor=Red Hat, Inc., ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, release=1764794109, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public)
Feb 01 09:42:01 np0005604215.localdomain podman[281404]: 2026-02-01 09:42:00.937028201 +0000 UTC m=+0.038251941 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:42:01 np0005604215.localdomain podman[281404]: 2026-02-01 09:42:01.043119128 +0000 UTC m=+0.144342788 container start 71e58449e65df3eb97ebfaf25bf91b2176b9a2c7800d81b31e7b41ef229e86c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_edison, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, name=rhceph, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, release=1764794109, GIT_BRANCH=main, ceph=True, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7)
Feb 01 09:42:01 np0005604215.localdomain podman[281404]: 2026-02-01 09:42:01.043374196 +0000 UTC m=+0.144597856 container attach 71e58449e65df3eb97ebfaf25bf91b2176b9a2c7800d81b31e7b41ef229e86c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_edison, io.buildah.version=1.41.4, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, distribution-scope=public, name=rhceph, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, architecture=x86_64)
Feb 01 09:42:01 np0005604215.localdomain unruffled_edison[281419]: 167 167
Feb 01 09:42:01 np0005604215.localdomain systemd[1]: libpod-71e58449e65df3eb97ebfaf25bf91b2176b9a2c7800d81b31e7b41ef229e86c3.scope: Deactivated successfully.
Feb 01 09:42:01 np0005604215.localdomain podman[281404]: 2026-02-01 09:42:01.047191266 +0000 UTC m=+0.148414956 container died 71e58449e65df3eb97ebfaf25bf91b2176b9a2c7800d81b31e7b41ef229e86c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_edison, distribution-scope=public, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.buildah.version=1.41.4, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, GIT_CLEAN=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:42:01 np0005604215.localdomain podman[281424]: 2026-02-01 09:42:01.143049753 +0000 UTC m=+0.084103290 container remove 71e58449e65df3eb97ebfaf25bf91b2176b9a2c7800d81b31e7b41ef229e86c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_edison, architecture=x86_64, io.buildah.version=1.41.4, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, GIT_BRANCH=main, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, version=7, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph)
Feb 01 09:42:01 np0005604215.localdomain systemd[1]: libpod-conmon-71e58449e65df3eb97ebfaf25bf91b2176b9a2c7800d81b31e7b41ef229e86c3.scope: Deactivated successfully.
Feb 01 09:42:01 np0005604215.localdomain sudo[281369]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:01 np0005604215.localdomain sudo[281448]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:01 np0005604215.localdomain sudo[281448]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:01 np0005604215.localdomain sudo[281448]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:01 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.5 (monmap changed)...
Feb 01 09:42:01 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 01 09:42:01 np0005604215.localdomain ceph-mon[278949]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:01 np0005604215.localdomain sudo[281466]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:01 np0005604215.localdomain sudo[281466]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:42:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:42:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:42:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:42:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:42:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:42:01 np0005604215.localdomain podman[281503]: 
Feb 01 09:42:01 np0005604215.localdomain podman[281503]: 2026-02-01 09:42:01.993614279 +0000 UTC m=+0.076650495 container create 108785bf21bd35833b987e91acec7233c8291f086b547ed513d99118a6607434 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_dirac, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, io.buildah.version=1.41.4, release=1764794109, ceph=True, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.openshift.expose-services=)
Feb 01 09:42:02 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0ebf2d8067f95cd436c99c0f920b1b3492f01548588d943565faf7dc56cd8af5-merged.mount: Deactivated successfully.
Feb 01 09:42:02 np0005604215.localdomain systemd[1]: Started libpod-conmon-108785bf21bd35833b987e91acec7233c8291f086b547ed513d99118a6607434.scope.
Feb 01 09:42:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:42:02 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:42:02 np0005604215.localdomain podman[281503]: 2026-02-01 09:42:01.95986022 +0000 UTC m=+0.042896506 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:42:02 np0005604215.localdomain podman[281503]: 2026-02-01 09:42:02.061330072 +0000 UTC m=+0.144366288 container init 108785bf21bd35833b987e91acec7233c8291f086b547ed513d99118a6607434 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_dirac, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., RELEASE=main, name=rhceph, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, ceph=True, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:42:02 np0005604215.localdomain podman[281503]: 2026-02-01 09:42:02.071328286 +0000 UTC m=+0.154364502 container start 108785bf21bd35833b987e91acec7233c8291f086b547ed513d99118a6607434 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_dirac, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, ceph=True, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:42:02 np0005604215.localdomain podman[281503]: 2026-02-01 09:42:02.071516172 +0000 UTC m=+0.154552388 container attach 108785bf21bd35833b987e91acec7233c8291f086b547ed513d99118a6607434 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_dirac, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_CLEAN=True, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, RELEASE=main, build-date=2025-12-08T17:28:53Z, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.openshift.expose-services=, ceph=True)
Feb 01 09:42:02 np0005604215.localdomain hopeful_dirac[281518]: 167 167
Feb 01 09:42:02 np0005604215.localdomain podman[281503]: 2026-02-01 09:42:02.076713845 +0000 UTC m=+0.159750091 container died 108785bf21bd35833b987e91acec7233c8291f086b547ed513d99118a6607434 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_dirac, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, vcs-type=git, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7)
Feb 01 09:42:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:42:02 np0005604215.localdomain systemd[1]: libpod-108785bf21bd35833b987e91acec7233c8291f086b547ed513d99118a6607434.scope: Deactivated successfully.
Feb 01 09:42:02 np0005604215.localdomain podman[281519]: 2026-02-01 09:42:02.126006751 +0000 UTC m=+0.081108385 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., container_name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, name=ubi9/ubi-minimal, version=9.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9)
Feb 01 09:42:02 np0005604215.localdomain podman[281533]: 2026-02-01 09:42:02.181555143 +0000 UTC m=+0.091823621 container remove 108785bf21bd35833b987e91acec7233c8291f086b547ed513d99118a6607434 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hopeful_dirac, ceph=True, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, version=7, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, io.openshift.tags=rhceph ceph, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:42:02 np0005604215.localdomain systemd[1]: libpod-conmon-108785bf21bd35833b987e91acec7233c8291f086b547ed513d99118a6607434.scope: Deactivated successfully.
Feb 01 09:42:02 np0005604215.localdomain podman[281519]: 2026-02-01 09:42:02.22132324 +0000 UTC m=+0.176424944 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, io.buildah.version=1.33.7, release=1769056855, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 01 09:42:02 np0005604215.localdomain podman[281535]: 2026-02-01 09:42:02.23373935 +0000 UTC m=+0.138246886 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 01 09:42:02 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:42:02 np0005604215.localdomain podman[281535]: 2026-02-01 09:42:02.244659653 +0000 UTC m=+0.149167239 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 01 09:42:02 np0005604215.localdomain sudo[281466]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:02 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:42:02 np0005604215.localdomain sudo[281575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:02 np0005604215.localdomain sudo[281575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:02 np0005604215.localdomain sudo[281575]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:02 np0005604215.localdomain sudo[281593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:02 np0005604215.localdomain sudo[281593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:02 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 01 09:42:02 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 01 09:42:02 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:02 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:02 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:02 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:02 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:42:02 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:02 np0005604215.localdomain podman[281628]: 
Feb 01 09:42:02 np0005604215.localdomain podman[281628]: 2026-02-01 09:42:02.934615242 +0000 UTC m=+0.076632674 container create 841ab84f74b3b0de7505308422686f614e6815e2289162b730cc0dab5c3a1e71 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hellman, version=7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True)
Feb 01 09:42:02 np0005604215.localdomain systemd[1]: Started libpod-conmon-841ab84f74b3b0de7505308422686f614e6815e2289162b730cc0dab5c3a1e71.scope.
Feb 01 09:42:02 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:42:02 np0005604215.localdomain podman[281628]: 2026-02-01 09:42:02.992842548 +0000 UTC m=+0.134859980 container init 841ab84f74b3b0de7505308422686f614e6815e2289162b730cc0dab5c3a1e71 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hellman, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, distribution-scope=public, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-type=git, ceph=True, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 09:42:03 np0005604215.localdomain interesting_hellman[281643]: 167 167
Feb 01 09:42:03 np0005604215.localdomain podman[281628]: 2026-02-01 09:42:03.001686865 +0000 UTC m=+0.143704267 container start 841ab84f74b3b0de7505308422686f614e6815e2289162b730cc0dab5c3a1e71 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hellman, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, distribution-scope=public, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:42:03 np0005604215.localdomain systemd[1]: libpod-841ab84f74b3b0de7505308422686f614e6815e2289162b730cc0dab5c3a1e71.scope: Deactivated successfully.
Feb 01 09:42:03 np0005604215.localdomain podman[281628]: 2026-02-01 09:42:03.002317475 +0000 UTC m=+0.144334877 container attach 841ab84f74b3b0de7505308422686f614e6815e2289162b730cc0dab5c3a1e71 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hellman, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:42:03 np0005604215.localdomain podman[281628]: 2026-02-01 09:42:02.903820636 +0000 UTC m=+0.045838118 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:42:03 np0005604215.localdomain podman[281628]: 2026-02-01 09:42:03.004598737 +0000 UTC m=+0.146616149 container died 841ab84f74b3b0de7505308422686f614e6815e2289162b730cc0dab5c3a1e71 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hellman, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, name=rhceph, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-type=git, GIT_CLEAN=True, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, io.buildah.version=1.41.4, io.openshift.expose-services=, CEPH_POINT_RELEASE=)
Feb 01 09:42:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c1f216f62cdb19cadf090a54847194e48cb113d5fef9fbfb597031b2e739ef86-merged.mount: Deactivated successfully.
Feb 01 09:42:03 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-15da0d3557ccd159f916e75820219bc59467909e0f5842b3e4ca825e652ea28b-merged.mount: Deactivated successfully.
Feb 01 09:42:03 np0005604215.localdomain podman[281648]: 2026-02-01 09:42:03.092565205 +0000 UTC m=+0.081119195 container remove 841ab84f74b3b0de7505308422686f614e6815e2289162b730cc0dab5c3a1e71 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_hellman, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, com.redhat.component=rhceph-container, RELEASE=main, build-date=2025-12-08T17:28:53Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., name=rhceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, vcs-type=git)
Feb 01 09:42:03 np0005604215.localdomain systemd[1]: libpod-conmon-841ab84f74b3b0de7505308422686f614e6815e2289162b730cc0dab5c3a1e71.scope: Deactivated successfully.
Feb 01 09:42:03 np0005604215.localdomain sudo[281593]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:03 np0005604215.localdomain sudo[281665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:03 np0005604215.localdomain sudo[281665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:03 np0005604215.localdomain sudo[281665]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:03 np0005604215.localdomain sudo[281683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:03 np0005604215.localdomain sudo[281683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:03 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)...
Feb 01 09:42:03 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain
Feb 01 09:42:03 np0005604215.localdomain ceph-mon[278949]: pgmap v20: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:42:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:42:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:03 np0005604215.localdomain podman[281717]: 
Feb 01 09:42:03 np0005604215.localdomain podman[281717]: 2026-02-01 09:42:03.793680395 +0000 UTC m=+0.080256388 container create 9cec727ba05020d1431722bb5825354416bbd11e3f3bbdbaee93d7477c23e401 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mendeleev, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:42:03 np0005604215.localdomain systemd[1]: Started libpod-conmon-9cec727ba05020d1431722bb5825354416bbd11e3f3bbdbaee93d7477c23e401.scope.
Feb 01 09:42:03 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:42:03 np0005604215.localdomain podman[281717]: 2026-02-01 09:42:03.854800582 +0000 UTC m=+0.141376545 container init 9cec727ba05020d1431722bb5825354416bbd11e3f3bbdbaee93d7477c23e401 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mendeleev, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, release=1764794109, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:42:03 np0005604215.localdomain podman[281717]: 2026-02-01 09:42:03.762385244 +0000 UTC m=+0.048961247 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:42:03 np0005604215.localdomain podman[281717]: 2026-02-01 09:42:03.863789254 +0000 UTC m=+0.150365237 container start 9cec727ba05020d1431722bb5825354416bbd11e3f3bbdbaee93d7477c23e401 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mendeleev, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, version=7, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, RELEASE=main, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, vendor=Red Hat, Inc., GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109)
Feb 01 09:42:03 np0005604215.localdomain podman[281717]: 2026-02-01 09:42:03.864049982 +0000 UTC m=+0.150625945 container attach 9cec727ba05020d1431722bb5825354416bbd11e3f3bbdbaee93d7477c23e401 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mendeleev, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, version=7, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, RELEASE=main, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_BRANCH=main)
Feb 01 09:42:03 np0005604215.localdomain boring_mendeleev[281732]: 167 167
Feb 01 09:42:03 np0005604215.localdomain systemd[1]: libpod-9cec727ba05020d1431722bb5825354416bbd11e3f3bbdbaee93d7477c23e401.scope: Deactivated successfully.
Feb 01 09:42:03 np0005604215.localdomain podman[281717]: 2026-02-01 09:42:03.870633708 +0000 UTC m=+0.157209711 container died 9cec727ba05020d1431722bb5825354416bbd11e3f3bbdbaee93d7477c23e401 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mendeleev, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, version=7, RELEASE=main, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, architecture=x86_64, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:42:03 np0005604215.localdomain podman[281737]: 2026-02-01 09:42:03.963027686 +0000 UTC m=+0.084944235 container remove 9cec727ba05020d1431722bb5825354416bbd11e3f3bbdbaee93d7477c23e401 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_mendeleev, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, distribution-scope=public, name=rhceph, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:42:03 np0005604215.localdomain systemd[1]: libpod-conmon-9cec727ba05020d1431722bb5825354416bbd11e3f3bbdbaee93d7477c23e401.scope: Deactivated successfully.
Feb 01 09:42:04 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-fd7138d88f859902d472c0727c0dcec9e804da4e7fdf15a3be4a2442575f3842-merged.mount: Deactivated successfully.
Feb 01 09:42:04 np0005604215.localdomain sudo[281683]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:42:04 np0005604215.localdomain sudo[281754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:04 np0005604215.localdomain sudo[281754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:04 np0005604215.localdomain sudo[281754]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:04 np0005604215.localdomain sudo[281772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ceph-volume --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e -- inventory --format=json-pretty --filter-for-batch
Feb 01 09:42:04 np0005604215.localdomain sudo[281772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:04 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604215 (monmap changed)...
Feb 01 09:42:04 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604215 on np0005604215.localdomain
Feb 01 09:42:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:04 np0005604215.localdomain podman[281830]: 
Feb 01 09:42:04 np0005604215.localdomain podman[281830]: 2026-02-01 09:42:04.806423168 +0000 UTC m=+0.072188966 container create 8220110df24105aca26072148fe992cfe6b42290b9d9909bc4da026476d0bcb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_hertz, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1764794109, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, architecture=x86_64, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:42:04 np0005604215.localdomain systemd[1]: Started libpod-conmon-8220110df24105aca26072148fe992cfe6b42290b9d9909bc4da026476d0bcb9.scope.
Feb 01 09:42:04 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:42:04 np0005604215.localdomain podman[281830]: 2026-02-01 09:42:04.867643238 +0000 UTC m=+0.133409036 container init 8220110df24105aca26072148fe992cfe6b42290b9d9909bc4da026476d0bcb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_hertz, distribution-scope=public, io.buildah.version=1.41.4, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, version=7, architecture=x86_64, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, name=rhceph, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:42:04 np0005604215.localdomain podman[281830]: 2026-02-01 09:42:04.776626934 +0000 UTC m=+0.042392792 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:42:04 np0005604215.localdomain podman[281830]: 2026-02-01 09:42:04.880973006 +0000 UTC m=+0.146738804 container start 8220110df24105aca26072148fe992cfe6b42290b9d9909bc4da026476d0bcb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_hertz, distribution-scope=public, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, release=1764794109, com.redhat.component=rhceph-container, ceph=True, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, version=7)
Feb 01 09:42:04 np0005604215.localdomain podman[281830]: 2026-02-01 09:42:04.881491772 +0000 UTC m=+0.147257580 container attach 8220110df24105aca26072148fe992cfe6b42290b9d9909bc4da026476d0bcb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_hertz, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:42:04 np0005604215.localdomain intelligent_hertz[281845]: 167 167
Feb 01 09:42:04 np0005604215.localdomain systemd[1]: libpod-8220110df24105aca26072148fe992cfe6b42290b9d9909bc4da026476d0bcb9.scope: Deactivated successfully.
Feb 01 09:42:04 np0005604215.localdomain podman[281830]: 2026-02-01 09:42:04.885346433 +0000 UTC m=+0.151112231 container died 8220110df24105aca26072148fe992cfe6b42290b9d9909bc4da026476d0bcb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_hertz, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, RELEASE=main, vendor=Red Hat, Inc., distribution-scope=public, GIT_CLEAN=True, version=7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, name=rhceph, ceph=True, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:42:04 np0005604215.localdomain podman[281850]: 2026-02-01 09:42:04.973689384 +0000 UTC m=+0.079317738 container remove 8220110df24105aca26072148fe992cfe6b42290b9d9909bc4da026476d0bcb9 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=intelligent_hertz, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, vcs-type=git, name=rhceph, version=7, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, ceph=True, description=Red Hat Ceph Storage 7)
Feb 01 09:42:04 np0005604215.localdomain systemd[1]: libpod-conmon-8220110df24105aca26072148fe992cfe6b42290b9d9909bc4da026476d0bcb9.scope: Deactivated successfully.
Feb 01 09:42:05 np0005604215.localdomain systemd[1]: tmp-crun.ysYJMn.mount: Deactivated successfully.
Feb 01 09:42:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-060c0c0947e7821dcdb0bccdb08858632da93356492bf9811bae79228f2e5d5c-merged.mount: Deactivated successfully.
Feb 01 09:42:05 np0005604215.localdomain podman[281872]: 
Feb 01 09:42:05 np0005604215.localdomain podman[281872]: 2026-02-01 09:42:05.19197038 +0000 UTC m=+0.077840132 container create e6325582f06e898ac32fb4a8cc3bab54315d9270b3cf4c91c489ff48572ef9e7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_carver, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, vcs-type=git, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=1764794109, ceph=True, distribution-scope=public, com.redhat.component=rhceph-container, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:42:05 np0005604215.localdomain systemd[1]: Started libpod-conmon-e6325582f06e898ac32fb4a8cc3bab54315d9270b3cf4c91c489ff48572ef9e7.scope.
Feb 01 09:42:05 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:42:05 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bb9f9684541b0544fdc9533088585cb3613da05aa104a70a2f14ab297a183cc/merged/rootfs supports timestamps until 2038 (0x7fffffff)
Feb 01 09:42:05 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bb9f9684541b0544fdc9533088585cb3613da05aa104a70a2f14ab297a183cc/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 09:42:05 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bb9f9684541b0544fdc9533088585cb3613da05aa104a70a2f14ab297a183cc/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 09:42:05 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6bb9f9684541b0544fdc9533088585cb3613da05aa104a70a2f14ab297a183cc/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 09:42:05 np0005604215.localdomain podman[281872]: 2026-02-01 09:42:05.254920084 +0000 UTC m=+0.140789816 container init e6325582f06e898ac32fb4a8cc3bab54315d9270b3cf4c91c489ff48572ef9e7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_carver, name=rhceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, architecture=x86_64, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_CLEAN=True, version=7, vcs-type=git, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:42:05 np0005604215.localdomain podman[281872]: 2026-02-01 09:42:05.159728858 +0000 UTC m=+0.045598670 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:42:05 np0005604215.localdomain podman[281872]: 2026-02-01 09:42:05.271863915 +0000 UTC m=+0.157733677 container start e6325582f06e898ac32fb4a8cc3bab54315d9270b3cf4c91c489ff48572ef9e7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_carver, release=1764794109, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-type=git, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, io.buildah.version=1.41.4, version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:42:05 np0005604215.localdomain podman[281872]: 2026-02-01 09:42:05.272222436 +0000 UTC m=+0.158092188 container attach e6325582f06e898ac32fb4a8cc3bab54315d9270b3cf4c91c489ff48572ef9e7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_carver, release=1764794109, io.buildah.version=1.41.4, name=rhceph, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7)
Feb 01 09:42:05 np0005604215.localdomain ceph-mon[278949]: pgmap v21: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:06 np0005604215.localdomain systemd[1]: tmp-crun.eEe5vm.mount: Deactivated successfully.
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]: [
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:     {
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:         "available": false,
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:         "ceph_device": false,
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:         "device_id": "QEMU_DVD-ROM_QM00001",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:         "lsm_data": {},
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:         "lvs": [],
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:         "path": "/dev/sr0",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:         "rejected_reasons": [
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "Insufficient space (<5GB)",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "Has a FileSystem"
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:         ],
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:         "sys_api": {
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "actuators": null,
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "device_nodes": "sr0",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "human_readable_size": "482.00 KB",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "id_bus": "ata",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "model": "QEMU DVD-ROM",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "nr_requests": "2",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "partitions": {},
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "path": "/dev/sr0",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "removable": "1",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "rev": "2.5+",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "ro": "0",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "rotational": "1",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "sas_address": "",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "sas_device_handle": "",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "scheduler_mode": "mq-deadline",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "sectors": 0,
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "sectorsize": "2048",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "size": 493568.0,
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "support_discard": "0",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "type": "disk",
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:             "vendor": "QEMU"
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:         }
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]:     }
Feb 01 09:42:06 np0005604215.localdomain great_carver[281887]: ]
Feb 01 09:42:06 np0005604215.localdomain systemd[1]: libpod-e6325582f06e898ac32fb4a8cc3bab54315d9270b3cf4c91c489ff48572ef9e7.scope: Deactivated successfully.
Feb 01 09:42:06 np0005604215.localdomain podman[281872]: 2026-02-01 09:42:06.210153733 +0000 UTC m=+1.096023535 container died e6325582f06e898ac32fb4a8cc3bab54315d9270b3cf4c91c489ff48572ef9e7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_carver, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.41.4, GIT_CLEAN=True, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., release=1764794109, ceph=True, version=7, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, distribution-scope=public)
Feb 01 09:42:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-6bb9f9684541b0544fdc9533088585cb3613da05aa104a70a2f14ab297a183cc-merged.mount: Deactivated successfully.
Feb 01 09:42:06 np0005604215.localdomain podman[283544]: 2026-02-01 09:42:06.290447602 +0000 UTC m=+0.072698621 container remove e6325582f06e898ac32fb4a8cc3bab54315d9270b3cf4c91c489ff48572ef9e7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=great_carver, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git)
Feb 01 09:42:06 np0005604215.localdomain systemd[1]: libpod-conmon-e6325582f06e898ac32fb4a8cc3bab54315d9270b3cf4c91c489ff48572ef9e7.scope: Deactivated successfully.
Feb 01 09:42:06 np0005604215.localdomain sudo[281772]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:06 np0005604215.localdomain sudo[283559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:42:06 np0005604215.localdomain sudo[283559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:06 np0005604215.localdomain sudo[283559]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:06 np0005604215.localdomain sudo[283577]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:42:06 np0005604215.localdomain sudo[283577]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:06 np0005604215.localdomain sudo[283577]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:06 np0005604215.localdomain sudo[283595]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:06 np0005604215.localdomain sudo[283595]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:06 np0005604215.localdomain sudo[283595]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain sudo[283613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:07 np0005604215.localdomain sudo[283613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283613]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain sudo[283631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:07 np0005604215.localdomain sudo[283631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283631]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain sudo[283665]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:07 np0005604215.localdomain sudo[283665]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283665]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain sudo[283683]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:07 np0005604215.localdomain sudo[283683]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283683]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: pgmap v22: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:07 np0005604215.localdomain sudo[283701]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:42:07 np0005604215.localdomain sudo[283701]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283701]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain sudo[283719]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:42:07 np0005604215.localdomain sudo[283719]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283719]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain sudo[283737]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:42:07 np0005604215.localdomain sudo[283737]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283737]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain sudo[283755]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:07 np0005604215.localdomain sudo[283755]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283755]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain sudo[283773]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:07 np0005604215.localdomain sudo[283773]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283773]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain sudo[283791]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:07 np0005604215.localdomain sudo[283791]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283791]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain sudo[283825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:07 np0005604215.localdomain sudo[283825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283825]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:07 np0005604215.localdomain sudo[283843]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:07 np0005604215.localdomain sudo[283843]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:07 np0005604215.localdomain sudo[283843]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:08 np0005604215.localdomain sudo[283861]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain sudo[283861]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:08 np0005604215.localdomain sudo[283861]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Removing np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Removing np0005604209.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='client.34208 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005604209.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Added label _no_schedule to host np0005604209.localdomain
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Removing np0005604209.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005604209.localdomain
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:42:09 np0005604215.localdomain ceph-mon[278949]: from='client.34187 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005604209.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:42:09 np0005604215.localdomain ceph-mon[278949]: Removing daemon crash.np0005604209 from np0005604209.localdomain -- ports []
Feb 01 09:42:09 np0005604215.localdomain ceph-mon[278949]: pgmap v23: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain"} : dispatch
Feb 01 09:42:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain"} : dispatch
Feb 01 09:42:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain"}]': finished
Feb 01 09:42:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:42:09 np0005604215.localdomain podman[283879]: 2026-02-01 09:42:09.867342305 +0000 UTC m=+0.079135921 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 01 09:42:09 np0005604215.localdomain podman[283879]: 2026-02-01 09:42:09.876674628 +0000 UTC m=+0.088468244 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 01 09:42:09 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:42:10 np0005604215.localdomain sudo[283898]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:42:10 np0005604215.localdomain sudo[283898]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:10 np0005604215.localdomain sudo[283898]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:10 np0005604215.localdomain sudo[283916]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:42:10 np0005604215.localdomain sudo[283916]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:10 np0005604215.localdomain sudo[283916]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='client.34218 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005604209.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: Removed host np0005604209.localdomain
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: Removing key for client.crash.np0005604209.localdomain
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth rm", "entity": "client.crash.np0005604209.localdomain"} : dispatch
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth rm", "entity": "client.crash.np0005604209.localdomain"} : dispatch
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd='[{"prefix": "auth rm", "entity": "client.crash.np0005604209.localdomain"}]': finished
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:12 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604210 (monmap changed)...
Feb 01 09:42:12 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604210 on np0005604210.localdomain
Feb 01 09:42:12 np0005604215.localdomain ceph-mon[278949]: pgmap v24: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:12 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604210 (monmap changed)...
Feb 01 09:42:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:42:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:42:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:12 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604210 on np0005604210.localdomain
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)...
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: pgmap v25: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:42:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:14 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:42:14 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604211 (monmap changed)...
Feb 01 09:42:14 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain
Feb 01 09:42:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:42:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:15 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 01 09:42:15 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 01 09:42:15 np0005604215.localdomain ceph-mon[278949]: pgmap v26: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:16 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 01 09:42:16 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 01 09:42:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:42:16 np0005604215.localdomain podman[283934]: 2026-02-01 09:42:16.876028661 +0000 UTC m=+0.091085058 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:42:16 np0005604215.localdomain podman[283934]: 2026-02-01 09:42:16.886718405 +0000 UTC m=+0.101774852 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:42:16 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:42:17 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 01 09:42:17 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 01 09:42:17 np0005604215.localdomain ceph-mon[278949]: pgmap v27: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:42:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:18 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.1 (monmap changed)...
Feb 01 09:42:18 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:42:18 np0005604215.localdomain ceph-mon[278949]: from='client.26816 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:42:18 np0005604215.localdomain ceph-mon[278949]: Saving service mon spec with placement label:mon
Feb 01 09:42:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:42:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:42:19 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.4 (monmap changed)...
Feb 01 09:42:19 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:42:19 np0005604215.localdomain ceph-mon[278949]: pgmap v28: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:20 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55c2f70f91e0 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0
Feb 01 09:42:20 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:42:20 np0005604215.localdomain ceph-mon[278949]: paxos.2).electionLogic(34) init, last seen epoch 34
Feb 01 09:42:20 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:42:20 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:42:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:42:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:42:24 np0005604215.localdomain podman[283958]: 2026-02-01 09:42:24.865845559 +0000 UTC m=+0.076240562 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:42:24 np0005604215.localdomain podman[283958]: 2026-02-01 09:42:24.87672201 +0000 UTC m=+0.087116973 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:42:24 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:42:24 np0005604215.localdomain podman[283957]: 2026-02-01 09:42:24.916990603 +0000 UTC m=+0.130247557 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:42:24 np0005604215.localdomain podman[283957]: 2026-02-01 09:42:24.958008019 +0000 UTC m=+0.171264923 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 01 09:42:24 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:42:25 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:42:25 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:42:25 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e8 handle_timecheck drop unexpected msg
Feb 01 09:42:25 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:42:25 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e8 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: from='client.34233 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005604212"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: Remove daemons mon.np0005604212
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: Safe to remove mon.np0005604212: new quorum should be ['np0005604211', 'np0005604210', 'np0005604215', 'np0005604213'] (from ['np0005604211', 'np0005604210', 'np0005604215', 'np0005604213'])
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: Removing monitor np0005604212 from monmap...
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: Removing daemon mon.np0005604212 from np0005604212.localdomain -- ports []
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: mon.np0005604213 calling monitor election
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: mon.np0005604210 calling monitor election
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: pgmap v29: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: pgmap v30: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: pgmap v31: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:42:20.208227+0000
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604210
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005604213
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: osdmap e82: 6 total, 6 up, 6 in
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: mgrmap e19: np0005604211.cuflqz(active, since 54s), standbys: np0005604213.caiaeh, np0005604209.isqrps, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1/4 mons down, quorum np0005604211,np0005604215,np0005604213 (MON_DOWN)
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 is new leader, mons np0005604211,np0005604210,np0005604215,np0005604213 in quorum (ranks 0,1,2,3)
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: monmap epoch 8
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:42:20.208227+0000
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604210
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005604213
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: osdmap e82: 6 total, 6 up, 6 in
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: mgrmap e19: np0005604211.cuflqz(active, since 54s), standbys: np0005604213.caiaeh, np0005604209.isqrps, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: Health check cleared: MON_DOWN (was: 1/4 mons down, quorum np0005604211,np0005604215,np0005604213)
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: Cluster is now healthy
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:42:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:27 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 01 09:42:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:27 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 01 09:42:27 np0005604215.localdomain ceph-mon[278949]: pgmap v32: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 01 09:42:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:28 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.0 (monmap changed)...
Feb 01 09:42:28 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:42:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 01 09:42:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:29 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:42:29 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.3 (monmap changed)...
Feb 01 09:42:29 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:42:29 np0005604215.localdomain ceph-mon[278949]: pgmap v33: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:42:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:42:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:42:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:42:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:42:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17772 "" "Go-http-client/1.1"
Feb 01 09:42:30 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 01 09:42:30 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 01 09:42:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:42:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:31 np0005604215.localdomain sudo[284005]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:31 np0005604215.localdomain sudo[284005]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:31 np0005604215.localdomain sudo[284005]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:31 np0005604215.localdomain sudo[284023]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:31 np0005604215.localdomain sudo[284023]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:31 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 01 09:42:31 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:42:31 np0005604215.localdomain ceph-mon[278949]: pgmap v34: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:42:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:42:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:42:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:42:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:42:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:42:31 np0005604215.localdomain podman[284057]: 
Feb 01 09:42:31 np0005604215.localdomain podman[284057]: 2026-02-01 09:42:31.723707975 +0000 UTC m=+0.062231033 container create 0a58650c56f3caae4f60b33951f2267b45e7c9415d56a601068e1406c5c1e43f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_yonath, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, distribution-scope=public, GIT_BRANCH=main, vendor=Red Hat, Inc., name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:42:31 np0005604215.localdomain systemd[1]: Started libpod-conmon-0a58650c56f3caae4f60b33951f2267b45e7c9415d56a601068e1406c5c1e43f.scope.
Feb 01 09:42:31 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:42:31 np0005604215.localdomain podman[284057]: 2026-02-01 09:42:31.690143162 +0000 UTC m=+0.028666270 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:42:31 np0005604215.localdomain podman[284057]: 2026-02-01 09:42:31.791900003 +0000 UTC m=+0.130423071 container init 0a58650c56f3caae4f60b33951f2267b45e7c9415d56a601068e1406c5c1e43f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_yonath, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:42:31 np0005604215.localdomain podman[284057]: 2026-02-01 09:42:31.802050042 +0000 UTC m=+0.140573100 container start 0a58650c56f3caae4f60b33951f2267b45e7c9415d56a601068e1406c5c1e43f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_yonath, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, distribution-scope=public, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=1764794109, build-date=2025-12-08T17:28:53Z, architecture=x86_64, description=Red Hat Ceph Storage 7, vcs-type=git, version=7)
Feb 01 09:42:31 np0005604215.localdomain podman[284057]: 2026-02-01 09:42:31.802330071 +0000 UTC m=+0.140853199 container attach 0a58650c56f3caae4f60b33951f2267b45e7c9415d56a601068e1406c5c1e43f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_yonath, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:42:31 np0005604215.localdomain gallant_yonath[284073]: 167 167
Feb 01 09:42:31 np0005604215.localdomain systemd[1]: libpod-0a58650c56f3caae4f60b33951f2267b45e7c9415d56a601068e1406c5c1e43f.scope: Deactivated successfully.
Feb 01 09:42:31 np0005604215.localdomain podman[284057]: 2026-02-01 09:42:31.806068658 +0000 UTC m=+0.144591756 container died 0a58650c56f3caae4f60b33951f2267b45e7c9415d56a601068e1406c5c1e43f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_yonath, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.41.4, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.component=rhceph-container, version=7, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-type=git, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 09:42:31 np0005604215.localdomain podman[284078]: 2026-02-01 09:42:31.904263937 +0000 UTC m=+0.084359176 container remove 0a58650c56f3caae4f60b33951f2267b45e7c9415d56a601068e1406c5c1e43f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_yonath, distribution-scope=public, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, version=7, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, GIT_CLEAN=True, ceph=True, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:42:31 np0005604215.localdomain systemd[1]: libpod-conmon-0a58650c56f3caae4f60b33951f2267b45e7c9415d56a601068e1406c5c1e43f.scope: Deactivated successfully.
Feb 01 09:42:31 np0005604215.localdomain sudo[284023]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:32 np0005604215.localdomain sudo[284095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:32 np0005604215.localdomain sudo[284095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:32 np0005604215.localdomain sudo[284095]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:32 np0005604215.localdomain sudo[284113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:32 np0005604215.localdomain sudo[284113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:32 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 01 09:42:32 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 01 09:42:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 01 09:42:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:42:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:42:32 np0005604215.localdomain podman[284145]: 2026-02-01 09:42:32.626070836 +0000 UTC m=+0.082966434 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, release=1769056855, architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter)
Feb 01 09:42:32 np0005604215.localdomain podman[284159]: 
Feb 01 09:42:32 np0005604215.localdomain podman[284145]: 2026-02-01 09:42:32.663001314 +0000 UTC m=+0.119896892 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, release=1769056855, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.7, io.openshift.tags=minimal rhel9, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, managed_by=edpm_ansible)
Feb 01 09:42:32 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:42:32 np0005604215.localdomain podman[284146]: 2026-02-01 09:42:32.682929509 +0000 UTC m=+0.139566359 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 01 09:42:32 np0005604215.localdomain podman[284159]: 2026-02-01 09:42:32.698957482 +0000 UTC m=+0.127173520 container create a3b117273bf954634b68ef03a30e30725a875b7024a1a065b5d191b3ece21920 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_chatterjee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, ceph=True, architecture=x86_64, description=Red Hat Ceph Storage 7)
Feb 01 09:42:32 np0005604215.localdomain podman[284146]: 2026-02-01 09:42:32.717666569 +0000 UTC m=+0.174303469 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent)
Feb 01 09:42:32 np0005604215.localdomain podman[284159]: 2026-02-01 09:42:32.61921527 +0000 UTC m=+0.047431298 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:42:32 np0005604215.localdomain systemd[1]: Started libpod-conmon-a3b117273bf954634b68ef03a30e30725a875b7024a1a065b5d191b3ece21920.scope.
Feb 01 09:42:32 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-abff02312b73c76c268b85fdd873e95c4cf7adfb7bdee37794fbb9c2a7e7d65d-merged.mount: Deactivated successfully.
Feb 01 09:42:32 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:42:32 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:42:32 np0005604215.localdomain podman[284159]: 2026-02-01 09:42:32.756512706 +0000 UTC m=+0.184728734 container init a3b117273bf954634b68ef03a30e30725a875b7024a1a065b5d191b3ece21920 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_chatterjee, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, vendor=Red Hat, Inc., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, build-date=2025-12-08T17:28:53Z, vcs-type=git, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:42:32 np0005604215.localdomain podman[284159]: 2026-02-01 09:42:32.76648354 +0000 UTC m=+0.194699568 container start a3b117273bf954634b68ef03a30e30725a875b7024a1a065b5d191b3ece21920 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_chatterjee, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, architecture=x86_64, RELEASE=main, ceph=True, CEPH_POINT_RELEASE=, name=rhceph, vendor=Red Hat, Inc., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109)
Feb 01 09:42:32 np0005604215.localdomain podman[284159]: 2026-02-01 09:42:32.766751618 +0000 UTC m=+0.194967686 container attach a3b117273bf954634b68ef03a30e30725a875b7024a1a065b5d191b3ece21920 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_chatterjee, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, distribution-scope=public, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, release=1764794109, build-date=2025-12-08T17:28:53Z, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:42:32 np0005604215.localdomain infallible_chatterjee[284199]: 167 167
Feb 01 09:42:32 np0005604215.localdomain systemd[1]: libpod-a3b117273bf954634b68ef03a30e30725a875b7024a1a065b5d191b3ece21920.scope: Deactivated successfully.
Feb 01 09:42:32 np0005604215.localdomain podman[284159]: 2026-02-01 09:42:32.769986719 +0000 UTC m=+0.198202827 container died a3b117273bf954634b68ef03a30e30725a875b7024a1a065b5d191b3ece21920 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_chatterjee, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, CEPH_POINT_RELEASE=, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-type=git, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.buildah.version=1.41.4, vendor=Red Hat, Inc., name=rhceph, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:42:32 np0005604215.localdomain podman[284204]: 2026-02-01 09:42:32.868418656 +0000 UTC m=+0.085185462 container remove a3b117273bf954634b68ef03a30e30725a875b7024a1a065b5d191b3ece21920 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=infallible_chatterjee, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-type=git, name=rhceph, release=1764794109, architecture=x86_64, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, ceph=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True)
Feb 01 09:42:32 np0005604215.localdomain systemd[1]: libpod-conmon-a3b117273bf954634b68ef03a30e30725a875b7024a1a065b5d191b3ece21920.scope: Deactivated successfully.
Feb 01 09:42:33 np0005604215.localdomain sudo[284113]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:33 np0005604215.localdomain sudo[284227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:33 np0005604215.localdomain sudo[284227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:33 np0005604215.localdomain sudo[284227]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:33 np0005604215.localdomain sudo[284245]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:33 np0005604215.localdomain sudo[284245]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:33 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.2 (monmap changed)...
Feb 01 09:42:33 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 01 09:42:33 np0005604215.localdomain ceph-mon[278949]: pgmap v35: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 01 09:42:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:33 np0005604215.localdomain podman[284281]: 
Feb 01 09:42:33 np0005604215.localdomain podman[284281]: 2026-02-01 09:42:33.647621106 +0000 UTC m=+0.078553326 container create 6b98d98b8ce5f088db49f8c890e7d9ba7dc927375d2a32d0f9ca9c2af8bfadda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_aryabhata, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, vcs-type=git, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True)
Feb 01 09:42:33 np0005604215.localdomain systemd[1]: Started libpod-conmon-6b98d98b8ce5f088db49f8c890e7d9ba7dc927375d2a32d0f9ca9c2af8bfadda.scope.
Feb 01 09:42:33 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:42:33 np0005604215.localdomain podman[284281]: 2026-02-01 09:42:33.701844916 +0000 UTC m=+0.132777166 container init 6b98d98b8ce5f088db49f8c890e7d9ba7dc927375d2a32d0f9ca9c2af8bfadda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_aryabhata, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, name=rhceph, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, RELEASE=main, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:42:33 np0005604215.localdomain podman[284281]: 2026-02-01 09:42:33.710837398 +0000 UTC m=+0.141769638 container start 6b98d98b8ce5f088db49f8c890e7d9ba7dc927375d2a32d0f9ca9c2af8bfadda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_aryabhata, vcs-type=git, architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_BRANCH=main, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, RELEASE=main, CEPH_POINT_RELEASE=, distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:42:33 np0005604215.localdomain podman[284281]: 2026-02-01 09:42:33.711097486 +0000 UTC m=+0.142029736 container attach 6b98d98b8ce5f088db49f8c890e7d9ba7dc927375d2a32d0f9ca9c2af8bfadda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_aryabhata, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, ceph=True, name=rhceph, vcs-type=git, version=7, distribution-scope=public, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:42:33 np0005604215.localdomain bold_aryabhata[284297]: 167 167
Feb 01 09:42:33 np0005604215.localdomain systemd[1]: libpod-6b98d98b8ce5f088db49f8c890e7d9ba7dc927375d2a32d0f9ca9c2af8bfadda.scope: Deactivated successfully.
Feb 01 09:42:33 np0005604215.localdomain podman[284281]: 2026-02-01 09:42:33.714691518 +0000 UTC m=+0.145623788 container died 6b98d98b8ce5f088db49f8c890e7d9ba7dc927375d2a32d0f9ca9c2af8bfadda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_aryabhata, distribution-scope=public, name=rhceph, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:42:33 np0005604215.localdomain podman[284281]: 2026-02-01 09:42:33.61624061 +0000 UTC m=+0.047172870 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:42:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-03c0dbcb7a80c4978082c27b932267ebcdf02635aee3b5336b36a4179f48d69f-merged.mount: Deactivated successfully.
Feb 01 09:42:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e645fb20f1e7d3ad5acc388d6d4eb7d292c55183c371648e6268ce78bc15f268-merged.mount: Deactivated successfully.
Feb 01 09:42:33 np0005604215.localdomain podman[284302]: 2026-02-01 09:42:33.817769461 +0000 UTC m=+0.089788557 container remove 6b98d98b8ce5f088db49f8c890e7d9ba7dc927375d2a32d0f9ca9c2af8bfadda (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=bold_aryabhata, architecture=x86_64, name=rhceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, release=1764794109, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_CLEAN=True, vcs-type=git, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:42:33 np0005604215.localdomain systemd[1]: libpod-conmon-6b98d98b8ce5f088db49f8c890e7d9ba7dc927375d2a32d0f9ca9c2af8bfadda.scope: Deactivated successfully.
Feb 01 09:42:33 np0005604215.localdomain sudo[284245]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:42:34 np0005604215.localdomain sudo[284326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:34 np0005604215.localdomain sudo[284326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:34 np0005604215.localdomain sudo[284326]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:34 np0005604215.localdomain sudo[284344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:34 np0005604215.localdomain sudo[284344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:34 np0005604215.localdomain podman[284378]: 
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.5 (monmap changed)...
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: from='client.26829 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005604212.localdomain:172.18.0.103", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: Deploying daemon mon.np0005604212 on np0005604212.localdomain
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:34 np0005604215.localdomain podman[284378]: 2026-02-01 09:42:34.620602651 +0000 UTC m=+0.075513389 container create b5fe18c4f0eac005dd389a509e7535e87844c167e874e32612960871781fd627 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.openshift.expose-services=, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_BRANCH=main, io.buildah.version=1.41.4, GIT_CLEAN=True, com.redhat.component=rhceph-container, distribution-scope=public, ceph=True, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109)
Feb 01 09:42:34 np0005604215.localdomain systemd[1]: Started libpod-conmon-b5fe18c4f0eac005dd389a509e7535e87844c167e874e32612960871781fd627.scope.
Feb 01 09:42:34 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:42:34 np0005604215.localdomain podman[284378]: 2026-02-01 09:42:34.589707302 +0000 UTC m=+0.044618090 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:42:34 np0005604215.localdomain podman[284378]: 2026-02-01 09:42:34.694360234 +0000 UTC m=+0.149270972 container init b5fe18c4f0eac005dd389a509e7535e87844c167e874e32612960871781fd627 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, distribution-scope=public, name=rhceph, io.buildah.version=1.41.4, RELEASE=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, ceph=True, release=1764794109, vcs-type=git, architecture=x86_64, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:42:34 np0005604215.localdomain podman[284378]: 2026-02-01 09:42:34.703395217 +0000 UTC m=+0.158305945 container start b5fe18c4f0eac005dd389a509e7535e87844c167e874e32612960871781fd627 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, vcs-type=git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, CEPH_POINT_RELEASE=, distribution-scope=public, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, architecture=x86_64, RELEASE=main, GIT_BRANCH=main, ceph=True, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:42:34 np0005604215.localdomain podman[284378]: 2026-02-01 09:42:34.703618904 +0000 UTC m=+0.158529642 container attach b5fe18c4f0eac005dd389a509e7535e87844c167e874e32612960871781fd627 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.buildah.version=1.41.4, RELEASE=main, version=7, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7)
Feb 01 09:42:34 np0005604215.localdomain nifty_archimedes[284393]: 167 167
Feb 01 09:42:34 np0005604215.localdomain systemd[1]: libpod-b5fe18c4f0eac005dd389a509e7535e87844c167e874e32612960871781fd627.scope: Deactivated successfully.
Feb 01 09:42:34 np0005604215.localdomain podman[284378]: 2026-02-01 09:42:34.705770562 +0000 UTC m=+0.160681290 container died b5fe18c4f0eac005dd389a509e7535e87844c167e874e32612960871781fd627 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vendor=Red Hat, Inc., release=1764794109, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:42:34 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-3ea4fa55cda2cc6bdb831e3ada3fdfb1b8276d0cbe4a3811218ce31280003806-merged.mount: Deactivated successfully.
Feb 01 09:42:34 np0005604215.localdomain podman[284398]: 2026-02-01 09:42:34.803109345 +0000 UTC m=+0.083254633 container remove b5fe18c4f0eac005dd389a509e7535e87844c167e874e32612960871781fd627 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=nifty_archimedes, architecture=x86_64, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, name=rhceph, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main)
Feb 01 09:42:34 np0005604215.localdomain systemd[1]: libpod-conmon-b5fe18c4f0eac005dd389a509e7535e87844c167e874e32612960871781fd627.scope: Deactivated successfully.
Feb 01 09:42:34 np0005604215.localdomain sudo[284344]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:34 np0005604215.localdomain sudo[284414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:34 np0005604215.localdomain sudo[284414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:34 np0005604215.localdomain sudo[284414]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:35 np0005604215.localdomain sudo[284432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:35 np0005604215.localdomain sudo[284432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:35 np0005604215.localdomain podman[284467]: 
Feb 01 09:42:35 np0005604215.localdomain podman[284467]: 2026-02-01 09:42:35.528105433 +0000 UTC m=+0.069447619 container create 4f84bd405b091abd186f45113162ec731181873dd9ff1d6cbe8796fca860e4f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., name=rhceph, CEPH_POINT_RELEASE=, RELEASE=main, release=1764794109, com.redhat.component=rhceph-container, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:42:35 np0005604215.localdomain systemd[1]: Started libpod-conmon-4f84bd405b091abd186f45113162ec731181873dd9ff1d6cbe8796fca860e4f0.scope.
Feb 01 09:42:35 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:42:35 np0005604215.localdomain podman[284467]: 2026-02-01 09:42:35.585807182 +0000 UTC m=+0.127149358 container init 4f84bd405b091abd186f45113162ec731181873dd9ff1d6cbe8796fca860e4f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=rhceph, build-date=2025-12-08T17:28:53Z, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc.)
Feb 01 09:42:35 np0005604215.localdomain podman[284467]: 2026-02-01 09:42:35.593978649 +0000 UTC m=+0.135320845 container start 4f84bd405b091abd186f45113162ec731181873dd9ff1d6cbe8796fca860e4f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, vcs-type=git, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, architecture=x86_64, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, version=7, release=1764794109, io.buildah.version=1.41.4, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=)
Feb 01 09:42:35 np0005604215.localdomain podman[284467]: 2026-02-01 09:42:35.594273558 +0000 UTC m=+0.135615784 container attach 4f84bd405b091abd186f45113162ec731181873dd9ff1d6cbe8796fca860e4f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, io.openshift.tags=rhceph ceph, vcs-type=git, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, version=7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, ceph=True, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:42:35 np0005604215.localdomain mystifying_poitras[284482]: 167 167
Feb 01 09:42:35 np0005604215.localdomain systemd[1]: libpod-4f84bd405b091abd186f45113162ec731181873dd9ff1d6cbe8796fca860e4f0.scope: Deactivated successfully.
Feb 01 09:42:35 np0005604215.localdomain podman[284467]: 2026-02-01 09:42:35.597169859 +0000 UTC m=+0.138512075 container died 4f84bd405b091abd186f45113162ec731181873dd9ff1d6cbe8796fca860e4f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, RELEASE=main, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 09:42:35 np0005604215.localdomain podman[284467]: 2026-02-01 09:42:35.502846031 +0000 UTC m=+0.044188237 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:42:35 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 01 09:42:35 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 01 09:42:35 np0005604215.localdomain ceph-mon[278949]: pgmap v36: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:35 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/3777375696' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:42:35 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/3777375696' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:42:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:42:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:35 np0005604215.localdomain podman[284487]: 2026-02-01 09:42:35.69414036 +0000 UTC m=+0.084167000 container remove 4f84bd405b091abd186f45113162ec731181873dd9ff1d6cbe8796fca860e4f0 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=mystifying_poitras, description=Red Hat Ceph Storage 7, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., version=7, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, ceph=True, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=)
Feb 01 09:42:35 np0005604215.localdomain systemd[1]: libpod-conmon-4f84bd405b091abd186f45113162ec731181873dd9ff1d6cbe8796fca860e4f0.scope: Deactivated successfully.
Feb 01 09:42:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-dde2bd4f4d3dc4fc457a25ebfe4f99249ce6b76a09e50c621d51165220efe7eb-merged.mount: Deactivated successfully.
Feb 01 09:42:35 np0005604215.localdomain sudo[284432]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:35 np0005604215.localdomain sudo[284503]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:35 np0005604215.localdomain sudo[284503]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:35 np0005604215.localdomain sudo[284503]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:35 np0005604215.localdomain sudo[284521]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:42:35 np0005604215.localdomain sudo[284521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:36 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Feb 01 09:42:36 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Feb 01 09:42:36 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e8  adding peer [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] to list of hints
Feb 01 09:42:36 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55c2f70f9600 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0
Feb 01 09:42:36 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:42:36 np0005604215.localdomain ceph-mon[278949]: paxos.2).electionLogic(40) init, last seen epoch 40
Feb 01 09:42:36 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:42:36 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:42:36 np0005604215.localdomain sudo[284521]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:37.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:42:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:37.103 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:42:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:37.103 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:42:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:37.118 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:42:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:38.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:42:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:39.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:42:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:39.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:42:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:39.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:42:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:40.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:42:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:42:40 np0005604215.localdomain podman[284572]: 2026-02-01 09:42:40.87149436 +0000 UTC m=+0.083649925 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 01 09:42:40 np0005604215.localdomain podman[284572]: 2026-02-01 09:42:40.880039348 +0000 UTC m=+0.092194923 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 01 09:42:40 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:42:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:41.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:42:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:41.127 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:42:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:41.127 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:42:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:41.128 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:42:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:41.128 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:42:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:41.128 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(electing) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604213 calling monitor election
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604210 calling monitor election
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: pgmap v37: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604212 calling monitor election
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: pgmap v38: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: pgmap v39: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 is new leader, mons np0005604211,np0005604210,np0005604215,np0005604213,np0005604212 in quorum (ranks 0,1,2,3,4)
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: monmap epoch 9
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:42:36.306274+0000
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604210
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: 3: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005604213
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: 4: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: osdmap e82: 6 total, 6 up, 6 in
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: mgrmap e19: np0005604211.cuflqz(active, since 70s), standbys: np0005604213.caiaeh, np0005604209.isqrps, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:42:41.759 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:42:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:42:41.761 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:42:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:42:41.761 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:42:41 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1268025690' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:42:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:41.791 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.663s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.002 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.003 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12411MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.004 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.004 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:42:42 np0005604215.localdomain sudo[284613]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:42:42 np0005604215.localdomain sudo[284613]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:42 np0005604215.localdomain sudo[284613]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.403 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.404 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:42:42 np0005604215.localdomain sudo[284631]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:42:42 np0005604215.localdomain sudo[284631]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.433 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:42:42 np0005604215.localdomain sudo[284631]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:42 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/1268025690' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:42:42 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/2087940709' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:42:42 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:42 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:42:42 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:42:42 np0005604215.localdomain sudo[284650]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:42 np0005604215.localdomain sudo[284650]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:42 np0005604215.localdomain sudo[284650]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:42 np0005604215.localdomain sudo[284668]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:42 np0005604215.localdomain sudo[284668]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:42 np0005604215.localdomain sudo[284668]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:42 np0005604215.localdomain sudo[284705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:42 np0005604215.localdomain sudo[284705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:42 np0005604215.localdomain sudo[284705]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:42 np0005604215.localdomain sudo[284739]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:42 np0005604215.localdomain sudo[284739]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:42 np0005604215.localdomain sudo[284739]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.870 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.876 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:42:42 np0005604215.localdomain sudo[284757]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:42 np0005604215.localdomain sudo[284757]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:42 np0005604215.localdomain sudo[284757]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.896 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 1.0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.899 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:42:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:42.899 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.895s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:42:42 np0005604215.localdomain sudo[284777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:42:42 np0005604215.localdomain sudo[284777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:42 np0005604215.localdomain sudo[284777]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:43 np0005604215.localdomain sudo[284795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:42:43 np0005604215.localdomain sudo[284795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:43 np0005604215.localdomain sudo[284795]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:43 np0005604215.localdomain sudo[284813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:42:43 np0005604215.localdomain sudo[284813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:43 np0005604215.localdomain sudo[284813]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:43 np0005604215.localdomain sudo[284831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:43 np0005604215.localdomain sudo[284831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:43 np0005604215.localdomain sudo[284831]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:43 np0005604215.localdomain sudo[284849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:43 np0005604215.localdomain sudo[284849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:43 np0005604215.localdomain sudo[284849]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:43 np0005604215.localdomain sudo[284867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:43 np0005604215.localdomain sudo[284867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:43 np0005604215.localdomain sudo[284867]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:43 np0005604215.localdomain sudo[284901]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:43 np0005604215.localdomain sudo[284901]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:43 np0005604215.localdomain sudo[284901]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:43 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:43 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:43 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:43 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:43 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:43 np0005604215.localdomain ceph-mon[278949]: pgmap v40: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:43 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/3152701224' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:42:43 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/1887964732' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:42:43 np0005604215.localdomain sudo[284919]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:43 np0005604215.localdomain sudo[284919]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:43 np0005604215.localdomain sudo[284919]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:43 np0005604215.localdomain sudo[284937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:43 np0005604215.localdomain sudo[284937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:43 np0005604215.localdomain sudo[284937]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:43.895 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:42:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:43.895 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:42:44 np0005604215.localdomain sudo[284955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:42:44 np0005604215.localdomain sudo[284955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:44 np0005604215.localdomain sudo[284955]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:42:44.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:45 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604210 (monmap changed)...
Feb 01 09:42:45 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604210 on np0005604210.localdomain
Feb 01 09:42:45 np0005604215.localdomain ceph-mon[278949]: pgmap v41: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:42:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:46 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)...
Feb 01 09:42:46 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain
Feb 01 09:42:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:42:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:42:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/1047655474' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: pgmap v42: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/1944956596' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/4199490115' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:42:47 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/1047655474' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 01 09:42:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:42:47 np0005604215.localdomain podman[284973]: 2026-02-01 09:42:47.862871293 +0000 UTC m=+0.079074001 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:42:47 np0005604215.localdomain podman[284973]: 2026-02-01 09:42:47.896711744 +0000 UTC m=+0.112914442 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:42:47 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:42:48 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 01 09:42:48 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 01 09:42:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:42:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e82 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: pgmap v43: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.1 (monmap changed)...
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: from='client.34217 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: Reconfig service osd.default_drive_group
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e83 e83: 6 total, 6 up, 6 in
Feb 01 09:42:51 np0005604215.localdomain sshd[279983]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:42:51 np0005604215.localdomain systemd[1]: session-64.scope: Deactivated successfully.
Feb 01 09:42:51 np0005604215.localdomain systemd[1]: session-64.scope: Consumed 18.755s CPU time.
Feb 01 09:42:51 np0005604215.localdomain systemd-logind[761]: Session 64 logged out. Waiting for processes to exit.
Feb 01 09:42:51 np0005604215.localdomain systemd-logind[761]: Removed session 64.
Feb 01 09:42:51 np0005604215.localdomain sshd[284996]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:42:51 np0005604215.localdomain sshd[284996]: Accepted publickey for ceph-admin from 192.168.122.107 port 49628 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 09:42:51 np0005604215.localdomain systemd-logind[761]: New session 65 of user ceph-admin.
Feb 01 09:42:51 np0005604215.localdomain systemd[1]: Started Session 65 of User ceph-admin.
Feb 01 09:42:51 np0005604215.localdomain sshd[284996]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 09:42:51 np0005604215.localdomain sudo[285000]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:51 np0005604215.localdomain sudo[285000]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:51 np0005604215.localdomain sudo[285000]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:51 np0005604215.localdomain sudo[285018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:42:51 np0005604215.localdomain sudo[285018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.4 (monmap changed)...
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: pgmap v44: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.14190 172.18.0.105:0/4078248779' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/1066355409' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: Activating manager daemon np0005604213.caiaeh
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: osdmap e83: 6 total, 6 up, 6 in
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: mgrmap e20: np0005604213.caiaeh(active, starting, since 0.0405236s), standbys: np0005604209.isqrps, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604210.rirrtk", "id": "np0005604210.rirrtk"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: Manager daemon np0005604213.caiaeh is now available
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: removing stray HostCache host record np0005604209.localdomain.devices.0
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"}]': finished
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604209.localdomain.devices.0"}]': finished
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/mirror_snapshot_schedule"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/mirror_snapshot_schedule"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/trash_purge_schedule"} : dispatch
Feb 01 09:42:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/trash_purge_schedule"} : dispatch
Feb 01 09:42:52 np0005604215.localdomain podman[285111]: 2026-02-01 09:42:52.507225474 +0000 UTC m=+0.085714279 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-type=git, distribution-scope=public, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, release=1764794109, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:42:52 np0005604215.localdomain podman[285111]: 2026-02-01 09:42:52.606859659 +0000 UTC m=+0.185348464 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, name=rhceph, GIT_BRANCH=main, release=1764794109, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, ceph=True)
Feb 01 09:42:53 np0005604215.localdomain ceph-mon[278949]: mgrmap e21: np0005604213.caiaeh(active, since 1.05663s), standbys: np0005604209.isqrps, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:42:53 np0005604215.localdomain ceph-mon[278949]: pgmap v3: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:53 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:42:52] ENGINE Bus STARTING
Feb 01 09:42:53 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:42:52] ENGINE Serving on http://172.18.0.107:8765
Feb 01 09:42:53 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:53 np0005604215.localdomain sudo[285018]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:53 np0005604215.localdomain sudo[285232]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:53 np0005604215.localdomain sudo[285232]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:53 np0005604215.localdomain sudo[285232]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:53 np0005604215.localdomain sudo[285250]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:42:53 np0005604215.localdomain sudo[285250]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:54 np0005604215.localdomain sudo[285250]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:42:52] ENGINE Serving on https://172.18.0.107:7150
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:42:52] ENGINE Bus STARTED
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:42:52] ENGINE Client ('172.18.0.107', 41850) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: pgmap v4: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:54 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:42:54 np0005604215.localdomain sudo[285300]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:42:54 np0005604215.localdomain sudo[285300]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:54 np0005604215.localdomain sudo[285300]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:54 np0005604215.localdomain sudo[285318]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:42:54 np0005604215.localdomain sudo[285318]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:54 np0005604215.localdomain sudo[285318]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:55 np0005604215.localdomain ceph-mon[278949]: mgrmap e22: np0005604213.caiaeh(active, since 3s), standbys: np0005604209.isqrps, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv
Feb 01 09:42:55 np0005604215.localdomain sudo[285355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:42:55 np0005604215.localdomain sudo[285355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:42:55 np0005604215.localdomain sudo[285355]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:42:55 np0005604215.localdomain sudo[285385]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:42:55 np0005604215.localdomain sudo[285385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:55 np0005604215.localdomain sudo[285385]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:55 np0005604215.localdomain podman[285373]: 2026-02-01 09:42:55.601419239 +0000 UTC m=+0.090553861 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller)
Feb 01 09:42:55 np0005604215.localdomain podman[285373]: 2026-02-01 09:42:55.638036557 +0000 UTC m=+0.127171239 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller)
Feb 01 09:42:55 np0005604215.localdomain podman[285374]: 2026-02-01 09:42:55.650754557 +0000 UTC m=+0.139919420 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:42:55 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:42:55 np0005604215.localdomain sudo[285416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:55 np0005604215.localdomain podman[285374]: 2026-02-01 09:42:55.662708961 +0000 UTC m=+0.151873824 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:42:55 np0005604215.localdomain sudo[285416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:55 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:42:55 np0005604215.localdomain sudo[285416]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:55 np0005604215.localdomain sudo[285457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:55 np0005604215.localdomain sudo[285457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:55 np0005604215.localdomain sudo[285457]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:55 np0005604215.localdomain sudo[285476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:55 np0005604215.localdomain sudo[285476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:55 np0005604215.localdomain sudo[285476]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:55 np0005604215.localdomain sudo[285510]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:55 np0005604215.localdomain sudo[285510]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:55 np0005604215.localdomain sudo[285510]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain sudo[285528]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:42:56 np0005604215.localdomain sudo[285528]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285528]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain sudo[285546]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain sudo[285546]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285546]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain sudo[285564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:42:56 np0005604215.localdomain sudo[285564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285564]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain sudo[285582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:42:56 np0005604215.localdomain sudo[285582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285582]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain ceph-mon[278949]: Standby manager daemon np0005604211.cuflqz started
Feb 01 09:42:56 np0005604215.localdomain sudo[285600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:56 np0005604215.localdomain sudo[285600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285600]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain sudo[285618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:56 np0005604215.localdomain sudo[285618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285618]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain sudo[285636]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:56 np0005604215.localdomain sudo[285636]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285636]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain sudo[285670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:56 np0005604215.localdomain sudo[285670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285670]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain sudo[285688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:42:56 np0005604215.localdomain sudo[285688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285688]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain sudo[285706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:42:56 np0005604215.localdomain sudo[285706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285706]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain sudo[285724]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:42:56 np0005604215.localdomain sudo[285724]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285724]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:56 np0005604215.localdomain sudo[285742]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:42:56 np0005604215.localdomain sudo[285742]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:56 np0005604215.localdomain sudo[285742]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain sudo[285760]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:42:57 np0005604215.localdomain sudo[285760]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285760]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain sudo[285778]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:57 np0005604215.localdomain sudo[285778]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285778]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain sudo[285796]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:42:57 np0005604215.localdomain sudo[285796]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285796]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain ceph-mon[278949]: mgrmap e23: np0005604213.caiaeh(active, since 5s), standbys: np0005604209.isqrps, np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv, np0005604211.cuflqz
Feb 01 09:42:57 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604211.cuflqz", "id": "np0005604211.cuflqz"} : dispatch
Feb 01 09:42:57 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:42:57 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:42:57 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:42:57 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:42:57 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:42:57 np0005604215.localdomain ceph-mon[278949]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:42:57 np0005604215.localdomain sudo[285830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:42:57 np0005604215.localdomain sudo[285830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285830]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain sudo[285848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:42:57 np0005604215.localdomain sudo[285848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285848]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain sudo[285866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 01 09:42:57 np0005604215.localdomain sudo[285866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285866]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain sudo[285884]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:42:57 np0005604215.localdomain sudo[285884]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285884]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain sudo[285902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:42:57 np0005604215.localdomain sudo[285902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285902]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain sudo[285920]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:42:57 np0005604215.localdomain sudo[285920]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285920]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain sudo[285938]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:42:57 np0005604215.localdomain sudo[285938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285938]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain sudo[285956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:42:57 np0005604215.localdomain sudo[285956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285956]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:57 np0005604215.localdomain sudo[285990]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:42:57 np0005604215.localdomain sudo[285990]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:57 np0005604215.localdomain sudo[285990]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:58 np0005604215.localdomain sudo[286008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:42:58 np0005604215.localdomain sudo[286008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:58 np0005604215.localdomain sudo[286008]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:58 np0005604215.localdomain sudo[286026]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:42:58 np0005604215.localdomain sudo[286026]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:58 np0005604215.localdomain sudo[286026]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:58 np0005604215.localdomain sudo[286044]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:42:58 np0005604215.localdomain sudo[286044]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:42:58 np0005604215.localdomain sudo[286044]: pam_unix(sudo:session): session closed for user root
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:42:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:43:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:43:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:43:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:43:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:43:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17776 "" "Go-http-client/1.1"
Feb 01 09:43:00 np0005604215.localdomain ceph-mon[278949]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 0 B/s wr, 18 op/s
Feb 01 09:43:00 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 01 09:43:00 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 01 09:43:00 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 01 09:43:00 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 01 09:43:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:43:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:43:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:43:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:01 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 01 09:43:01 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:43:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:43:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:43:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:43:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:43:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:43:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:43:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:43:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:43:02 np0005604215.localdomain ceph-mon[278949]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 25 KiB/s rd, 0 B/s wr, 14 op/s
Feb 01 09:43:02 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 01 09:43:02 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 01 09:43:02 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:02 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:02 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:02 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 01 09:43:02 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:43:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:43:02 np0005604215.localdomain podman[286062]: 2026-02-01 09:43:02.892992348 +0000 UTC m=+0.073381542 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, release=1769056855)
Feb 01 09:43:02 np0005604215.localdomain podman[286062]: 2026-02-01 09:43:02.904835539 +0000 UTC m=+0.085224733 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., architecture=x86_64, build-date=2026-01-22T05:09:47Z, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, version=9.7, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_id=openstack_network_exporter)
Feb 01 09:43:02 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:43:02 np0005604215.localdomain podman[286063]: 2026-02-01 09:43:02.957256484 +0000 UTC m=+0.130183924 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Feb 01 09:43:02 np0005604215.localdomain podman[286063]: 2026-02-01 09:43:02.966619917 +0000 UTC m=+0.139547337 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 01 09:43:02 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:43:03 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.0 (monmap changed)...
Feb 01 09:43:03 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:43:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 01 09:43:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.404 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:43:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.3 (monmap changed)...
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: from='client.34267 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:43:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:05 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 01 09:43:05 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 01 09:43:05 np0005604215.localdomain ceph-mon[278949]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:43:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:05 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 01 09:43:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:43:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:43:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:43:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:05 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:43:06 np0005604215.localdomain ceph-mon[278949]: from='client.34343 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:43:06 np0005604215.localdomain ceph-mon[278949]: Saving service mon spec with placement label:mon
Feb 01 09:43:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:06 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604213 (monmap changed)...
Feb 01 09:43:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:43:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:43:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:06 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 01 09:43:06 np0005604215.localdomain sudo[286101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:43:06 np0005604215.localdomain sudo[286101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:06 np0005604215.localdomain sudo[286101]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:06 np0005604215.localdomain sudo[286119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:43:06 np0005604215.localdomain sudo[286119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:06 np0005604215.localdomain podman[286153]: 
Feb 01 09:43:06 np0005604215.localdomain podman[286153]: 2026-02-01 09:43:06.885642561 +0000 UTC m=+0.068227071 container create 52d3777649e7139cc129c0eef708b23b57fd6ddf5ca55ad00d63de305b18da3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_ritchie, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, release=1764794109, GIT_BRANCH=main, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, version=7, io.openshift.tags=rhceph ceph)
Feb 01 09:43:06 np0005604215.localdomain systemd[1]: Started libpod-conmon-52d3777649e7139cc129c0eef708b23b57fd6ddf5ca55ad00d63de305b18da3d.scope.
Feb 01 09:43:06 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:43:06 np0005604215.localdomain podman[286153]: 2026-02-01 09:43:06.951435985 +0000 UTC m=+0.134020405 container init 52d3777649e7139cc129c0eef708b23b57fd6ddf5ca55ad00d63de305b18da3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_ritchie, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, release=1764794109, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_BRANCH=main, ceph=True, version=7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:43:06 np0005604215.localdomain podman[286153]: 2026-02-01 09:43:06.859896363 +0000 UTC m=+0.042480813 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:43:06 np0005604215.localdomain podman[286153]: 2026-02-01 09:43:06.962418969 +0000 UTC m=+0.145003399 container start 52d3777649e7139cc129c0eef708b23b57fd6ddf5ca55ad00d63de305b18da3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_ritchie, release=1764794109, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, CEPH_POINT_RELEASE=)
Feb 01 09:43:06 np0005604215.localdomain podman[286153]: 2026-02-01 09:43:06.962687947 +0000 UTC m=+0.145272387 container attach 52d3777649e7139cc129c0eef708b23b57fd6ddf5ca55ad00d63de305b18da3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_ritchie, version=7, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, vcs-type=git, distribution-scope=public, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, ceph=True, release=1764794109, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:43:06 np0005604215.localdomain interesting_ritchie[286168]: 167 167
Feb 01 09:43:06 np0005604215.localdomain systemd[1]: libpod-52d3777649e7139cc129c0eef708b23b57fd6ddf5ca55ad00d63de305b18da3d.scope: Deactivated successfully.
Feb 01 09:43:06 np0005604215.localdomain podman[286153]: 2026-02-01 09:43:06.966032342 +0000 UTC m=+0.148616782 container died 52d3777649e7139cc129c0eef708b23b57fd6ddf5ca55ad00d63de305b18da3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_ritchie, version=7, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.expose-services=, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, architecture=x86_64)
Feb 01 09:43:07 np0005604215.localdomain podman[286173]: 2026-02-01 09:43:07.058646897 +0000 UTC m=+0.083693156 container remove 52d3777649e7139cc129c0eef708b23b57fd6ddf5ca55ad00d63de305b18da3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=interesting_ritchie, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, vcs-type=git, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.buildah.version=1.41.4, com.redhat.component=rhceph-container, release=1764794109, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, description=Red Hat Ceph Storage 7)
Feb 01 09:43:07 np0005604215.localdomain systemd[1]: libpod-conmon-52d3777649e7139cc129c0eef708b23b57fd6ddf5ca55ad00d63de305b18da3d.scope: Deactivated successfully.
Feb 01 09:43:07 np0005604215.localdomain sudo[286119]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:07 np0005604215.localdomain sudo[286189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:43:07 np0005604215.localdomain sudo[286189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:07 np0005604215.localdomain sudo[286189]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:07 np0005604215.localdomain sudo[286207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:43:07 np0005604215.localdomain sudo[286207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: from='client.34348 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604212", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.2 (monmap changed)...
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:07 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 01 09:43:07 np0005604215.localdomain podman[286243]: 
Feb 01 09:43:07 np0005604215.localdomain podman[286243]: 2026-02-01 09:43:07.746610244 +0000 UTC m=+0.076882103 container create b628758ffcc4605feb9996548cd4ed4e845742d5613ec37e4b24e77c028b605b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_ellis, ceph=True, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109)
Feb 01 09:43:07 np0005604215.localdomain systemd[1]: Started libpod-conmon-b628758ffcc4605feb9996548cd4ed4e845742d5613ec37e4b24e77c028b605b.scope.
Feb 01 09:43:07 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:43:07 np0005604215.localdomain podman[286243]: 2026-02-01 09:43:07.811088096 +0000 UTC m=+0.141359995 container init b628758ffcc4605feb9996548cd4ed4e845742d5613ec37e4b24e77c028b605b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_ellis, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_CLEAN=True, RELEASE=main, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, architecture=x86_64, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109)
Feb 01 09:43:07 np0005604215.localdomain podman[286243]: 2026-02-01 09:43:07.715839799 +0000 UTC m=+0.046111698 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:43:07 np0005604215.localdomain podman[286243]: 2026-02-01 09:43:07.822532875 +0000 UTC m=+0.152804724 container start b628758ffcc4605feb9996548cd4ed4e845742d5613ec37e4b24e77c028b605b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_ellis, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.41.4, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, distribution-scope=public)
Feb 01 09:43:07 np0005604215.localdomain podman[286243]: 2026-02-01 09:43:07.822941848 +0000 UTC m=+0.153213697 container attach b628758ffcc4605feb9996548cd4ed4e845742d5613ec37e4b24e77c028b605b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_ellis, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, vendor=Red Hat, Inc., RELEASE=main, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, ceph=True, GIT_CLEAN=True, architecture=x86_64, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:43:07 np0005604215.localdomain stupefied_ellis[286258]: 167 167
Feb 01 09:43:07 np0005604215.localdomain systemd[1]: libpod-b628758ffcc4605feb9996548cd4ed4e845742d5613ec37e4b24e77c028b605b.scope: Deactivated successfully.
Feb 01 09:43:07 np0005604215.localdomain podman[286243]: 2026-02-01 09:43:07.826486529 +0000 UTC m=+0.156758398 container died b628758ffcc4605feb9996548cd4ed4e845742d5613ec37e4b24e77c028b605b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_ellis, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, vendor=Red Hat, Inc., release=1764794109, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:43:07 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-cf8cf44bb348c9ea2d8dcfa3b45cd020c7f772e2941b4f1ee2e3a46865527e17-merged.mount: Deactivated successfully.
Feb 01 09:43:07 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-3f447bde679b1d24d5382b25f8629a43d466226929c54cdaf02cf53c37e2c0d5-merged.mount: Deactivated successfully.
Feb 01 09:43:07 np0005604215.localdomain podman[286263]: 2026-02-01 09:43:07.930275474 +0000 UTC m=+0.090725846 container remove b628758ffcc4605feb9996548cd4ed4e845742d5613ec37e4b24e77c028b605b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stupefied_ellis, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, ceph=True, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:43:07 np0005604215.localdomain systemd[1]: libpod-conmon-b628758ffcc4605feb9996548cd4ed4e845742d5613ec37e4b24e77c028b605b.scope: Deactivated successfully.
Feb 01 09:43:08 np0005604215.localdomain sudo[286207]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:08 np0005604215.localdomain sudo[286285]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:43:08 np0005604215.localdomain sudo[286285]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:08 np0005604215.localdomain sudo[286285]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:08 np0005604215.localdomain sudo[286303]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:43:08 np0005604215.localdomain sudo[286303]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:08 np0005604215.localdomain podman[286338]: 
Feb 01 09:43:08 np0005604215.localdomain podman[286338]: 2026-02-01 09:43:08.803496261 +0000 UTC m=+0.075326813 container create 087efad5aff32d603da4573c076b8607d80c9ecb109fa0cf1a2ac0a967141798 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_wu, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.openshift.expose-services=, architecture=x86_64, GIT_BRANCH=main, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., name=rhceph, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git)
Feb 01 09:43:08 np0005604215.localdomain systemd[1]: Started libpod-conmon-087efad5aff32d603da4573c076b8607d80c9ecb109fa0cf1a2ac0a967141798.scope.
Feb 01 09:43:08 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:43:08 np0005604215.localdomain podman[286338]: 2026-02-01 09:43:08.87169346 +0000 UTC m=+0.143524012 container init 087efad5aff32d603da4573c076b8607d80c9ecb109fa0cf1a2ac0a967141798 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_wu, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, io.openshift.expose-services=, ceph=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, architecture=x86_64, version=7, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:43:08 np0005604215.localdomain podman[286338]: 2026-02-01 09:43:08.773847591 +0000 UTC m=+0.045678173 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:43:08 np0005604215.localdomain podman[286338]: 2026-02-01 09:43:08.88125852 +0000 UTC m=+0.153089082 container start 087efad5aff32d603da4573c076b8607d80c9ecb109fa0cf1a2ac0a967141798 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_wu, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., ceph=True, vcs-type=git, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, architecture=x86_64, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Feb 01 09:43:08 np0005604215.localdomain podman[286338]: 2026-02-01 09:43:08.881546639 +0000 UTC m=+0.153377241 container attach 087efad5aff32d603da4573c076b8607d80c9ecb109fa0cf1a2ac0a967141798 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_wu, version=7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=1764794109, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-type=git, GIT_BRANCH=main, ceph=True)
Feb 01 09:43:08 np0005604215.localdomain boring_wu[286353]: 167 167
Feb 01 09:43:08 np0005604215.localdomain systemd[1]: libpod-087efad5aff32d603da4573c076b8607d80c9ecb109fa0cf1a2ac0a967141798.scope: Deactivated successfully.
Feb 01 09:43:08 np0005604215.localdomain podman[286338]: 2026-02-01 09:43:08.884017117 +0000 UTC m=+0.155847689 container died 087efad5aff32d603da4573c076b8607d80c9ecb109fa0cf1a2ac0a967141798 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_wu, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, architecture=x86_64, io.buildah.version=1.41.4, distribution-scope=public, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:43:08 np0005604215.localdomain systemd[1]: tmp-crun.nWjMYG.mount: Deactivated successfully.
Feb 01 09:43:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-3c2ac1cd2b5dc283264ce2a72a7b16b21161e1cb1490c31004d3b1894bf73256-merged.mount: Deactivated successfully.
Feb 01 09:43:08 np0005604215.localdomain podman[286359]: 2026-02-01 09:43:08.993973625 +0000 UTC m=+0.097008434 container remove 087efad5aff32d603da4573c076b8607d80c9ecb109fa0cf1a2ac0a967141798 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=boring_wu, RELEASE=main, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, name=rhceph, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:43:08 np0005604215.localdomain systemd[1]: libpod-conmon-087efad5aff32d603da4573c076b8607d80c9ecb109fa0cf1a2ac0a967141798.scope: Deactivated successfully.
Feb 01 09:43:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 01 09:43:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:09 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/2970647074' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 01 09:43:09 np0005604215.localdomain sudo[286303]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:09 np0005604215.localdomain sudo[286383]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:43:09 np0005604215.localdomain sudo[286383]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:09 np0005604215.localdomain sudo[286383]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:09 np0005604215.localdomain sudo[286401]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:43:09 np0005604215.localdomain sudo[286401]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:09 np0005604215.localdomain podman[286435]: 
Feb 01 09:43:09 np0005604215.localdomain podman[286435]: 2026-02-01 09:43:09.905485823 +0000 UTC m=+0.075648804 container create 91f505ad3d954de0519d85aa866a829925952846448c010a7436941ab9b26713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_dewdney, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhceph ceph, release=1764794109, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z)
Feb 01 09:43:09 np0005604215.localdomain systemd[1]: Started libpod-conmon-91f505ad3d954de0519d85aa866a829925952846448c010a7436941ab9b26713.scope.
Feb 01 09:43:09 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:43:09 np0005604215.localdomain podman[286435]: 2026-02-01 09:43:09.973192206 +0000 UTC m=+0.143355187 container init 91f505ad3d954de0519d85aa866a829925952846448c010a7436941ab9b26713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_dewdney, distribution-scope=public, release=1764794109, com.redhat.component=rhceph-container, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:43:09 np0005604215.localdomain podman[286435]: 2026-02-01 09:43:09.874964485 +0000 UTC m=+0.045127466 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:43:09 np0005604215.localdomain podman[286435]: 2026-02-01 09:43:09.981854419 +0000 UTC m=+0.152017390 container start 91f505ad3d954de0519d85aa866a829925952846448c010a7436941ab9b26713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_dewdney, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, vcs-type=git, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.buildah.version=1.41.4, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z)
Feb 01 09:43:09 np0005604215.localdomain podman[286435]: 2026-02-01 09:43:09.982123577 +0000 UTC m=+0.152286558 container attach 91f505ad3d954de0519d85aa866a829925952846448c010a7436941ab9b26713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_dewdney, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, io.buildah.version=1.41.4, distribution-scope=public, release=1764794109, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_BRANCH=main)
Feb 01 09:43:09 np0005604215.localdomain determined_dewdney[286450]: 167 167
Feb 01 09:43:09 np0005604215.localdomain systemd[1]: libpod-91f505ad3d954de0519d85aa866a829925952846448c010a7436941ab9b26713.scope: Deactivated successfully.
Feb 01 09:43:09 np0005604215.localdomain podman[286435]: 2026-02-01 09:43:09.985196903 +0000 UTC m=+0.155359894 container died 91f505ad3d954de0519d85aa866a829925952846448c010a7436941ab9b26713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_dewdney, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, architecture=x86_64, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, vcs-type=git, io.openshift.tags=rhceph ceph, distribution-scope=public, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vendor=Red Hat, Inc., ceph=True, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7)
Feb 01 09:43:10 np0005604215.localdomain podman[286455]: 2026-02-01 09:43:10.078969494 +0000 UTC m=+0.084540703 container remove 91f505ad3d954de0519d85aa866a829925952846448c010a7436941ab9b26713 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=determined_dewdney, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph)
Feb 01 09:43:10 np0005604215.localdomain systemd[1]: libpod-conmon-91f505ad3d954de0519d85aa866a829925952846448c010a7436941ab9b26713.scope: Deactivated successfully.
Feb 01 09:43:10 np0005604215.localdomain sudo[286401]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:10 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.5 (monmap changed)...
Feb 01 09:43:10 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 01 09:43:10 np0005604215.localdomain ceph-mon[278949]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:43:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:43:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:43:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:10 np0005604215.localdomain sudo[286472]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:43:10 np0005604215.localdomain sudo[286472]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:10 np0005604215.localdomain sudo[286472]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:10 np0005604215.localdomain sudo[286490]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:43:10 np0005604215.localdomain sudo[286490]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:10 np0005604215.localdomain podman[286526]: 
Feb 01 09:43:10 np0005604215.localdomain podman[286526]: 2026-02-01 09:43:10.814593136 +0000 UTC m=+0.079087681 container create c899daceff4522ed65a81a07fb017999c8c8078003d3eebefeba239443b68e04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_taussig, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, distribution-scope=public, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4)
Feb 01 09:43:10 np0005604215.localdomain systemd[1]: Started libpod-conmon-c899daceff4522ed65a81a07fb017999c8c8078003d3eebefeba239443b68e04.scope.
Feb 01 09:43:10 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:43:10 np0005604215.localdomain podman[286526]: 2026-02-01 09:43:10.876812927 +0000 UTC m=+0.141307442 container init c899daceff4522ed65a81a07fb017999c8c8078003d3eebefeba239443b68e04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_taussig, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, GIT_BRANCH=main, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, ceph=True, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64)
Feb 01 09:43:10 np0005604215.localdomain podman[286526]: 2026-02-01 09:43:10.7841045 +0000 UTC m=+0.048599045 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:43:10 np0005604215.localdomain podman[286526]: 2026-02-01 09:43:10.885868571 +0000 UTC m=+0.150363116 container start c899daceff4522ed65a81a07fb017999c8c8078003d3eebefeba239443b68e04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_taussig, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, RELEASE=main, vendor=Red Hat, Inc., release=1764794109, architecture=x86_64, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.buildah.version=1.41.4, ceph=True, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, version=7)
Feb 01 09:43:10 np0005604215.localdomain relaxed_taussig[286541]: 167 167
Feb 01 09:43:10 np0005604215.localdomain podman[286526]: 2026-02-01 09:43:10.886281474 +0000 UTC m=+0.150776009 container attach c899daceff4522ed65a81a07fb017999c8c8078003d3eebefeba239443b68e04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_taussig, version=7, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, release=1764794109, architecture=x86_64, GIT_CLEAN=True, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, name=rhceph, RELEASE=main, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph)
Feb 01 09:43:10 np0005604215.localdomain systemd[1]: libpod-c899daceff4522ed65a81a07fb017999c8c8078003d3eebefeba239443b68e04.scope: Deactivated successfully.
Feb 01 09:43:10 np0005604215.localdomain podman[286526]: 2026-02-01 09:43:10.8896485 +0000 UTC m=+0.154143035 container died c899daceff4522ed65a81a07fb017999c8c8078003d3eebefeba239443b68e04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_taussig, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, release=1764794109, vcs-type=git, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:43:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-333cdf6ed04606b2443d6c45c9aaacf06093307685a9232066d175831129dab4-merged.mount: Deactivated successfully.
Feb 01 09:43:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:43:10 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-8f2c777c9ebb8086f65143a5e908dc6786b140d5db04925da3e8b2bef752823b-merged.mount: Deactivated successfully.
Feb 01 09:43:11 np0005604215.localdomain podman[286547]: 2026-02-01 09:43:11.00442976 +0000 UTC m=+0.101799254 container remove c899daceff4522ed65a81a07fb017999c8c8078003d3eebefeba239443b68e04 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=relaxed_taussig, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, release=1764794109, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, architecture=x86_64)
Feb 01 09:43:11 np0005604215.localdomain systemd[1]: libpod-conmon-c899daceff4522ed65a81a07fb017999c8c8078003d3eebefeba239443b68e04.scope: Deactivated successfully.
Feb 01 09:43:11 np0005604215.localdomain sudo[286490]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:11 np0005604215.localdomain podman[286556]: 2026-02-01 09:43:11.087220176 +0000 UTC m=+0.152692419 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute)
Feb 01 09:43:11 np0005604215.localdomain podman[286556]: 2026-02-01 09:43:11.100998908 +0000 UTC m=+0.166471201 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:43:11 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:43:11 np0005604215.localdomain sudo[286584]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:43:11 np0005604215.localdomain sudo[286584]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:11 np0005604215.localdomain sudo[286584]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)...
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604215 (monmap changed)...
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:11 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604215 on np0005604215.localdomain
Feb 01 09:43:11 np0005604215.localdomain sudo[286602]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:43:11 np0005604215.localdomain sudo[286602]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:11 np0005604215.localdomain podman[286637]: 
Feb 01 09:43:11 np0005604215.localdomain podman[286637]: 2026-02-01 09:43:11.669443937 +0000 UTC m=+0.045548980 container create 5a2b9b6dabb51b23c433dd4ad69153481cc77923b404ca4f90b8962c9bcab9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_swartz, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, distribution-scope=public, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:43:11 np0005604215.localdomain systemd[1]: Started libpod-conmon-5a2b9b6dabb51b23c433dd4ad69153481cc77923b404ca4f90b8962c9bcab9b5.scope.
Feb 01 09:43:11 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:43:11 np0005604215.localdomain podman[286637]: 2026-02-01 09:43:11.719460906 +0000 UTC m=+0.095565969 container init 5a2b9b6dabb51b23c433dd4ad69153481cc77923b404ca4f90b8962c9bcab9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_swartz, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, name=rhceph, RELEASE=main, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, architecture=x86_64, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, distribution-scope=public, vendor=Red Hat, Inc., version=7)
Feb 01 09:43:11 np0005604215.localdomain podman[286637]: 2026-02-01 09:43:11.726739043 +0000 UTC m=+0.102844106 container start 5a2b9b6dabb51b23c433dd4ad69153481cc77923b404ca4f90b8962c9bcab9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_swartz, release=1764794109, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_BRANCH=main, architecture=x86_64, version=7, vendor=Red Hat, Inc., io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z)
Feb 01 09:43:11 np0005604215.localdomain podman[286637]: 2026-02-01 09:43:11.727369294 +0000 UTC m=+0.103474357 container attach 5a2b9b6dabb51b23c433dd4ad69153481cc77923b404ca4f90b8962c9bcab9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_swartz, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, name=rhceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., release=1764794109, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, version=7, GIT_CLEAN=True, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:43:11 np0005604215.localdomain unruffled_swartz[286652]: 167 167
Feb 01 09:43:11 np0005604215.localdomain systemd[1]: libpod-5a2b9b6dabb51b23c433dd4ad69153481cc77923b404ca4f90b8962c9bcab9b5.scope: Deactivated successfully.
Feb 01 09:43:11 np0005604215.localdomain podman[286637]: 2026-02-01 09:43:11.730035767 +0000 UTC m=+0.106140830 container died 5a2b9b6dabb51b23c433dd4ad69153481cc77923b404ca4f90b8962c9bcab9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_swartz, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, description=Red Hat Ceph Storage 7, architecture=x86_64, com.redhat.component=rhceph-container, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:43:11 np0005604215.localdomain podman[286637]: 2026-02-01 09:43:11.65200214 +0000 UTC m=+0.028107183 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:43:11 np0005604215.localdomain podman[286658]: 2026-02-01 09:43:11.819635587 +0000 UTC m=+0.081891050 container remove 5a2b9b6dabb51b23c433dd4ad69153481cc77923b404ca4f90b8962c9bcab9b5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=unruffled_swartz, io.openshift.expose-services=, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container)
Feb 01 09:43:11 np0005604215.localdomain systemd[1]: libpod-conmon-5a2b9b6dabb51b23c433dd4ad69153481cc77923b404ca4f90b8962c9bcab9b5.scope: Deactivated successfully.
Feb 01 09:43:11 np0005604215.localdomain sudo[286602]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:12 np0005604215.localdomain sudo[286675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:43:12 np0005604215.localdomain sudo[286675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:12 np0005604215.localdomain sudo[286675]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:43:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:43:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:43:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:43:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:14 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e83 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:14 np0005604215.localdomain ceph-mon[278949]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 565 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:43:14 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604210 (monmap changed)...
Feb 01 09:43:14 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604210 on np0005604210.localdomain
Feb 01 09:43:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:43:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:43:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:14 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/1929927047' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Feb 01 09:43:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e84 e84: 6 total, 6 up, 6 in
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604210"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604211"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604212"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).mds e16 all = 0
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).mds e16 all = 0
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).mds e16 all = 0
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604210.rirrtk", "id": "np0005604210.rirrtk"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604210.rirrtk", "id": "np0005604210.rirrtk"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604211.cuflqz", "id": "np0005604211.cuflqz"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604211.cuflqz", "id": "np0005604211.cuflqz"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mds metadata"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mon metadata"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604211 (monmap changed)...
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604212 (monmap changed)...
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 172.18.0.107:0/3506477835' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/474945783' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: Activating manager daemon np0005604209.isqrps
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: osdmap e84: 6 total, 6 up, 6 in
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mgrmap e24: np0005604209.isqrps(active, starting, since 0.0496261s), standbys: np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv, np0005604211.cuflqz
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.17070 ' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).mds e16 all = 1
Feb 01 09:43:15 np0005604215.localdomain sshd[284996]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:43:15 np0005604215.localdomain systemd[1]: session-65.scope: Deactivated successfully.
Feb 01 09:43:15 np0005604215.localdomain systemd[1]: session-65.scope: Consumed 10.529s CPU time.
Feb 01 09:43:15 np0005604215.localdomain systemd-logind[761]: Session 65 logged out. Waiting for processes to exit.
Feb 01 09:43:15 np0005604215.localdomain systemd-logind[761]: Removed session 65.
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/mirror_snapshot_schedule"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/mirror_snapshot_schedule"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/trash_purge_schedule"} v 0)
Feb 01 09:43:15 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/trash_purge_schedule"} : dispatch
Feb 01 09:43:15 np0005604215.localdomain sshd[286693]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:43:15 np0005604215.localdomain sshd[286693]: Accepted publickey for ceph-admin from 192.168.122.103 port 50980 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 09:43:15 np0005604215.localdomain systemd-logind[761]: New session 66 of user ceph-admin.
Feb 01 09:43:15 np0005604215.localdomain systemd[1]: Started Session 66 of User ceph-admin.
Feb 01 09:43:15 np0005604215.localdomain sshd[286693]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 09:43:15 np0005604215.localdomain sudo[286697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:43:15 np0005604215.localdomain sudo[286697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:15 np0005604215.localdomain sudo[286697]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:15 np0005604215.localdomain sudo[286715]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:43:15 np0005604215.localdomain sudo[286715]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604210.rirrtk", "id": "np0005604210.rirrtk"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604211.cuflqz", "id": "np0005604211.cuflqz"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: Manager daemon np0005604209.isqrps is now available
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/mirror_snapshot_schedule"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/mirror_snapshot_schedule"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/trash_purge_schedule"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/trash_purge_schedule"} : dispatch
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.302878) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996302917, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 6152, "num_deletes": 759, "total_data_size": 20046824, "memory_usage": 21016520, "flush_reason": "Manual Compaction"}
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996372553, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 11726491, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 9175, "largest_seqno": 15322, "table_properties": {"data_size": 11703142, "index_size": 14734, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 7557, "raw_key_size": 66356, "raw_average_key_size": 22, "raw_value_size": 11647748, "raw_average_value_size": 3876, "num_data_blocks": 640, "num_entries": 3005, "num_filter_entries": 3005, "num_deletions": 756, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938870, "oldest_key_time": 1769938870, "file_creation_time": 1769938996, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 70064 microseconds, and 19946 cpu microseconds.
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.372934) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 11726491 bytes OK
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.373063) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.374990) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.375015) EVENT_LOG_v1 {"time_micros": 1769938996375009, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.375037) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 20016009, prev total WAL file size 20016009, number of live WAL files 2.
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.378983) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130303430' seq:72057594037927935, type:22 .. '7061786F73003130323932' seq:0, type:0; will stop at (end)
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(11MB)], [15(8830KB)]
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996379061, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 20768923, "oldest_snapshot_seqno": -1}
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 10076 keys, 16967028 bytes, temperature: kUnknown
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996483740, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 16967028, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 16905284, "index_size": 35467, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 25221, "raw_key_size": 267877, "raw_average_key_size": 26, "raw_value_size": 16728945, "raw_average_value_size": 1660, "num_data_blocks": 1376, "num_entries": 10076, "num_filter_entries": 10076, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938864, "oldest_key_time": 0, "file_creation_time": 1769938996, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.484200) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 16967028 bytes
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.486148) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 197.9 rd, 161.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(11.2, 8.6 +0.0 blob) out(16.2 +0.0 blob), read-write-amplify(3.2) write-amplify(1.4) OK, records in: 11596, records dropped: 1520 output_compression: NoCompression
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.486177) EVENT_LOG_v1 {"time_micros": 1769938996486165, "job": 6, "event": "compaction_finished", "compaction_time_micros": 104934, "compaction_time_cpu_micros": 46467, "output_level": 6, "num_output_files": 1, "total_output_size": 16967028, "num_input_records": 11596, "num_output_records": 10076, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996488476, "job": 6, "event": "table_file_deletion", "file_number": 17}
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769938996489976, "job": 6, "event": "table_file_deletion", "file_number": 15}
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.378891) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.490187) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.490195) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.490198) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.490201) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:16 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:16.490204) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:16 np0005604215.localdomain podman[286803]: 2026-02-01 09:43:16.691500235 +0000 UTC m=+0.086998030 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, name=rhceph, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, version=7, ceph=True, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, com.redhat.component=rhceph-container, RELEASE=main)
Feb 01 09:43:16 np0005604215.localdomain podman[286803]: 2026-02-01 09:43:16.822989679 +0000 UTC m=+0.218487444 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.expose-services=, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, release=1764794109, vcs-type=git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604210.localdomain.devices.0}] v 0)
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604210.localdomain}] v 0)
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: mgrmap e25: np0005604209.isqrps(active, since 1.19462s), standbys: np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv, np0005604211.cuflqz
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain.devices.0}] v 0)
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain}] v 0)
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:43:17 np0005604215.localdomain sudo[286715]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:43:17 np0005604215.localdomain sudo[286921]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:43:17 np0005604215.localdomain sudo[286921]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:17 np0005604215.localdomain sudo[286921]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:43:17 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:43:17 np0005604215.localdomain sudo[286939]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:43:17 np0005604215.localdomain sudo[286939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:43:17] ENGINE Bus STARTING
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:18 np0005604215.localdomain sudo[286939]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:18 np0005604215.localdomain sudo[286988]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:43:18 np0005604215.localdomain sudo[286988]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:43:18 np0005604215.localdomain sudo[286988]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:18 np0005604215.localdomain podman[287006]: 2026-02-01 09:43:18.547846345 +0000 UTC m=+0.089261470 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:43:18 np0005604215.localdomain podman[287006]: 2026-02-01 09:43:18.562620179 +0000 UTC m=+0.104035284 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:43:18 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:43:18 np0005604215.localdomain sudo[287018]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:43:18 np0005604215.localdomain sudo[287018]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604210.localdomain.devices.0}] v 0)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604210.localdomain}] v 0)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} v 0)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain.devices.0}] v 0)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain}] v 0)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} v 0)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:43:18 np0005604215.localdomain sudo[287018]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:43:18 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain sudo[287065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:43:19 np0005604215.localdomain sudo[287065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:19 np0005604215.localdomain sudo[287065]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:19 np0005604215.localdomain sudo[287083]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:43:19 np0005604215.localdomain sudo[287083]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:19 np0005604215.localdomain sudo[287083]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:19 np0005604215.localdomain sudo[287101]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:43:19 np0005604215.localdomain sudo[287101]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:19 np0005604215.localdomain sudo[287101]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:19 np0005604215.localdomain sudo[287119]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:43:19 np0005604215.localdomain sudo[287119]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:19 np0005604215.localdomain sudo[287119]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:19 np0005604215.localdomain sudo[287137]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:43:19 np0005604215.localdomain sudo[287137]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:19 np0005604215.localdomain sudo[287137]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: mgrmap e26: np0005604209.isqrps(active, since 3s), standbys: np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv, np0005604211.cuflqz
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:43:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:43:19 np0005604215.localdomain sudo[287171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:43:19 np0005604215.localdomain sudo[287171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:19 np0005604215.localdomain sudo[287171]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:19 np0005604215.localdomain sudo[287189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:43:19 np0005604215.localdomain sudo[287189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:19 np0005604215.localdomain sudo[287189]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:19 np0005604215.localdomain sudo[287207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:43:19 np0005604215.localdomain sudo[287207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:19 np0005604215.localdomain sudo[287207]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:19 np0005604215.localdomain sudo[287225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:43:19 np0005604215.localdomain sudo[287225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:19 np0005604215.localdomain sudo[287225]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:19 np0005604215.localdomain sudo[287243]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:43:19 np0005604215.localdomain sudo[287243]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:19 np0005604215.localdomain sudo[287243]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:20 np0005604215.localdomain sudo[287261]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:43:20 np0005604215.localdomain sudo[287261]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:20 np0005604215.localdomain sudo[287261]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:20 np0005604215.localdomain sudo[287279]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:43:20 np0005604215.localdomain sudo[287279]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:20 np0005604215.localdomain sudo[287279]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:20 np0005604215.localdomain sudo[287297]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:43:20 np0005604215.localdomain sudo[287297]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:20 np0005604215.localdomain sudo[287297]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:20 np0005604215.localdomain sudo[287331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:43:20 np0005604215.localdomain sudo[287331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:20 np0005604215.localdomain sudo[287331]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:20 np0005604215.localdomain sudo[287349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:43:20 np0005604215.localdomain sudo[287349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:20 np0005604215.localdomain sudo[287349]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:20 np0005604215.localdomain sudo[287367]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:43:20 np0005604215.localdomain sudo[287367]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:20 np0005604215.localdomain sudo[287367]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:20 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:43:20 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:43:20 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:43:20 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:43:20 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:43:20 np0005604215.localdomain ceph-mon[278949]: Standby manager daemon np0005604213.caiaeh started
Feb 01 09:43:20 np0005604215.localdomain sudo[287385]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:43:20 np0005604215.localdomain sudo[287385]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:20 np0005604215.localdomain sudo[287385]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:20 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} v 0)
Feb 01 09:43:20 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:43:20 np0005604215.localdomain sudo[287403]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:43:20 np0005604215.localdomain sudo[287403]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:20 np0005604215.localdomain sudo[287403]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:20 np0005604215.localdomain sudo[287421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:43:20 np0005604215.localdomain sudo[287421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:20 np0005604215.localdomain sudo[287421]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:20 np0005604215.localdomain sudo[287439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:43:20 np0005604215.localdomain sudo[287439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:20 np0005604215.localdomain sudo[287439]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:20 np0005604215.localdomain sudo[287457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:43:20 np0005604215.localdomain sudo[287457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:20 np0005604215.localdomain sudo[287457]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain sudo[287491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:43:21 np0005604215.localdomain sudo[287491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:21 np0005604215.localdomain sudo[287491]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain sudo[287509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:43:21 np0005604215.localdomain sudo[287509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:21 np0005604215.localdomain sudo[287509]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain sudo[287527]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 01 09:43:21 np0005604215.localdomain sudo[287527]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:21 np0005604215.localdomain sudo[287527]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain sudo[287545]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:43:21 np0005604215.localdomain sudo[287545]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:21 np0005604215.localdomain sudo[287545]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain sudo[287563]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:43:21 np0005604215.localdomain sudo[287563]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:21 np0005604215.localdomain sudo[287563]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain sudo[287581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:43:21 np0005604215.localdomain sudo[287581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:21 np0005604215.localdomain sudo[287581]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain sudo[287599]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:43:21 np0005604215.localdomain sudo[287599]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:21 np0005604215.localdomain sudo[287599]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain sudo[287617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:43:21 np0005604215.localdomain sudo[287617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:21 np0005604215.localdomain sudo[287617]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: mgrmap e27: np0005604209.isqrps(active, since 5s), standbys: np0005604210.rirrtk, np0005604212.oynhpm, np0005604215.uhhqtv, np0005604211.cuflqz, np0005604213.caiaeh
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604210.localdomain.devices.0}] v 0)
Feb 01 09:43:21 np0005604215.localdomain sudo[287651]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:43:21 np0005604215.localdomain sudo[287651]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:21 np0005604215.localdomain sudo[287651]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604210.localdomain}] v 0)
Feb 01 09:43:21 np0005604215.localdomain sudo[287669]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:43:21 np0005604215.localdomain sudo[287669]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:21 np0005604215.localdomain sudo[287669]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain.devices.0}] v 0)
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:43:21 np0005604215.localdomain sudo[287687]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:43:21 np0005604215.localdomain sudo[287687]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:21 np0005604215.localdomain sudo[287687]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain}] v 0)
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:43:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.162348) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002162389, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 654, "num_deletes": 256, "total_data_size": 2377265, "memory_usage": 2410704, "flush_reason": "Manual Compaction"}
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002171417, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 1502961, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15327, "largest_seqno": 15976, "table_properties": {"data_size": 1499505, "index_size": 1311, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8247, "raw_average_key_size": 19, "raw_value_size": 1492156, "raw_average_value_size": 3470, "num_data_blocks": 51, "num_entries": 430, "num_filter_entries": 430, "num_deletions": 256, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938997, "oldest_key_time": 1769938997, "file_creation_time": 1769939002, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 9114 microseconds, and 4138 cpu microseconds.
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.171461) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 1502961 bytes OK
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.171483) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.175099) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.175122) EVENT_LOG_v1 {"time_micros": 1769939002175116, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.175141) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 2373439, prev total WAL file size 2373439, number of live WAL files 2.
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.176013) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031303034' seq:72057594037927935, type:22 .. '6B760031323631' seq:0, type:0; will stop at (end)
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(1467KB)], [18(16MB)]
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002176053, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 18469989, "oldest_snapshot_seqno": -1}
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 9968 keys, 17441044 bytes, temperature: kUnknown
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002277524, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 17441044, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 17380922, "index_size": 34101, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 24965, "raw_key_size": 267328, "raw_average_key_size": 26, "raw_value_size": 17207177, "raw_average_value_size": 1726, "num_data_blocks": 1299, "num_entries": 9968, "num_filter_entries": 9968, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938864, "oldest_key_time": 0, "file_creation_time": 1769939002, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.277867) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 17441044 bytes
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.279680) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 181.8 rd, 171.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.4, 16.2 +0.0 blob) out(16.6 +0.0 blob), read-write-amplify(23.9) write-amplify(11.6) OK, records in: 10506, records dropped: 538 output_compression: NoCompression
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.279714) EVENT_LOG_v1 {"time_micros": 1769939002279700, "job": 8, "event": "compaction_finished", "compaction_time_micros": 101587, "compaction_time_cpu_micros": 46301, "output_level": 6, "num_output_files": 1, "total_output_size": 17441044, "num_input_records": 10506, "num_output_records": 9968, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002280069, "job": 8, "event": "table_file_deletion", "file_number": 20}
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939002282682, "job": 8, "event": "table_file_deletion", "file_number": 18}
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.175941) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.282935) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.282943) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.282946) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.282948) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:43:22.282951) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:43:22 np0005604215.localdomain sudo[287705]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:43:22 np0005604215.localdomain sudo[287705]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:22 np0005604215.localdomain sudo[287705]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:43:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604212 (monmap changed)...
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:43:22] ENGINE Error in 'start' listener <bound method Server.start of <cephadm.service_discovery.Root object at 0x7f6ff1c161c0>>
                                                           Traceback (most recent call last):
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 230, in publish
                                                               output.append(listener(*args, **kwargs))
                                                             File "/lib/python3.9/site-packages/cherrypy/_cpserver.py", line 180, in start
                                                               super(Server, self).start()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 184, in start
                                                               self.wait()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 260, in wait
                                                               portend.occupied(*self.bound_addr, timeout=Timeouts.occupied)
                                                             File "/lib/python3.9/site-packages/portend.py", line 162, in occupied
                                                               raise Timeout("Port {port} not bound on {host}.".format(**locals()))
                                                           portend.Timeout: Port 8765 not bound on 172.18.0.103.
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 172.18.0.103:0/3314697342' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:43:23 np0005604215.localdomain sudo[287723]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:43:23 np0005604215.localdomain sudo[287723]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:43:23 np0005604215.localdomain sudo[287723]: pam_unix(sudo:session): session closed for user root
Feb 01 09:43:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:25 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e9 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:43:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:43:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:43:25 np0005604215.localdomain podman[287742]: 2026-02-01 09:43:25.877403233 +0000 UTC m=+0.089948822 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:43:25 np0005604215.localdomain podman[287742]: 2026-02-01 09:43:25.89164953 +0000 UTC m=+0.104195059 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:43:25 np0005604215.localdomain systemd[1]: tmp-crun.udhjZn.mount: Deactivated successfully.
Feb 01 09:43:25 np0005604215.localdomain podman[287741]: 2026-02-01 09:43:25.924457868 +0000 UTC m=+0.137764942 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Feb 01 09:43:25 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:43:25 np0005604215.localdomain podman[287741]: 2026-02-01 09:43:25.993925067 +0000 UTC m=+0.207232171 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:43:26 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:43:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.17208 ' entity='mgr.np0005604209.isqrps' 
Feb 01 09:43:28 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:43:27] ENGINE Error in 'start' listener <bound method Server.start of <cephadm.agent.HostData object at 0x7f708579a880>>
                                                           Traceback (most recent call last):
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 230, in publish
                                                               output.append(listener(*args, **kwargs))
                                                             File "/lib/python3.9/site-packages/cherrypy/_cpserver.py", line 180, in start
                                                               super(Server, self).start()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 184, in start
                                                               self.wait()
                                                             File "/lib/python3.9/site-packages/cherrypy/process/servers.py", line 260, in wait
                                                               portend.occupied(*self.bound_addr, timeout=Timeouts.occupied)
                                                             File "/lib/python3.9/site-packages/portend.py", line 162, in occupied
                                                               raise Timeout("Port {port} not bound on {host}.".format(**locals()))
                                                           portend.Timeout: Port 7150 not bound on 172.18.0.103.
Feb 01 09:43:28 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:43:27] ENGINE Shutting down due to error in start listener:
                                                           Traceback (most recent call last):
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 268, in start
                                                               self.publish('start')
                                                             File "/lib/python3.9/site-packages/cherrypy/process/wspbus.py", line 248, in publish
                                                               raise exc
                                                           cherrypy.process.wspbus.ChannelFailures: Timeout('Port 8765 not bound on 172.18.0.103.')
                                                           Timeout('Port 7150 not bound on 172.18.0.103.')
Feb 01 09:43:28 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:43:27] ENGINE Bus STOPPING
Feb 01 09:43:28 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:43:27] ENGINE HTTP Server cherrypy._cpwsgi_server.CPWSGIServer(('172.18.0.103', 8765)) already shut down
Feb 01 09:43:28 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:43:27] ENGINE HTTP Server cherrypy._cpwsgi_server.CPWSGIServer(('172.18.0.103', 7150)) already shut down
Feb 01 09:43:28 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:43:27] ENGINE Bus STOPPED
Feb 01 09:43:28 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:43:27] ENGINE Bus EXITING
Feb 01 09:43:28 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:43:27] ENGINE Bus EXITED
Feb 01 09:43:28 np0005604215.localdomain ceph-mon[278949]: Failed to run cephadm http server: Timeout('Port 8765 not bound on 172.18.0.103.')
                                                           Timeout('Port 7150 not bound on 172.18.0.103.')
Feb 01 09:43:29 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:43:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:43:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:43:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:43:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:43:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17780 "" "Go-http-client/1.1"
Feb 01 09:43:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:43:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:43:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:43:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:43:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:43:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:43:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:43:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:43:33 np0005604215.localdomain podman[287788]: 2026-02-01 09:43:33.856518725 +0000 UTC m=+0.072623719 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, release=1769056855, architecture=x86_64, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal)
Feb 01 09:43:33 np0005604215.localdomain podman[287788]: 2026-02-01 09:43:33.869703219 +0000 UTC m=+0.085808213 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.openshift.expose-services=, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, distribution-scope=public, version=9.7, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64)
Feb 01 09:43:33 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:43:33 np0005604215.localdomain podman[287789]: 2026-02-01 09:43:33.918964153 +0000 UTC m=+0.132480285 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:43:33 np0005604215.localdomain podman[287789]: 2026-02-01 09:43:33.95265815 +0000 UTC m=+0.166174262 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 01 09:43:33 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:43:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:34 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/683050622' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:43:34 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/683050622' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:43:37 np0005604215.localdomain ceph-mon[278949]: pgmap v3: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:43:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:37.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:43:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:37.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:43:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:37.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:43:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:37.115 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:43:39 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:39.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:43:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:39.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:43:39 np0005604215.localdomain ceph-mon[278949]: pgmap v4: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:43:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:40.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:43:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:40.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:43:40 np0005604215.localdomain ceph-mon[278949]: pgmap v5: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:43:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:41.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:43:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:43:41.761 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:43:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:43:41.761 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:43:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:43:41.761 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:43:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:43:41 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/1821239121' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:43:41 np0005604215.localdomain podman[287824]: 2026-02-01 09:43:41.865172604 +0000 UTC m=+0.081758165 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:43:41 np0005604215.localdomain podman[287824]: 2026-02-01 09:43:41.877471759 +0000 UTC m=+0.094057360 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 01 09:43:41 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:43:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:42.095 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.141 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.142 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.143 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.143 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.143 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:43:43 np0005604215.localdomain ceph-mon[278949]: pgmap v6: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:43:43 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/2047471593' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.608 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.815 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.817 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12408MB free_disk=0.0GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.817 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.818 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.880 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.880 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=0GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:43:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:43.916 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:43:44 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:44 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/352628822' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:43:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:44.349 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.433s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:43:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:44.354 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:43:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:44.415 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updated inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with generation 3 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 01 09:43:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:44.415 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 generation from 3 to 4 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 01 09:43:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:44.416 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:43:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:44.436 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:43:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:44.437 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.619s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:43:45 np0005604215.localdomain ceph-mon[278949]: pgmap v7: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:43:45 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/3819429004' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:43:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:45.433 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:43:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:45.453 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:43:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:43:45.453 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:43:46 np0005604215.localdomain ceph-mon[278949]: pgmap v8: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:43:47 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/2163163526' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:43:48 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/2827578341' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:43:48 np0005604215.localdomain ceph-mon[278949]: pgmap v9: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:43:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:43:48 np0005604215.localdomain systemd[1]: tmp-crun.8zd7tv.mount: Deactivated successfully.
Feb 01 09:43:48 np0005604215.localdomain podman[287887]: 2026-02-01 09:43:48.896057086 +0000 UTC m=+0.104999555 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:43:48 np0005604215.localdomain podman[287887]: 2026-02-01 09:43:48.907743952 +0000 UTC m=+0.116686441 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:43:48 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:43:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:51 np0005604215.localdomain ceph-mon[278949]: pgmap v10: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:43:53 np0005604215.localdomain ceph-mon[278949]: pgmap v11: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:43:54 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:55 np0005604215.localdomain ceph-mon[278949]: pgmap v12: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:43:56 np0005604215.localdomain ceph-mon[278949]: pgmap v13: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:43:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:43:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:43:56 np0005604215.localdomain systemd[1]: tmp-crun.cxkZIw.mount: Deactivated successfully.
Feb 01 09:43:56 np0005604215.localdomain podman[287911]: 2026-02-01 09:43:56.883490158 +0000 UTC m=+0.091386618 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller)
Feb 01 09:43:56 np0005604215.localdomain podman[287912]: 2026-02-01 09:43:56.952064938 +0000 UTC m=+0.157093408 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:43:56 np0005604215.localdomain podman[287912]: 2026-02-01 09:43:56.96264835 +0000 UTC m=+0.167676800 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:43:56 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:43:56 np0005604215.localdomain podman[287911]: 2026-02-01 09:43:56.984876728 +0000 UTC m=+0.192773198 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:43:57 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:43:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:43:59 np0005604215.localdomain ceph-mon[278949]: pgmap v14: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:44:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:44:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:44:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:44:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:44:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:44:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17780 "" "Go-http-client/1.1"
Feb 01 09:44:00 np0005604215.localdomain ceph-mon[278949]: pgmap v15: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:44:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:44:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:44:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:44:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:44:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:44:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:44:03 np0005604215.localdomain ceph-mon[278949]: pgmap v16: 177 pgs: 177 unknown; 0 B data, 0 B used, 0 B / 0 B avail
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e84 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e85 e85: 6 total, 6 up, 6 in
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/534665898' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: Activating manager daemon np0005604210.rirrtk
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: osdmap e85: 6 total, 6 up, 6 in
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: mgrmap e28: np0005604210.rirrtk(active, starting, since 0.0436345s), standbys: np0005604212.oynhpm, np0005604215.uhhqtv, np0005604211.cuflqz, np0005604213.caiaeh
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mgr metadata", "who": "np0005604210.rirrtk", "id": "np0005604210.rirrtk"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mgr metadata", "who": "np0005604211.cuflqz", "id": "np0005604211.cuflqz"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:44:04 np0005604215.localdomain sshd[286693]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:44:04 np0005604215.localdomain systemd[1]: session-66.scope: Deactivated successfully.
Feb 01 09:44:04 np0005604215.localdomain systemd[1]: session-66.scope: Consumed 6.039s CPU time.
Feb 01 09:44:04 np0005604215.localdomain systemd-logind[761]: Session 66 logged out. Waiting for processes to exit.
Feb 01 09:44:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:44:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:44:04 np0005604215.localdomain systemd-logind[761]: Removed session 66.
Feb 01 09:44:04 np0005604215.localdomain systemd[1]: tmp-crun.aZo2Sc.mount: Deactivated successfully.
Feb 01 09:44:04 np0005604215.localdomain podman[287960]: 2026-02-01 09:44:04.353266216 +0000 UTC m=+0.092618276 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 01 09:44:04 np0005604215.localdomain podman[287960]: 2026-02-01 09:44:04.385530468 +0000 UTC m=+0.124882478 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 01 09:44:04 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:44:04 np0005604215.localdomain podman[287959]: 2026-02-01 09:44:04.403061418 +0000 UTC m=+0.142780149 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-type=git, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, architecture=x86_64, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 01 09:44:04 np0005604215.localdomain podman[287959]: 2026-02-01 09:44:04.419868375 +0000 UTC m=+0.159587106 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-type=git, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., release=1769056855, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 01 09:44:04 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:44:04 np0005604215.localdomain sshd[287998]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:44:04 np0005604215.localdomain sshd[287998]: Accepted publickey for ceph-admin from 192.168.122.104 port 60502 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 09:44:04 np0005604215.localdomain systemd-logind[761]: New session 67 of user ceph-admin.
Feb 01 09:44:04 np0005604215.localdomain systemd[1]: Started Session 67 of User ceph-admin.
Feb 01 09:44:04 np0005604215.localdomain sshd[287998]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 09:44:04 np0005604215.localdomain sudo[288002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:44:04 np0005604215.localdomain sudo[288002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:04 np0005604215.localdomain sudo[288002]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:04 np0005604215.localdomain sudo[288020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:44:04 np0005604215.localdomain sudo[288020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:05 np0005604215.localdomain ceph-mon[278949]: Manager daemon np0005604210.rirrtk is now available
Feb 01 09:44:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604210.rirrtk/mirror_snapshot_schedule"} : dispatch
Feb 01 09:44:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604210.rirrtk/trash_purge_schedule"} : dispatch
Feb 01 09:44:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:05 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:05 np0005604215.localdomain systemd[1]: tmp-crun.YLkpy2.mount: Deactivated successfully.
Feb 01 09:44:05 np0005604215.localdomain podman[288111]: 2026-02-01 09:44:05.686459399 +0000 UTC m=+0.102393412 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, version=7)
Feb 01 09:44:05 np0005604215.localdomain podman[288111]: 2026-02-01 09:44:05.813029659 +0000 UTC m=+0.228963652 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, vendor=Red Hat, Inc., GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, release=1764794109, name=rhceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph)
Feb 01 09:44:06 np0005604215.localdomain ceph-mon[278949]: mgrmap e29: np0005604210.rirrtk(active, since 1.17522s), standbys: np0005604212.oynhpm, np0005604215.uhhqtv, np0005604211.cuflqz, np0005604213.caiaeh
Feb 01 09:44:06 np0005604215.localdomain ceph-mon[278949]: pgmap v5: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:44:06 np0005604215.localdomain ceph-mon[278949]: from='client.34408 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:44:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:06 np0005604215.localdomain sudo[288020]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:06 np0005604215.localdomain sudo[288226]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:44:06 np0005604215.localdomain sudo[288226]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:06 np0005604215.localdomain sudo[288226]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:06 np0005604215.localdomain sudo[288244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:44:06 np0005604215.localdomain sudo[288244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:44:05] ENGINE Bus STARTING
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: from='client.34357 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:44:05] ENGINE Serving on https://172.18.0.104:7150
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:44:05] ENGINE Client ('172.18.0.104', 39488) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:44:05] ENGINE Serving on http://172.18.0.104:8765
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:44:05] ENGINE Bus STARTED
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: pgmap v6: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: Cluster is now healthy
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:07 np0005604215.localdomain sudo[288244]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:07 np0005604215.localdomain sudo[288295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:44:07 np0005604215.localdomain sudo[288295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:07 np0005604215.localdomain sudo[288295]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:07 np0005604215.localdomain sudo[288313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:44:07 np0005604215.localdomain sudo[288313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:08 np0005604215.localdomain sudo[288313]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:08 np0005604215.localdomain sudo[288351]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:44:08 np0005604215.localdomain sudo[288351]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:08 np0005604215.localdomain sudo[288351]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:08 np0005604215.localdomain sudo[288369]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:44:08 np0005604215.localdomain sudo[288369]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:08 np0005604215.localdomain sudo[288369]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:08 np0005604215.localdomain sudo[288387]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:44:08 np0005604215.localdomain sudo[288387]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:08 np0005604215.localdomain sudo[288387]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:08 np0005604215.localdomain sudo[288405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:44:08 np0005604215.localdomain sudo[288405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:08 np0005604215.localdomain sudo[288405]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: mgrmap e30: np0005604210.rirrtk(active, since 3s), standbys: np0005604212.oynhpm, np0005604215.uhhqtv, np0005604211.cuflqz, np0005604213.caiaeh
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd/host:np0005604210", "name": "osd_memory_target"} : dispatch
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: pgmap v7: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/etc/ceph/ceph.conf
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:44:08 np0005604215.localdomain ceph-mon[278949]: from='client.27076 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604210", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:44:08 np0005604215.localdomain sudo[288423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:44:08 np0005604215.localdomain sudo[288423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:08 np0005604215.localdomain sudo[288423]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:08 np0005604215.localdomain sudo[288457]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:44:08 np0005604215.localdomain sudo[288457]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:08 np0005604215.localdomain sudo[288457]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:08 np0005604215.localdomain sudo[288475]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:44:08 np0005604215.localdomain sudo[288475]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:08 np0005604215.localdomain sudo[288475]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:08 np0005604215.localdomain sudo[288493]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:44:08 np0005604215.localdomain sudo[288493]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:08 np0005604215.localdomain sudo[288493]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain sudo[288511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:44:09 np0005604215.localdomain sudo[288511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288511]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain sudo[288529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:44:09 np0005604215.localdomain sudo[288529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288529]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:44:09 np0005604215.localdomain sudo[288547]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:44:09 np0005604215.localdomain sudo[288547]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288547]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain sudo[288565]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:44:09 np0005604215.localdomain sudo[288565]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288565]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain sudo[288583]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:44:09 np0005604215.localdomain sudo[288583]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288583]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain sudo[288617]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:44:09 np0005604215.localdomain sudo[288617]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288617]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain sudo[288635]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:44:09 np0005604215.localdomain sudo[288635]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288635]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain sudo[288653]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:09 np0005604215.localdomain sudo[288653]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288653]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:09 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:09 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:09 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:09 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:09 np0005604215.localdomain ceph-mon[278949]: Standby manager daemon np0005604209.isqrps started
Feb 01 09:44:09 np0005604215.localdomain sudo[288671]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:44:09 np0005604215.localdomain sudo[288671]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288671]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain sudo[288689]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:44:09 np0005604215.localdomain sudo[288689]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288689]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain sudo[288707]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:44:09 np0005604215.localdomain sudo[288707]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288707]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55c2f70f8f20 mon_map magic: 0 from mon.2 v2:172.18.0.108:3300/0
Feb 01 09:44:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@2(peon) e10  my rank is now 1 (was 2)
Feb 01 09:44:09 np0005604215.localdomain ceph-mgr[278126]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Feb 01 09:44:09 np0005604215.localdomain ceph-mgr[278126]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Feb 01 09:44:09 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55c30080e000 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Feb 01 09:44:09 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:44:09 np0005604215.localdomain ceph-mon[278949]: paxos.1).electionLogic(44) init, last seen epoch 44
Feb 01 09:44:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:44:09 np0005604215.localdomain sudo[288725]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:44:09 np0005604215.localdomain sudo[288725]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288725]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:09 np0005604215.localdomain sudo[288743]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:44:09 np0005604215.localdomain sudo[288743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:09 np0005604215.localdomain sudo[288743]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:10 np0005604215.localdomain sudo[288777]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:44:10 np0005604215.localdomain sudo[288777]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:10 np0005604215.localdomain sudo[288777]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:10 np0005604215.localdomain sudo[288795]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:44:10 np0005604215.localdomain sudo[288795]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:10 np0005604215.localdomain sudo[288795]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:10 np0005604215.localdomain sudo[288813]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 01 09:44:10 np0005604215.localdomain sudo[288813]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:10 np0005604215.localdomain sudo[288813]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:10 np0005604215.localdomain sudo[288831]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:44:10 np0005604215.localdomain sudo[288831]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:10 np0005604215.localdomain sudo[288831]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:10 np0005604215.localdomain sudo[288849]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:44:10 np0005604215.localdomain sudo[288849]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:10 np0005604215.localdomain sudo[288849]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:10 np0005604215.localdomain sudo[288867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:44:10 np0005604215.localdomain sudo[288867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:10 np0005604215.localdomain sudo[288867]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:10 np0005604215.localdomain sudo[288885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:44:10 np0005604215.localdomain sudo[288885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:10 np0005604215.localdomain sudo[288885]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:10 np0005604215.localdomain sudo[288903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:44:10 np0005604215.localdomain sudo[288903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:10 np0005604215.localdomain sudo[288903]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:10 np0005604215.localdomain sudo[288937]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:44:10 np0005604215.localdomain sudo[288937]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:10 np0005604215.localdomain sudo[288937]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:10 np0005604215.localdomain sudo[288955]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:44:10 np0005604215.localdomain sudo[288955]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:10 np0005604215.localdomain sudo[288955]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:10 np0005604215.localdomain sudo[288973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:44:10 np0005604215.localdomain sudo[288973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:10 np0005604215.localdomain sudo[288973]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:12 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:44:12 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e10 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:44:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:44:12 np0005604215.localdomain podman[288991]: 2026-02-01 09:44:12.868510492 +0000 UTC m=+0.083978485 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute)
Feb 01 09:44:12 np0005604215.localdomain podman[288991]: 2026-02-01 09:44:12.90099585 +0000 UTC m=+0.116463833 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:44:12 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: from='client.34372 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005604210"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Remove daemons mon.np0005604210
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Safe to remove mon.np0005604210: new quorum should be ['np0005604211', 'np0005604215', 'np0005604213', 'np0005604212'] (from ['np0005604211', 'np0005604215', 'np0005604213', 'np0005604212'])
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Removing monitor np0005604210 from monmap...
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Removing daemon mon.np0005604210 from np0005604210.localdomain -- ports []
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: mon.np0005604213 calling monitor election
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: mon.np0005604212 calling monitor election
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: pgmap v8: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Updating np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 is new leader, mons np0005604211,np0005604215,np0005604213,np0005604212 in quorum (ranks 0,1,2,3)
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: monmap epoch 10
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:44:09.863320+0000
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.107:3300/0,v1:172.18.0.107:6789/0] mon.np0005604213
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: 3: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: osdmap e85: 6 total, 6 up, 6 in
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: mgrmap e31: np0005604210.rirrtk(active, since 8s), standbys: np0005604212.oynhpm, np0005604215.uhhqtv, np0005604211.cuflqz, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:13 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: pgmap v9: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: pgmap v10: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 0 B/s wr, 13 op/s
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: Deploying daemon mon.np0005604210 on np0005604210.localdomain
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='client.26864 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604210.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:14 np0005604215.localdomain ceph-mon[278949]: Removed label mon from host np0005604210.localdomain
Feb 01 09:44:15 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 01 09:44:15 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 01 09:44:15 np0005604215.localdomain ceph-mon[278949]: from='client.34385 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604210.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:44:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:15 np0005604215.localdomain ceph-mon[278949]: Removed label mgr from host np0005604210.localdomain
Feb 01 09:44:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e10  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:44:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e10  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:44:16 np0005604215.localdomain ceph-mon[278949]: pgmap v11: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:44:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:44:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:16 np0005604215.localdomain sudo[289010]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:44:16 np0005604215.localdomain sudo[289010]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:17 np0005604215.localdomain sudo[289010]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:17 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e10  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:44:17 np0005604215.localdomain ceph-mon[278949]: from='client.34482 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604210.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:44:17 np0005604215.localdomain ceph-mon[278949]: Removed label _admin from host np0005604210.localdomain
Feb 01 09:44:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:44:18 np0005604215.localdomain sudo[289028]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:44:18 np0005604215.localdomain sudo[289028]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:18 np0005604215.localdomain sudo[289028]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:18 np0005604215.localdomain sudo[289046]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:44:18 np0005604215.localdomain sudo[289046]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:18 np0005604215.localdomain sudo[289046]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:18 np0005604215.localdomain sudo[289064]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:44:18 np0005604215.localdomain sudo[289064]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:18 np0005604215.localdomain sudo[289064]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:18 np0005604215.localdomain sudo[289082]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:44:18 np0005604215.localdomain sudo[289082]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:18 np0005604215.localdomain sudo[289082]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:18 np0005604215.localdomain ceph-mon[278949]: pgmap v12: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:44:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:44:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:44:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:18 np0005604215.localdomain sudo[289100]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:44:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:44:18 np0005604215.localdomain sudo[289100]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289100]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain systemd[1]: tmp-crun.x2kI7g.mount: Deactivated successfully.
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e10  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:44:19 np0005604215.localdomain podman[289117]: 2026-02-01 09:44:19.072911063 +0000 UTC m=+0.068491230 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:44:19 np0005604215.localdomain podman[289117]: 2026-02-01 09:44:19.107130886 +0000 UTC m=+0.102711003 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:44:19 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:44:19 np0005604215.localdomain sudo[289157]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:44:19 np0005604215.localdomain sudo[289157]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289157]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain sudo[289175]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:44:19 np0005604215.localdomain sudo[289175]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289175]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain sudo[289193]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:44:19 np0005604215.localdomain sudo[289193]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289193]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain sudo[289211]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:44:19 np0005604215.localdomain sudo[289211]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289211]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain sudo[289229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:44:19 np0005604215.localdomain sudo[289229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289229]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain sudo[289247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:44:19 np0005604215.localdomain sudo[289247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289247]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain sudo[289265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:44:19 np0005604215.localdomain sudo[289265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289265]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain sudo[289283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:44:19 np0005604215.localdomain sudo[289283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289283]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain sudo[289317]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:44:19 np0005604215.localdomain sudo[289317]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289317]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain sudo[289335]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:44:19 np0005604215.localdomain sudo[289335]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289335]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain sudo[289353]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:19 np0005604215.localdomain sudo[289353]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:19 np0005604215.localdomain sudo[289353]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: Removing np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: Removing np0005604210.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: Removing np0005604210.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: pgmap v13: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon ok-to-stop", "ids": ["np0005604210"]} : dispatch
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "quorum_status"} : dispatch
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: Safe to remove mon.np0005604210: not in monmap (['np0005604211', 'np0005604215', 'np0005604213', 'np0005604212'])
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: Removing monitor np0005604210 from monmap...
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon rm", "name": "np0005604210"} : dispatch
Feb 01 09:44:20 np0005604215.localdomain ceph-mon[278949]: Removing daemon mon.np0005604210 from np0005604210.localdomain -- ports []
Feb 01 09:44:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e10  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:44:21 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e10  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:44:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mon metadata", "id": "np0005604210"} : dispatch
Feb 01 09:44:22 np0005604215.localdomain sudo[289371]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:44:22 np0005604215.localdomain sudo[289371]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:22 np0005604215.localdomain sudo[289371]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:23 np0005604215.localdomain ceph-mon[278949]: pgmap v14: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:44:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:44:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:44:24 np0005604215.localdomain sudo[289389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:44:24 np0005604215.localdomain sudo[289389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:44:24 np0005604215.localdomain sudo[289389]: pam_unix(sudo:session): session closed for user root
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: pgmap v15: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604210.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:44:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:26 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604210 (monmap changed)...
Feb 01 09:44:26 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604210 on np0005604210.localdomain
Feb 01 09:44:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604210.rirrtk", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:44:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:44:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:27 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604210.rirrtk (monmap changed)...
Feb 01 09:44:27 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604210.rirrtk on np0005604210.localdomain
Feb 01 09:44:27 np0005604215.localdomain ceph-mon[278949]: pgmap v16: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:44:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:44:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:44:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:44:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:44:27 np0005604215.localdomain podman[289407]: 2026-02-01 09:44:27.87042076 +0000 UTC m=+0.082894791 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Feb 01 09:44:27 np0005604215.localdomain podman[289408]: 2026-02-01 09:44:27.917415494 +0000 UTC m=+0.125991852 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:44:27 np0005604215.localdomain podman[289408]: 2026-02-01 09:44:27.929583116 +0000 UTC m=+0.138159454 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:44:27 np0005604215.localdomain podman[289407]: 2026-02-01 09:44:27.929828453 +0000 UTC m=+0.142302484 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:44:27 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:44:27 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604211 (monmap changed)...
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:44:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: pgmap v17: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: from='client.26872 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005604210.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: Added label _no_schedule to host np0005604210.localdomain
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005604210.localdomain
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:44:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:44:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:44:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:44:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:44:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17787 "" "Go-http-client/1.1"
Feb 01 09:44:30 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 01 09:44:30 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 01 09:44:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:44:30 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:31 np0005604215.localdomain ceph-mon[278949]: from='client.34494 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005604210.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:44:31 np0005604215.localdomain ceph-mon[278949]: pgmap v18: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:44:31 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.1 (monmap changed)...
Feb 01 09:44:31 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:44:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain"} : dispatch
Feb 01 09:44:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain"}]': finished
Feb 01 09:44:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:44:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:44:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:44:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:44:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:44:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:44:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:44:32 np0005604215.localdomain ceph-mon[278949]: from='client.34498 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005604210.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:44:32 np0005604215.localdomain ceph-mon[278949]: Removed host np0005604210.localdomain
Feb 01 09:44:32 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.4 (monmap changed)...
Feb 01 09:44:32 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:44:32 np0005604215.localdomain ceph-mon[278949]: pgmap v19: 177 pgs: 177 active+clean; 104 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:44:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' 
Feb 01 09:44:32 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 01 09:44:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:44:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.24104 172.18.0.104:0/238177948' entity='mgr.np0005604210.rirrtk' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:44:32 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 01 09:44:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:44:34 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/2767003253' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:44:34 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/2767003253' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:44:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:44:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:44:34 np0005604215.localdomain podman[289457]: 2026-02-01 09:44:34.870448324 +0000 UTC m=+0.080400122 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9/ubi-minimal, distribution-scope=public, vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1769056855, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc.)
Feb 01 09:44:34 np0005604215.localdomain podman[289457]: 2026-02-01 09:44:34.883373299 +0000 UTC m=+0.093325087 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, architecture=x86_64, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc.)
Feb 01 09:44:34 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:44:34 np0005604215.localdomain systemd[1]: tmp-crun.vdAg7J.mount: Deactivated successfully.
Feb 01 09:44:34 np0005604215.localdomain podman[289458]: 2026-02-01 09:44:34.928890077 +0000 UTC m=+0.134749186 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 01 09:44:34 np0005604215.localdomain podman[289458]: 2026-02-01 09:44:34.963787422 +0000 UTC m=+0.169646521 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 01 09:44:34 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:44:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:39.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:44:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:39.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:44:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:39.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:44:39 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:44:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:39.120 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:44:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:41.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:44:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:41.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:44:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:41.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:44:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:44:41.762 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:44:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:44:41.762 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:44:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:44:41.763 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:44:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:42.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:44:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:42.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:44:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:42.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:44:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:44:43 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/3430034347' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:44:43 np0005604215.localdomain podman[289495]: 2026-02-01 09:44:43.865835991 +0000 UTC m=+0.082096195 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:44:43 np0005604215.localdomain podman[289495]: 2026-02-01 09:44:43.882876785 +0000 UTC m=+0.099137009 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Feb 01 09:44:43 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:44:44 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.131 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.132 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.132 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.132 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.133 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:44:44 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:44:44 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2305914736' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.578 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.758 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.760 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12380MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.760 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.761 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:44:44 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/2305914736' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:44:44 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/1461207862' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.843 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.843 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:44:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:44.868 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:44:45 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e10 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:44:45 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4166851382' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:44:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:45.304 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.436s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:44:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:45.310 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:44:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:45.375 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updated inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 01 09:44:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:45.375 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 01 09:44:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:45.376 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:44:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:45.406 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:44:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:45.407 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.646s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:44:45 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/4166851382' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:44:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:44:46.407 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:44:46 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/216114197' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:44:48 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/2102676193' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:44:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:44:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:44:49 np0005604215.localdomain podman[289558]: 2026-02-01 09:44:49.860244426 +0000 UTC m=+0.076059367 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:44:49 np0005604215.localdomain podman[289558]: 2026-02-01 09:44:49.893537359 +0000 UTC m=+0.109352310 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:44:49 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:44:54 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:44:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:44:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:44:58 np0005604215.localdomain podman[289582]: 2026-02-01 09:44:58.863384153 +0000 UTC m=+0.075464318 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:44:58 np0005604215.localdomain podman[289582]: 2026-02-01 09:44:58.900665743 +0000 UTC m=+0.112745908 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:44:58 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:44:58 np0005604215.localdomain podman[289581]: 2026-02-01 09:44:58.915391894 +0000 UTC m=+0.128776200 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Feb 01 09:44:58 np0005604215.localdomain podman[289581]: 2026-02-01 09:44:58.97580943 +0000 UTC m=+0.189193756 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 09:44:58 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:44:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:45:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:45:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:45:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:45:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:45:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:45:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17780 "" "Go-http-client/1.1"
Feb 01 09:45:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:45:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:45:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:45:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:45:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:45:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:45:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:45:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e85 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:45:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:45:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:45:05 np0005604215.localdomain podman[289629]: 2026-02-01 09:45:05.866539165 +0000 UTC m=+0.077494641 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal, managed_by=edpm_ansible, version=9.7, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 01 09:45:05 np0005604215.localdomain podman[289629]: 2026-02-01 09:45:05.880622096 +0000 UTC m=+0.091577562 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.7, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, distribution-scope=public, name=ubi9/ubi-minimal, vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck 
openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z)
Feb 01 09:45:05 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:45:05 np0005604215.localdomain systemd[1]: tmp-crun.W84p8b.mount: Deactivated successfully.
Feb 01 09:45:05 np0005604215.localdomain podman[289630]: 2026-02-01 09:45:05.926502156 +0000 UTC m=+0.134064516 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:45:05 np0005604215.localdomain podman[289630]: 2026-02-01 09:45:05.956228417 +0000 UTC m=+0.163790797 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 01 09:45:05 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:45:07 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e86 e86: 6 total, 6 up, 6 in
Feb 01 09:45:07 np0005604215.localdomain ceph-mon[278949]: Activating manager daemon np0005604212.oynhpm
Feb 01 09:45:07 np0005604215.localdomain ceph-mon[278949]: Manager daemon np0005604210.rirrtk is unresponsive, replacing it with standby daemon np0005604212.oynhpm
Feb 01 09:45:07 np0005604215.localdomain ceph-mon[278949]: osdmap e86: 6 total, 6 up, 6 in
Feb 01 09:45:07 np0005604215.localdomain ceph-mon[278949]: mgrmap e32: np0005604212.oynhpm(active, starting, since 0.0465605s), standbys: np0005604215.uhhqtv, np0005604211.cuflqz, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:45:07 np0005604215.localdomain sshd[289666]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:45:07 np0005604215.localdomain sshd[289666]: Accepted publickey for ceph-admin from 192.168.122.106 port 49628 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 09:45:07 np0005604215.localdomain systemd-logind[761]: New session 68 of user ceph-admin.
Feb 01 09:45:07 np0005604215.localdomain systemd[1]: Started Session 68 of User ceph-admin.
Feb 01 09:45:07 np0005604215.localdomain sshd[289666]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 09:45:07 np0005604215.localdomain sudo[289670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:07 np0005604215.localdomain sudo[289670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:07 np0005604215.localdomain sudo[289670]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:07 np0005604215.localdomain sudo[289688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 09:45:07 np0005604215.localdomain sudo[289688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mgr metadata", "who": "np0005604211.cuflqz", "id": "np0005604211.cuflqz"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: Manager daemon np0005604212.oynhpm is now available
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: removing stray HostCache host record np0005604210.localdomain.devices.0
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain.devices.0"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain.devices.0"}]': finished
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain.devices.0"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604210.localdomain.devices.0"}]': finished
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604212.oynhpm/mirror_snapshot_schedule"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604212.oynhpm/trash_purge_schedule"} : dispatch
Feb 01 09:45:08 np0005604215.localdomain sudo[289688]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.530914) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108530957, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 2394, "num_deletes": 256, "total_data_size": 8559110, "memory_usage": 9439008, "flush_reason": "Manual Compaction"}
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Feb 01 09:45:08 np0005604215.localdomain sudo[289726]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:08 np0005604215.localdomain sudo[289726]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:08 np0005604215.localdomain sudo[289726]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108565914, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 5261577, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 15981, "largest_seqno": 18370, "table_properties": {"data_size": 5252204, "index_size": 5558, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2757, "raw_key_size": 24676, "raw_average_key_size": 22, "raw_value_size": 5231466, "raw_average_value_size": 4764, "num_data_blocks": 238, "num_entries": 1098, "num_filter_entries": 1098, "num_deletions": 254, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939002, "oldest_key_time": 1769939002, "file_creation_time": 1769939108, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 35055 microseconds, and 10883 cpu microseconds.
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.565967) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 5261577 bytes OK
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.565992) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.567665) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.567689) EVENT_LOG_v1 {"time_micros": 1769939108567681, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.567711) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 8547676, prev total WAL file size 8547676, number of live WAL files 2.
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.569492) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130323931' seq:72057594037927935, type:22 .. '7061786F73003130353433' seq:0, type:0; will stop at (end)
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(5138KB)], [21(16MB)]
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108569573, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 22702621, "oldest_snapshot_seqno": -1}
Feb 01 09:45:08 np0005604215.localdomain sudo[289744]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:45:08 np0005604215.localdomain sudo[289744]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 10517 keys, 19394522 bytes, temperature: kUnknown
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108701448, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 19394522, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19331318, "index_size": 35839, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26309, "raw_key_size": 280607, "raw_average_key_size": 26, "raw_value_size": 19148558, "raw_average_value_size": 1820, "num_data_blocks": 1377, "num_entries": 10517, "num_filter_entries": 10517, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938864, "oldest_key_time": 0, "file_creation_time": 1769939108, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.701874) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 19394522 bytes
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.703841) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.0 rd, 146.9 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(5.0, 16.6 +0.0 blob) out(18.5 +0.0 blob), read-write-amplify(8.0) write-amplify(3.7) OK, records in: 11066, records dropped: 549 output_compression: NoCompression
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.703872) EVENT_LOG_v1 {"time_micros": 1769939108703859, "job": 10, "event": "compaction_finished", "compaction_time_micros": 132027, "compaction_time_cpu_micros": 48906, "output_level": 6, "num_output_files": 1, "total_output_size": 19394522, "num_input_records": 11066, "num_output_records": 10517, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108704825, "job": 10, "event": "table_file_deletion", "file_number": 23}
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939108707719, "job": 10, "event": "table_file_deletion", "file_number": 21}
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.569384) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.707761) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.707766) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.707768) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.707770) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:08 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:08.707772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:45:09 np0005604215.localdomain systemd[1]: tmp-crun.vKjP1t.mount: Deactivated successfully.
Feb 01 09:45:09 np0005604215.localdomain podman[289830]: 2026-02-01 09:45:09.321217584 +0000 UTC m=+0.094006929 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, build-date=2025-12-08T17:28:53Z, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, distribution-scope=public, architecture=x86_64, vcs-type=git, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, name=rhceph)
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: mgrmap e33: np0005604212.oynhpm(active, since 1.14543s), standbys: np0005604215.uhhqtv, np0005604211.cuflqz, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: from='client.34409 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: Saving service mon spec with placement label:mon
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:09 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:45:08] ENGINE Bus STARTING
Feb 01 09:45:09 np0005604215.localdomain podman[289830]: 2026-02-01 09:45:09.445950737 +0000 UTC m=+0.218740122 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, com.redhat.component=rhceph-container, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, RELEASE=main, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:45:10 np0005604215.localdomain sudo[289744]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:10 np0005604215.localdomain sudo[289952]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:10 np0005604215.localdomain sudo[289952]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:10 np0005604215.localdomain sudo[289952]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:10 np0005604215.localdomain sudo[289970]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:45:10 np0005604215.localdomain sudo[289970]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:45:08] ENGINE Serving on https://172.18.0.106:7150
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:45:08] ENGINE Client ('172.18.0.106', 47908) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:45:09] ENGINE Serving on http://172.18.0.106:8765
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:45:09] ENGINE Bus STARTED
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: Cluster is now healthy
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: mgrmap e34: np0005604212.oynhpm(active, since 2s), standbys: np0005604215.uhhqtv, np0005604211.cuflqz, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:10 np0005604215.localdomain sudo[289970]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:10 np0005604215.localdomain sudo[290020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:10 np0005604215.localdomain sudo[290020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:11 np0005604215.localdomain sudo[290020]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:11 np0005604215.localdomain sudo[290038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:45:11 np0005604215.localdomain sudo[290038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:11 np0005604215.localdomain sudo[290038]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:11 np0005604215.localdomain ceph-mon[278949]: from='client.34530 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604213", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:45:11 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55c30080e160 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Feb 01 09:45:11 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:45:11 np0005604215.localdomain ceph-mon[278949]: paxos.1).electionLogic(46) init, last seen epoch 46
Feb 01 09:45:11 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:45:11 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:45:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:45:14 np0005604215.localdomain podman[290076]: 2026-02-01 09:45:14.874324139 +0000 UTC m=+0.080673032 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true)
Feb 01 09:45:14 np0005604215.localdomain podman[290076]: 2026-02-01 09:45:14.91265332 +0000 UTC m=+0.119002173 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 01 09:45:14 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: paxos.1).electionLogic(49) init, last seen epoch 49, mid-election, bumping
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: from='client.34445 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005604213"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: Remove daemons mon.np0005604213
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: Safe to remove mon.np0005604213: new quorum should be ['np0005604211', 'np0005604215', 'np0005604212'] (from ['np0005604211', 'np0005604215', 'np0005604212'])
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: Removing monitor np0005604213 from monmap...
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: Removing daemon mon.np0005604213 from np0005604213.localdomain -- ports []
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604212 calling monitor election
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 is new leader, mons np0005604211,np0005604215 in quorum (ranks 0,1)
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 is new leader, mons np0005604211,np0005604215,np0005604212 in quorum (ranks 0,1,2)
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: monmap epoch 11
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:45:11.538842+0000
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: osdmap e86: 6 total, 6 up, 6 in
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: mgrmap e35: np0005604212.oynhpm(active, since 9s), standbys: np0005604215.uhhqtv, np0005604211.cuflqz, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:16 np0005604215.localdomain sudo[290095]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:45:16 np0005604215.localdomain sudo[290095]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:16 np0005604215.localdomain sudo[290095]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:16 np0005604215.localdomain sudo[290113]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:45:16 np0005604215.localdomain sudo[290113]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:16 np0005604215.localdomain sudo[290113]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:16 np0005604215.localdomain sudo[290131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:45:16 np0005604215.localdomain sudo[290131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:16 np0005604215.localdomain sudo[290131]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain sudo[290149]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:17 np0005604215.localdomain sudo[290149]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290149]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain sudo[290167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:45:17 np0005604215.localdomain sudo[290167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290167]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain sudo[290201]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:45:17 np0005604215.localdomain sudo[290201]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290201]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain sudo[290219]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:45:17 np0005604215.localdomain sudo[290219]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290219]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain sudo[290237]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:45:17 np0005604215.localdomain sudo[290237]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290237]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain sudo[290255]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:45:17 np0005604215.localdomain sudo[290255]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290255]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain sudo[290273]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:45:17 np0005604215.localdomain sudo[290273]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290273]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain sudo[290291]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:45:17 np0005604215.localdomain sudo[290291]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290291]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain sudo[290309]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:17 np0005604215.localdomain sudo[290309]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290309]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604213.localdomain to  1348M
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:17 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:45:17 np0005604215.localdomain sudo[290327]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:45:17 np0005604215.localdomain sudo[290327]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290327]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain sudo[290361]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:45:17 np0005604215.localdomain sudo[290361]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290361]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:17 np0005604215.localdomain sudo[290379]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:45:17 np0005604215.localdomain sudo[290379]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:17 np0005604215.localdomain sudo[290379]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290397]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:18 np0005604215.localdomain sudo[290397]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290397]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290415]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:45:18 np0005604215.localdomain sudo[290415]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290415]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290433]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:45:18 np0005604215.localdomain sudo[290433]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290433]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:45:18 np0005604215.localdomain sudo[290451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290451]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290469]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:18 np0005604215.localdomain sudo[290469]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290469]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290487]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:45:18 np0005604215.localdomain sudo[290487]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290487]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290521]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:45:18 np0005604215.localdomain sudo[290521]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290521]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290539]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:45:18 np0005604215.localdomain sudo[290539]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290539]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain ceph-mon[278949]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 23 KiB/s rd, 0 B/s wr, 13 op/s
Feb 01 09:45:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:18 np0005604215.localdomain sudo[290557]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:18 np0005604215.localdomain sudo[290557]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290557]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290575]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:45:18 np0005604215.localdomain sudo[290575]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290575]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290593]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:45:18 np0005604215.localdomain sudo[290593]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290593]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290611]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:45:18 np0005604215.localdomain sudo[290611]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:18 np0005604215.localdomain sudo[290611]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:18 np0005604215.localdomain sudo[290629]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:18 np0005604215.localdomain sudo[290629]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:19 np0005604215.localdomain sudo[290629]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:19 np0005604215.localdomain sudo[290647]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:45:19 np0005604215.localdomain sudo[290647]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:19 np0005604215.localdomain sudo[290647]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:45:19 np0005604215.localdomain sudo[290681]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:45:19 np0005604215.localdomain sudo[290681]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:19 np0005604215.localdomain sudo[290681]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:19 np0005604215.localdomain sudo[290699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:45:19 np0005604215.localdomain sudo[290699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:19 np0005604215.localdomain sudo[290699]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:19 np0005604215.localdomain sudo[290717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:19 np0005604215.localdomain sudo[290717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:19 np0005604215.localdomain sudo[290717]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:45:19 np0005604215.localdomain sudo[290735]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:45:19 np0005604215.localdomain sudo[290735]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:19 np0005604215.localdomain sudo[290735]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:45:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:45:20 np0005604215.localdomain podman[290753]: 2026-02-01 09:45:20.867753462 +0000 UTC m=+0.080295208 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:45:20 np0005604215.localdomain podman[290753]: 2026-02-01 09:45:20.903483213 +0000 UTC m=+0.116024939 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:45:20 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:45:21 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 01 09:45:21 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 01 09:45:21 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:21 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:21 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:45:21 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:22 np0005604215.localdomain ceph-mon[278949]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Feb 01 09:45:22 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 01 09:45:22 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 01 09:45:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:45:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:23 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.1 (monmap changed)...
Feb 01 09:45:23 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:45:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:45:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:45:24 np0005604215.localdomain ceph-mon[278949]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail; 20 KiB/s rd, 0 B/s wr, 11 op/s
Feb 01 09:45:24 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.4 (monmap changed)...
Feb 01 09:45:24 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:45:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:45:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:45:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:25 np0005604215.localdomain ceph-mon[278949]: from='client.34450 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005604213.localdomain:172.18.0.104", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:45:25 np0005604215.localdomain ceph-mon[278949]: Deploying daemon mon.np0005604213 on np0005604213.localdomain
Feb 01 09:45:25 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 01 09:45:25 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 01 09:45:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:45:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:45:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:26 np0005604215.localdomain ceph-mon[278949]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:26 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 01 09:45:26 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:45:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:45:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:27 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:27 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:27 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 01 09:45:27 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 01 09:45:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:28 np0005604215.localdomain ceph-mon[278949]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:28 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.0 (monmap changed)...
Feb 01 09:45:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 01 09:45:28 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:45:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:29 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:45:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:45:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:45:29 np0005604215.localdomain podman[290776]: 2026-02-01 09:45:29.87219101 +0000 UTC m=+0.083365555 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:45:29 np0005604215.localdomain podman[290777]: 2026-02-01 09:45:29.923990055 +0000 UTC m=+0.132130585 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:45:29 np0005604215.localdomain podman[290776]: 2026-02-01 09:45:29.935735063 +0000 UTC m=+0.146909568 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:45:29 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:45:29 np0005604215.localdomain podman[290777]: 2026-02-01 09:45:29.95859381 +0000 UTC m=+0.166734300 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:45:29 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:45:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:29 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.3 (monmap changed)...
Feb 01 09:45:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 01 09:45:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:29 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:45:29 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:29 np0005604215.localdomain ceph-mon[278949]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:45:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:45:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:45:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:45:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:45:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17775 "" "Go-http-client/1.1"
Feb 01 09:45:30 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:31 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:45:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:45:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:45:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:45:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:45:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:45:31 np0005604215.localdomain sudo[290824]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:31 np0005604215.localdomain sudo[290824]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:31 np0005604215.localdomain sudo[290824]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:32 np0005604215.localdomain sudo[290842]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:32 np0005604215.localdomain sudo[290842]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:32 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 01 09:45:32 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:45:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:32 np0005604215.localdomain ceph-mon[278949]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:32 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 01 09:45:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:32 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 01 09:45:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:45:32 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:32 np0005604215.localdomain podman[290878]: 
Feb 01 09:45:32 np0005604215.localdomain podman[290878]: 2026-02-01 09:45:32.469073147 +0000 UTC m=+0.074945141 container create 51ed8bd914c675be908c916b22d4ee95f4888429ac532c094cfb66bf6bf4d9f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_faraday, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:45:32 np0005604215.localdomain systemd[1]: Started libpod-conmon-51ed8bd914c675be908c916b22d4ee95f4888429ac532c094cfb66bf6bf4d9f7.scope.
Feb 01 09:45:32 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:45:32 np0005604215.localdomain podman[290878]: 2026-02-01 09:45:32.438655034 +0000 UTC m=+0.044527058 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:45:32 np0005604215.localdomain podman[290878]: 2026-02-01 09:45:32.54089468 +0000 UTC m=+0.146766674 container init 51ed8bd914c675be908c916b22d4ee95f4888429ac532c094cfb66bf6bf4d9f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_faraday, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, name=rhceph, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, version=7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:45:32 np0005604215.localdomain podman[290878]: 2026-02-01 09:45:32.550040416 +0000 UTC m=+0.155912410 container start 51ed8bd914c675be908c916b22d4ee95f4888429ac532c094cfb66bf6bf4d9f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_faraday, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, release=1764794109, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, distribution-scope=public, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:45:32 np0005604215.localdomain podman[290878]: 2026-02-01 09:45:32.550353726 +0000 UTC m=+0.156225730 container attach 51ed8bd914c675be908c916b22d4ee95f4888429ac532c094cfb66bf6bf4d9f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_faraday, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, release=1764794109, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, build-date=2025-12-08T17:28:53Z, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:45:32 np0005604215.localdomain vigilant_faraday[290893]: 167 167
Feb 01 09:45:32 np0005604215.localdomain systemd[1]: libpod-51ed8bd914c675be908c916b22d4ee95f4888429ac532c094cfb66bf6bf4d9f7.scope: Deactivated successfully.
Feb 01 09:45:32 np0005604215.localdomain podman[290878]: 2026-02-01 09:45:32.554205958 +0000 UTC m=+0.160077982 container died 51ed8bd914c675be908c916b22d4ee95f4888429ac532c094cfb66bf6bf4d9f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_faraday, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, release=1764794109, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-type=git, name=rhceph, version=7, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:45:32 np0005604215.localdomain podman[290898]: 2026-02-01 09:45:32.650077134 +0000 UTC m=+0.083879462 container remove 51ed8bd914c675be908c916b22d4ee95f4888429ac532c094cfb66bf6bf4d9f7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=vigilant_faraday, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, GIT_BRANCH=main, vendor=Red Hat, Inc., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.expose-services=, name=rhceph, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:45:32 np0005604215.localdomain systemd[1]: libpod-conmon-51ed8bd914c675be908c916b22d4ee95f4888429ac532c094cfb66bf6bf4d9f7.scope: Deactivated successfully.
Feb 01 09:45:32 np0005604215.localdomain sudo[290842]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:32 np0005604215.localdomain sudo[290915]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:32 np0005604215.localdomain sudo[290915]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:32 np0005604215.localdomain sudo[290915]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:32 np0005604215.localdomain sudo[290933]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:32 np0005604215.localdomain sudo[290933]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:33 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.2 (monmap changed)...
Feb 01 09:45:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:33 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 01 09:45:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 01 09:45:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:33 np0005604215.localdomain podman[290967]: 
Feb 01 09:45:33 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:33 np0005604215.localdomain podman[290967]: 2026-02-01 09:45:33.339243799 +0000 UTC m=+0.078492463 container create d69ec2dfcb2168f33878466928913d23eacb02a9e87f4b9a34ca7c807b6f4053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_taussig, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, vcs-type=git, version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:45:33 np0005604215.localdomain systemd[1]: Started libpod-conmon-d69ec2dfcb2168f33878466928913d23eacb02a9e87f4b9a34ca7c807b6f4053.scope.
Feb 01 09:45:33 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:45:33 np0005604215.localdomain podman[290967]: 2026-02-01 09:45:33.398436565 +0000 UTC m=+0.137685209 container init d69ec2dfcb2168f33878466928913d23eacb02a9e87f4b9a34ca7c807b6f4053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_taussig, release=1764794109, architecture=x86_64, io.buildah.version=1.41.4, vcs-type=git, com.redhat.component=rhceph-container, GIT_BRANCH=main, CEPH_POINT_RELEASE=, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:45:33 np0005604215.localdomain podman[290967]: 2026-02-01 09:45:33.305226412 +0000 UTC m=+0.044475066 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:45:33 np0005604215.localdomain podman[290967]: 2026-02-01 09:45:33.407493989 +0000 UTC m=+0.146742633 container start d69ec2dfcb2168f33878466928913d23eacb02a9e87f4b9a34ca7c807b6f4053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_taussig, io.openshift.tags=rhceph ceph, RELEASE=main, name=rhceph, io.openshift.expose-services=, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, ceph=True, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:45:33 np0005604215.localdomain podman[290967]: 2026-02-01 09:45:33.408355277 +0000 UTC m=+0.147603971 container attach d69ec2dfcb2168f33878466928913d23eacb02a9e87f4b9a34ca7c807b6f4053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_taussig, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.expose-services=, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, distribution-scope=public)
Feb 01 09:45:33 np0005604215.localdomain exciting_taussig[290982]: 167 167
Feb 01 09:45:33 np0005604215.localdomain systemd[1]: libpod-d69ec2dfcb2168f33878466928913d23eacb02a9e87f4b9a34ca7c807b6f4053.scope: Deactivated successfully.
Feb 01 09:45:33 np0005604215.localdomain podman[290967]: 2026-02-01 09:45:33.412982501 +0000 UTC m=+0.152231155 container died d69ec2dfcb2168f33878466928913d23eacb02a9e87f4b9a34ca7c807b6f4053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_taussig, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, ceph=True, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 01 09:45:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-1cd2cd6bf2747b23efca7474a35f9a1c7a2fcce6a5e2396504a7559da09269ca-merged.mount: Deactivated successfully.
Feb 01 09:45:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e5c542e879b2b5ddc744aeaaafaa0866b2d1dc1e2de2f1a91b048c438090c101-merged.mount: Deactivated successfully.
Feb 01 09:45:33 np0005604215.localdomain podman[290987]: 2026-02-01 09:45:33.513920377 +0000 UTC m=+0.091797330 container remove d69ec2dfcb2168f33878466928913d23eacb02a9e87f4b9a34ca7c807b6f4053 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=exciting_taussig, version=7, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, name=rhceph, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, architecture=x86_64, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, release=1764794109, com.redhat.component=rhceph-container)
Feb 01 09:45:33 np0005604215.localdomain systemd[1]: libpod-conmon-d69ec2dfcb2168f33878466928913d23eacb02a9e87f4b9a34ca7c807b6f4053.scope: Deactivated successfully.
Feb 01 09:45:33 np0005604215.localdomain sudo[290933]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:33 np0005604215.localdomain sudo[291009]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:33 np0005604215.localdomain sudo[291009]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:33 np0005604215.localdomain sudo[291009]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:33 np0005604215.localdomain sudo[291027]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:33 np0005604215.localdomain sudo[291027]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.5 (monmap changed)...
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:45:34 np0005604215.localdomain podman[291062]: 
Feb 01 09:45:34 np0005604215.localdomain podman[291062]: 2026-02-01 09:45:34.289393968 +0000 UTC m=+0.067968582 container create 00fcbaad54d882197224adadbb2c0185ae7671cb035c87672c991cc15b63dd89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_villani, io.openshift.tags=rhceph ceph, version=7, io.openshift.expose-services=, io.buildah.version=1.41.4, GIT_CLEAN=True, vcs-type=git, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, name=rhceph, ceph=True, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 09:45:34 np0005604215.localdomain systemd[1]: Started libpod-conmon-00fcbaad54d882197224adadbb2c0185ae7671cb035c87672c991cc15b63dd89.scope.
Feb 01 09:45:34 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:45:34 np0005604215.localdomain podman[291062]: 2026-02-01 09:45:34.351251278 +0000 UTC m=+0.129825882 container init 00fcbaad54d882197224adadbb2c0185ae7671cb035c87672c991cc15b63dd89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_villani, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, description=Red Hat Ceph Storage 7, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:45:34 np0005604215.localdomain frosty_villani[291077]: 167 167
Feb 01 09:45:34 np0005604215.localdomain podman[291062]: 2026-02-01 09:45:34.359052453 +0000 UTC m=+0.137627067 container start 00fcbaad54d882197224adadbb2c0185ae7671cb035c87672c991cc15b63dd89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_villani, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., io.buildah.version=1.41.4, distribution-scope=public, vcs-type=git, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, description=Red Hat Ceph Storage 7, release=1764794109, architecture=x86_64)
Feb 01 09:45:34 np0005604215.localdomain systemd[1]: libpod-00fcbaad54d882197224adadbb2c0185ae7671cb035c87672c991cc15b63dd89.scope: Deactivated successfully.
Feb 01 09:45:34 np0005604215.localdomain podman[291062]: 2026-02-01 09:45:34.359373903 +0000 UTC m=+0.137948567 container attach 00fcbaad54d882197224adadbb2c0185ae7671cb035c87672c991cc15b63dd89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_villani, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, distribution-scope=public, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_BRANCH=main, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, RELEASE=main, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7)
Feb 01 09:45:34 np0005604215.localdomain podman[291062]: 2026-02-01 09:45:34.362021566 +0000 UTC m=+0.140596230 container died 00fcbaad54d882197224adadbb2c0185ae7671cb035c87672c991cc15b63dd89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_villani, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, RELEASE=main, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, GIT_BRANCH=main, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:45:34 np0005604215.localdomain podman[291062]: 2026-02-01 09:45:34.267883704 +0000 UTC m=+0.046458308 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:45:34 np0005604215.localdomain podman[291082]: 2026-02-01 09:45:34.441203199 +0000 UTC m=+0.070576414 container remove 00fcbaad54d882197224adadbb2c0185ae7671cb035c87672c991cc15b63dd89 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_villani, GIT_CLEAN=True, io.openshift.expose-services=, release=1764794109, version=7, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, ceph=True, architecture=x86_64, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:45:34 np0005604215.localdomain systemd[1]: libpod-conmon-00fcbaad54d882197224adadbb2c0185ae7671cb035c87672c991cc15b63dd89.scope: Deactivated successfully.
Feb 01 09:45:34 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-415a315a48d84418b498e2aca14aae75294608a60201d927b2fd819706f75aa4-merged.mount: Deactivated successfully.
Feb 01 09:45:34 np0005604215.localdomain sudo[291027]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/108569471' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:45:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/108569471' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:45:34 np0005604215.localdomain sudo[291107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:34 np0005604215.localdomain sudo[291107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:34 np0005604215.localdomain sudo[291107]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:34 np0005604215.localdomain sudo[291125]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:34 np0005604215.localdomain sudo[291125]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:35 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/108569471' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:45:35 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/108569471' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:45:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:35 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 01 09:45:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:35 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 01 09:45:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:45:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:35 np0005604215.localdomain podman[291161]: 
Feb 01 09:45:35 np0005604215.localdomain podman[291161]: 2026-02-01 09:45:35.245215076 +0000 UTC m=+0.079433312 container create e3161dbb95da6ec54506b538985bfe8a3aae89ffba7f69a8320fc96fe7a64490 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_banzai, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, ceph=True, release=1764794109, distribution-scope=public, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, architecture=x86_64, GIT_BRANCH=main, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-type=git)
Feb 01 09:45:35 np0005604215.localdomain systemd[1]: Started libpod-conmon-e3161dbb95da6ec54506b538985bfe8a3aae89ffba7f69a8320fc96fe7a64490.scope.
Feb 01 09:45:35 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:45:35 np0005604215.localdomain podman[291161]: 2026-02-01 09:45:35.309871405 +0000 UTC m=+0.144089661 container init e3161dbb95da6ec54506b538985bfe8a3aae89ffba7f69a8320fc96fe7a64490 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_banzai, name=rhceph, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_CLEAN=True, version=7)
Feb 01 09:45:35 np0005604215.localdomain podman[291161]: 2026-02-01 09:45:35.214152652 +0000 UTC m=+0.048370878 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:45:35 np0005604215.localdomain podman[291161]: 2026-02-01 09:45:35.322526841 +0000 UTC m=+0.156745067 container start e3161dbb95da6ec54506b538985bfe8a3aae89ffba7f69a8320fc96fe7a64490 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_banzai, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, architecture=x86_64, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, CEPH_POINT_RELEASE=, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=)
Feb 01 09:45:35 np0005604215.localdomain podman[291161]: 2026-02-01 09:45:35.323028037 +0000 UTC m=+0.157246273 container attach e3161dbb95da6ec54506b538985bfe8a3aae89ffba7f69a8320fc96fe7a64490 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_banzai, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, ceph=True, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, architecture=x86_64, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:45:35 np0005604215.localdomain wizardly_banzai[291176]: 167 167
Feb 01 09:45:35 np0005604215.localdomain systemd[1]: libpod-e3161dbb95da6ec54506b538985bfe8a3aae89ffba7f69a8320fc96fe7a64490.scope: Deactivated successfully.
Feb 01 09:45:35 np0005604215.localdomain podman[291161]: 2026-02-01 09:45:35.325560646 +0000 UTC m=+0.159778892 container died e3161dbb95da6ec54506b538985bfe8a3aae89ffba7f69a8320fc96fe7a64490 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_banzai, version=7, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, architecture=x86_64, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, vcs-type=git, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:45:35 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:35 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:35 np0005604215.localdomain podman[291181]: 2026-02-01 09:45:35.424051075 +0000 UTC m=+0.084873433 container remove e3161dbb95da6ec54506b538985bfe8a3aae89ffba7f69a8320fc96fe7a64490 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=wizardly_banzai, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., name=rhceph, GIT_BRANCH=main, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, vcs-type=git, release=1764794109, RELEASE=main, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph)
Feb 01 09:45:35 np0005604215.localdomain systemd[1]: libpod-conmon-e3161dbb95da6ec54506b538985bfe8a3aae89ffba7f69a8320fc96fe7a64490.scope: Deactivated successfully.
Feb 01 09:45:35 np0005604215.localdomain systemd[1]: tmp-crun.Rx9K4l.mount: Deactivated successfully.
Feb 01 09:45:35 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4173a90f4465ff5bee1af091f92297faaeefa0af6ab9281e766acb34dad7fd86-merged.mount: Deactivated successfully.
Feb 01 09:45:35 np0005604215.localdomain sudo[291125]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:35 np0005604215.localdomain sudo[291198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:35 np0005604215.localdomain sudo[291198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:35 np0005604215.localdomain sudo[291198]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:35 np0005604215.localdomain sudo[291216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:35 np0005604215.localdomain sudo[291216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:45:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:45:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:36.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:36.102 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 01 09:45:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:36 np0005604215.localdomain ceph-mon[278949]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:36 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)...
Feb 01 09:45:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:45:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:45:36 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain
Feb 01 09:45:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:36 np0005604215.localdomain podman[291248]: 2026-02-01 09:45:36.125361052 +0000 UTC m=+0.079894788 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:45:36 np0005604215.localdomain podman[291248]: 2026-02-01 09:45:36.139487845 +0000 UTC m=+0.094021581 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, managed_by=edpm_ansible, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:45:36 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:45:36 np0005604215.localdomain podman[291249]: 2026-02-01 09:45:36.191424733 +0000 UTC m=+0.145646229 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 01 09:45:36 np0005604215.localdomain podman[291249]: 2026-02-01 09:45:36.199683082 +0000 UTC m=+0.153904628 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20260127, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 01 09:45:36 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:45:36 np0005604215.localdomain podman[291262]: 
Feb 01 09:45:36 np0005604215.localdomain podman[291262]: 2026-02-01 09:45:36.221247229 +0000 UTC m=+0.147167557 container create b322f0cf8fae886545ef230189cf10c954be561f331441de40d3dc40748c1fd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_faraday, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, architecture=x86_64, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, io.openshift.expose-services=, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., distribution-scope=public, description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 09:45:36 np0005604215.localdomain podman[291262]: 2026-02-01 09:45:36.136922964 +0000 UTC m=+0.062843372 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:45:36 np0005604215.localdomain systemd[1]: Started libpod-conmon-b322f0cf8fae886545ef230189cf10c954be561f331441de40d3dc40748c1fd4.scope.
Feb 01 09:45:36 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:45:36 np0005604215.localdomain podman[291262]: 2026-02-01 09:45:36.284243704 +0000 UTC m=+0.210164032 container init b322f0cf8fae886545ef230189cf10c954be561f331441de40d3dc40748c1fd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_faraday, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, release=1764794109, name=rhceph, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, ceph=True, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, version=7, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:45:36 np0005604215.localdomain podman[291262]: 2026-02-01 09:45:36.295109165 +0000 UTC m=+0.221029513 container start b322f0cf8fae886545ef230189cf10c954be561f331441de40d3dc40748c1fd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_faraday, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, io.buildah.version=1.41.4, name=rhceph, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, vendor=Red Hat, Inc.)
Feb 01 09:45:36 np0005604215.localdomain podman[291262]: 2026-02-01 09:45:36.295435185 +0000 UTC m=+0.221355533 container attach b322f0cf8fae886545ef230189cf10c954be561f331441de40d3dc40748c1fd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_faraday, io.openshift.expose-services=, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_BRANCH=main, ceph=True, architecture=x86_64, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, release=1764794109, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z)
Feb 01 09:45:36 np0005604215.localdomain brave_faraday[291305]: 167 167
Feb 01 09:45:36 np0005604215.localdomain systemd[1]: libpod-b322f0cf8fae886545ef230189cf10c954be561f331441de40d3dc40748c1fd4.scope: Deactivated successfully.
Feb 01 09:45:36 np0005604215.localdomain podman[291262]: 2026-02-01 09:45:36.298558563 +0000 UTC m=+0.224478911 container died b322f0cf8fae886545ef230189cf10c954be561f331441de40d3dc40748c1fd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_faraday, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, release=1764794109, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, distribution-scope=public)
Feb 01 09:45:36 np0005604215.localdomain podman[291310]: 2026-02-01 09:45:36.389912878 +0000 UTC m=+0.078761111 container remove b322f0cf8fae886545ef230189cf10c954be561f331441de40d3dc40748c1fd4 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_faraday, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, CEPH_POINT_RELEASE=, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., ceph=True, version=7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main)
Feb 01 09:45:36 np0005604215.localdomain systemd[1]: libpod-conmon-b322f0cf8fae886545ef230189cf10c954be561f331441de40d3dc40748c1fd4.scope: Deactivated successfully.
Feb 01 09:45:36 np0005604215.localdomain sudo[291216]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:36 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f49ffeb647d38caaddcb8c037d8a58e108e178889d5964c3091d6bc5045890cf-merged.mount: Deactivated successfully.
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.270758) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137270799, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 1377, "num_deletes": 250, "total_data_size": 3800561, "memory_usage": 3902408, "flush_reason": "Manual Compaction"}
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137281919, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 2227760, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18375, "largest_seqno": 19747, "table_properties": {"data_size": 2221662, "index_size": 3179, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 17003, "raw_average_key_size": 23, "raw_value_size": 2208016, "raw_average_value_size": 2987, "num_data_blocks": 137, "num_entries": 739, "num_filter_entries": 739, "num_deletions": 250, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939109, "oldest_key_time": 1769939109, "file_creation_time": 1769939137, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 11203 microseconds, and 5961 cpu microseconds.
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.281965) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 2227760 bytes OK
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.281987) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.285768) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.285789) EVENT_LOG_v1 {"time_micros": 1769939137285784, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.285811) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 3793432, prev total WAL file size 3793756, number of live WAL files 2.
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.286833) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033353039' seq:72057594037927935, type:22 .. '6D6772737461740033373630' seq:0, type:0; will stop at (end)
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(2175KB)], [24(18MB)]
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137286905, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 21622282, "oldest_snapshot_seqno": -1}
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 10727 keys, 19443280 bytes, temperature: kUnknown
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137367725, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 19443280, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19382232, "index_size": 33107, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 26885, "raw_key_size": 286365, "raw_average_key_size": 26, "raw_value_size": 19199269, "raw_average_value_size": 1789, "num_data_blocks": 1265, "num_entries": 10727, "num_filter_entries": 10727, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938864, "oldest_key_time": 0, "file_creation_time": 1769939137, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.368087) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 19443280 bytes
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.369896) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 267.2 rd, 240.3 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.1, 18.5 +0.0 blob) out(18.5 +0.0 blob), read-write-amplify(18.4) write-amplify(8.7) OK, records in: 11256, records dropped: 529 output_compression: NoCompression
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.369925) EVENT_LOG_v1 {"time_micros": 1769939137369911, "job": 12, "event": "compaction_finished", "compaction_time_micros": 80922, "compaction_time_cpu_micros": 39714, "output_level": 6, "num_output_files": 1, "total_output_size": 19443280, "num_input_records": 11256, "num_output_records": 10727, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137370453, "job": 12, "event": "table_file_deletion", "file_number": 26}
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939137373223, "job": 12, "event": "table_file_deletion", "file_number": 24}
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.286659) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.373270) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.373278) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.373281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.373289) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:37 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:45:37.373293) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:45:38 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:38 np0005604215.localdomain ceph-mon[278949]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:38 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:38 np0005604215.localdomain sudo[291326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:45:38 np0005604215.localdomain sudo[291326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:38 np0005604215.localdomain sudo[291326]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:39 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:45:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:45:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:45:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:39 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:39 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:40.115 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:40.116 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:45:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:40.116 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:45:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:40.161 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:45:40 np0005604215.localdomain ceph-mon[278949]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:41.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:45:41.763 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:45:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:45:41.763 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:45:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:45:41.763 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:45:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:42.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:42.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:42.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:42 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "status", "format": "json"} v 0)
Feb 01 09:45:42 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.200:0/3879925325' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 01 09:45:42 np0005604215.localdomain ceph-mon[278949]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:42 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:42 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/3879925325' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 01 09:45:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:43.108 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:43.109 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:43.109 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:45:43 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e86 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.120 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.121 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.121 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.121 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.122 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3970058675' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.586 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='client.34462 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: Reconfig service osd.default_drive_group
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/3638642379' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:44 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/3970058675' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.826 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.829 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12346MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.829 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.830 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.957 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:45:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:44.958 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.057 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.130 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.132 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.152 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.189 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AVX2,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AMD_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.217 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1640226999' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.637 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.420s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/915385531' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:45:45 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/1640226999' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.645 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.676 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.679 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.679 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.850s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.680 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.681 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 01 09:45:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:45.694 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 01 09:45:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:45:45 np0005604215.localdomain sudo[291388]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:45:45 np0005604215.localdomain sudo[291388]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:45 np0005604215.localdomain sudo[291388]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:45 np0005604215.localdomain systemd[1]: tmp-crun.UPZlcn.mount: Deactivated successfully.
Feb 01 09:45:45 np0005604215.localdomain podman[291405]: 2026-02-01 09:45:45.916103393 +0000 UTC m=+0.122857928 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:45:45 np0005604215.localdomain podman[291405]: 2026-02-01 09:45:45.951919773 +0000 UTC m=+0.158674298 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:45:45 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:45:46 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:45:46 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/3274300624' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:45:46 np0005604215.localdomain ceph-mon[278949]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:45:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:46 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:45:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:46 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/3274300624' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:45:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:46.690 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:46.713 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:45:46.714 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1618993870' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:47 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/1618993870' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e87 e87: 6 total, 6 up, 6 in
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr handle_mgr_map Activating!
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr handle_mgr_map I am now activating
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604211"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604212"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).mds e16 all = 0
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).mds e16 all = 0
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).mds e16 all = 0
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604211.cuflqz", "id": "np0005604211.cuflqz"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604211.cuflqz", "id": "np0005604211.cuflqz"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mds metadata"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).mds e16 all = 1
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mon metadata"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain sshd[289666]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:45:48 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #43. Immutable memtables: 0.
Feb 01 09:45:48 np0005604215.localdomain systemd[1]: session-68.scope: Deactivated successfully.
Feb 01 09:45:48 np0005604215.localdomain systemd[1]: session-68.scope: Consumed 10.141s CPU time.
Feb 01 09:45:48 np0005604215.localdomain systemd-logind[761]: Session 68 logged out. Waiting for processes to exit.
Feb 01 09:45:48 np0005604215.localdomain systemd-logind[761]: Removed session 68.
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: balancer
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Starting
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:45:48
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: cephadm
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: crash
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: devicehealth
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [devicehealth INFO root] Starting
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: iostat
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: nfs
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: orchestrator
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: pg_autoscaler
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: progress
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Loading...
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7fce92315400>, <progress.module.GhostEvent object at 0x7fce92315430>, <progress.module.GhostEvent object at 0x7fce92315460>, <progress.module.GhostEvent object at 0x7fce92315490>, <progress.module.GhostEvent object at 0x7fce923154c0>, <progress.module.GhostEvent object at 0x7fce923154f0>, <progress.module.GhostEvent object at 0x7fce92315520>, <progress.module.GhostEvent object at 0x7fce92315550>, <progress.module.GhostEvent object at 0x7fce92315580>, <progress.module.GhostEvent object at 0x7fce923155b0>, <progress.module.GhostEvent object at 0x7fce923155e0>, <progress.module.GhostEvent object at 0x7fce92315610>, <progress.module.GhostEvent object at 0x7fce92315640>, <progress.module.GhostEvent object at 0x7fce92315670>, <progress.module.GhostEvent object at 0x7fce923156a0>, <progress.module.GhostEvent object at 0x7fce923156d0>, <progress.module.GhostEvent object at 0x7fce92315700>, <progress.module.GhostEvent object at 0x7fce92315730>, <progress.module.GhostEvent object at 0x7fce92315760>, <progress.module.GhostEvent object at 0x7fce92315790>, <progress.module.GhostEvent object at 0x7fce923157c0>, <progress.module.GhostEvent object at 0x7fce923157f0>, <progress.module.GhostEvent object at 0x7fce92315820>, <progress.module.GhostEvent object at 0x7fce92315850>, <progress.module.GhostEvent object at 0x7fce92315880>, <progress.module.GhostEvent object at 0x7fce923158b0>, <progress.module.GhostEvent object at 0x7fce923158e0>, <progress.module.GhostEvent object at 0x7fce92315910>, <progress.module.GhostEvent object at 0x7fce92315940>, <progress.module.GhostEvent object at 0x7fce92315970>, <progress.module.GhostEvent object at 0x7fce923159a0>, <progress.module.GhostEvent object at 0x7fce923159d0>, <progress.module.GhostEvent object at 0x7fce92315a00>, <progress.module.GhostEvent object at 0x7fce92315a30>, <progress.module.GhostEvent object at 
0x7fce92315a60>, <progress.module.GhostEvent object at 0x7fce92315a90>, <progress.module.GhostEvent object at 0x7fce92315ac0>, <progress.module.GhostEvent object at 0x7fce92315af0>, <progress.module.GhostEvent object at 0x7fce92315b20>, <progress.module.GhostEvent object at 0x7fce92315b50>, <progress.module.GhostEvent object at 0x7fce92315b80>, <progress.module.GhostEvent object at 0x7fce92315bb0>, <progress.module.GhostEvent object at 0x7fce92315be0>, <progress.module.GhostEvent object at 0x7fce92315c10>, <progress.module.GhostEvent object at 0x7fce92315c40>, <progress.module.GhostEvent object at 0x7fce92315c70>, <progress.module.GhostEvent object at 0x7fce92315ca0>, <progress.module.GhostEvent object at 0x7fce92315cd0>, <progress.module.GhostEvent object at 0x7fce92315d00>, <progress.module.GhostEvent object at 0x7fce92315d30>] historic events
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Loaded OSDMap, ready.
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] recovery thread starting
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] starting setup
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: rbd_support
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: restful
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: status
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [restful INFO root] server_addr: :: server_port: 8003
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [restful WARNING root] server not running: no certificate configured
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: telemetry
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] PerfHandler: starting
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_task_task: vms, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: volumes
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_task_task: volumes, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_task_task: images, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:45:48.602+0000 7fce781f3640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:45:48.602+0000 7fce781f3640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:45:48.602+0000 7fce781f3640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:45:48.602+0000 7fce781f3640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:45:48.602+0000 7fce781f3640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_task_task: backups, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:45:48.606+0000 7fce7b1f9640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:45:48.606+0000 7fce7b1f9640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:45:48.606+0000 7fce7b1f9640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:45:48.606+0000 7fce7b1f9640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:45:48.606+0000 7fce7b1f9640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TaskHandler: starting
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} v 0)
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Feb 01 09:45:48 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] setup complete
Feb 01 09:45:48 np0005604215.localdomain sshd[291564]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 566 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' 
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26657 172.18.0.106:0/947533198' entity='mgr.np0005604212.oynhpm' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/3579560302' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: Activating manager daemon np0005604215.uhhqtv
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: osdmap e87: 6 total, 6 up, 6 in
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/3579560302' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: mgrmap e36: np0005604215.uhhqtv(active, starting, since 0.0432062s), standbys: np0005604211.cuflqz, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604211.cuflqz", "id": "np0005604211.cuflqz"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: Manager daemon np0005604215.uhhqtv is now available
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} : dispatch
Feb 01 09:45:48 np0005604215.localdomain sshd[291564]: Accepted publickey for ceph-admin from 192.168.122.108 port 52386 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 09:45:48 np0005604215.localdomain systemd-logind[761]: New session 69 of user ceph-admin.
Feb 01 09:45:48 np0005604215.localdomain systemd[1]: Started Session 69 of User ceph-admin.
Feb 01 09:45:48 np0005604215.localdomain sshd[291564]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 09:45:49 np0005604215.localdomain sudo[291568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:49 np0005604215.localdomain sudo[291568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:49 np0005604215.localdomain sudo[291568]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:49 np0005604215.localdomain sudo[291586]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:45:49 np0005604215.localdomain sudo[291586]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:45:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:49 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:45:49] ENGINE Bus STARTING
Feb 01 09:45:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:45:49] ENGINE Bus STARTING
Feb 01 09:45:49 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:45:49] ENGINE Serving on https://172.18.0.108:7150
Feb 01 09:45:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:45:49] ENGINE Serving on https://172.18.0.108:7150
Feb 01 09:45:49 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:45:49] ENGINE Client ('172.18.0.108', 56488) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:45:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:45:49] ENGINE Client ('172.18.0.108', 56488) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:45:49 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:45:49] ENGINE Serving on http://172.18.0.108:8765
Feb 01 09:45:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:45:49] ENGINE Serving on http://172.18.0.108:8765
Feb 01 09:45:49 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:45:49] ENGINE Bus STARTED
Feb 01 09:45:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:45:49] ENGINE Bus STARTED
Feb 01 09:45:49 np0005604215.localdomain podman[291701]: 2026-02-01 09:45:49.914798465 +0000 UTC m=+0.101345504 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, name=rhceph, io.openshift.expose-services=, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, version=7, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7)
Feb 01 09:45:50 np0005604215.localdomain podman[291701]: 2026-02-01 09:45:50.023827588 +0000 UTC m=+0.210374627 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, name=rhceph, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, RELEASE=main, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, io.buildah.version=1.41.4, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., architecture=x86_64, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:45:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain.devices.0}] v 0)
Feb 01 09:45:50 np0005604215.localdomain ceph-mon[278949]: mgrmap e37: np0005604215.uhhqtv(active, since 1.10815s), standbys: np0005604211.cuflqz, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:45:50 np0005604215.localdomain ceph-mon[278949]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain}] v 0)
Feb 01 09:45:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:45:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:45:50 np0005604215.localdomain ceph-mgr[278126]: [devicehealth INFO root] Check health
Feb 01 09:45:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:45:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:45:50 np0005604215.localdomain sudo[291586]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:45:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:45:50 np0005604215.localdomain sudo[291832]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:50 np0005604215.localdomain sudo[291832]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:50 np0005604215.localdomain sudo[291832]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:50 np0005604215.localdomain sudo[291850]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:45:50 np0005604215.localdomain sudo[291850]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:51 np0005604215.localdomain sudo[291850]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:45:49] ENGINE Bus STARTING
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:45:49] ENGINE Serving on https://172.18.0.108:7150
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:45:49] ENGINE Client ('172.18.0.108', 56488) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:45:49] ENGINE Serving on http://172.18.0.108:8765
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:45:49] ENGINE Bus STARTED
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: Cluster is now healthy
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: mgrmap e38: np0005604215.uhhqtv(active, since 2s), standbys: np0005604211.cuflqz, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11  adding peer [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] to list of hints
Feb 01 09:45:51 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:45:51 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:51 np0005604215.localdomain ceph-mgr[278126]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory
Feb 01 09:45:51 np0005604215.localdomain sudo[291899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:45:51 np0005604215.localdomain sudo[291899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:45:51 np0005604215.localdomain sudo[291899]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:51 np0005604215.localdomain sudo[291918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:45:51 np0005604215.localdomain sudo[291918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:51 np0005604215.localdomain podman[291917]: 2026-02-01 09:45:51.81299154 +0000 UTC m=+0.086374475 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:45:51 np0005604215.localdomain podman[291917]: 2026-02-01 09:45:51.829588029 +0000 UTC m=+0.102970944 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:45:51 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO root] Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:45:52 np0005604215.localdomain sudo[291918]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain.devices.0}] v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain}] v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO root] Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO root] Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain sudo[291978]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:45:52 np0005604215.localdomain sudo[291978]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:52 np0005604215.localdomain sudo[291978]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:52 np0005604215.localdomain sudo[291996]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:45:52 np0005604215.localdomain sudo[291996]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:52 np0005604215.localdomain sudo[291996]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:52 np0005604215.localdomain sudo[292014]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:45:52 np0005604215.localdomain sudo[292014]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:52 np0005604215.localdomain sudo[292014]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:52 np0005604215.localdomain sudo[292032]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:52 np0005604215.localdomain sudo[292032]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:52 np0005604215.localdomain sudo[292032]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e11 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: mgr finish mon failed to return metadata for mon.np0005604213: (2) No such file or directory
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain sudo[292050]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:45:52 np0005604215.localdomain sudo[292050]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:52 np0005604215.localdomain sudo[292050]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain sudo[292084]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:45:52 np0005604215.localdomain sudo[292084]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:52 np0005604215.localdomain sudo[292084]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:52 np0005604215.localdomain sudo[292102]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:45:52 np0005604215.localdomain sudo[292102]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:52 np0005604215.localdomain sudo[292102]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:52 np0005604215.localdomain sudo[292120]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain sudo[292120]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:52 np0005604215.localdomain sudo[292120]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain sudo[292138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:45:52 np0005604215.localdomain sudo[292138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:52 np0005604215.localdomain sudo[292138]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:52 np0005604215.localdomain sudo[292156]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:45:52 np0005604215.localdomain sudo[292156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:52 np0005604215.localdomain sudo[292156]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(probing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604211"} v 0)
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:45:52 np0005604215.localdomain ceph-mon[278949]: paxos.1).electionLogic(52) init, last seen epoch 52
Feb 01 09:45:53 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:45:53 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:45:53 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604212"} v 0)
Feb 01 09:45:53 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:45:53 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:45:53 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:53 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0)
Feb 01 09:45:53 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: mgr finish mon failed to return metadata for mon.np0005604213: (22) Invalid argument
Feb 01 09:45:53 np0005604215.localdomain sudo[292174]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:45:53 np0005604215.localdomain sudo[292174]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292174]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:53 np0005604215.localdomain sudo[292192]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:53 np0005604215.localdomain sudo[292192]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292192]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:53 np0005604215.localdomain sudo[292210]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:45:53 np0005604215.localdomain sudo[292210]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292210]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:53 np0005604215.localdomain sudo[292244]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:45:53 np0005604215.localdomain sudo[292244]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292244]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:53 np0005604215.localdomain sudo[292262]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:45:53 np0005604215.localdomain sudo[292262]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292262]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:53 np0005604215.localdomain sudo[292280]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:53 np0005604215.localdomain sudo[292280]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292280]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mgr.np0005604212.oynhpm 172.18.0.106:0/3809435654; not ready for session (expect reconnect)
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 01 09:45:53 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:45:53 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: mgr finish mon failed to return metadata for mon.np0005604213: (22) Invalid argument
Feb 01 09:45:53 np0005604215.localdomain sudo[292298]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:45:53 np0005604215.localdomain sudo[292298]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292298]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:53 np0005604215.localdomain sudo[292316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:45:53 np0005604215.localdomain sudo[292316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292316]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:53 np0005604215.localdomain sudo[292334]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:45:53 np0005604215.localdomain sudo[292334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292334]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:53 np0005604215.localdomain sudo[292352]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:53 np0005604215.localdomain sudo[292352]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292352]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:53 np0005604215.localdomain sudo[292370]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:45:53 np0005604215.localdomain sudo[292370]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292370]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:53 np0005604215.localdomain sudo[292404]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:45:53 np0005604215.localdomain sudo[292404]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:53 np0005604215.localdomain sudo[292404]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:54 np0005604215.localdomain sudo[292422]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:45:54 np0005604215.localdomain sudo[292422]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:54 np0005604215.localdomain sudo[292422]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:54 np0005604215.localdomain sudo[292440]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:54 np0005604215.localdomain sudo[292440]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:54 np0005604215.localdomain sudo[292440]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:54 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:54 np0005604215.localdomain sudo[292458]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:45:54 np0005604215.localdomain sudo[292458]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:54 np0005604215.localdomain sudo[292458]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:54 np0005604215.localdomain sudo[292476]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:45:54 np0005604215.localdomain sudo[292476]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:54 np0005604215.localdomain sudo[292476]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:54 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:45:54 np0005604215.localdomain sudo[292494]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:45:54 np0005604215.localdomain sudo[292494]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:54 np0005604215.localdomain sudo[292494]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:54 np0005604215.localdomain sudo[292512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:54 np0005604215.localdomain sudo[292512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:54 np0005604215.localdomain sudo[292512]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:54 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:54 np0005604215.localdomain sudo[292530]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:45:54 np0005604215.localdomain sudo[292530]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:54 np0005604215.localdomain sudo[292530]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:54 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mgr.np0005604212.oynhpm 172.18.0.106:0/3809435654; not ready for session (expect reconnect)
Feb 01 09:45:54 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 01 09:45:54 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:45:54 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:54 np0005604215.localdomain ceph-mgr[278126]: mgr finish mon failed to return metadata for mon.np0005604213: (22) Invalid argument
Feb 01 09:45:54 np0005604215.localdomain sudo[292564]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:45:54 np0005604215.localdomain sudo[292564]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:54 np0005604215.localdomain sudo[292564]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:54 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:54 np0005604215.localdomain sudo[292582]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:45:54 np0005604215.localdomain sudo[292582]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:54 np0005604215.localdomain sudo[292582]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:54 np0005604215.localdomain sudo[292600]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:54 np0005604215.localdomain sudo[292600]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:54 np0005604215.localdomain sudo[292600]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:54 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:45:54 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain.devices.0}] v 0)
Feb 01 09:45:55 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:45:55 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mgr.np0005604212.oynhpm 172.18.0.106:0/3809435654; not ready for session (expect reconnect)
Feb 01 09:45:55 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 01 09:45:55 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:45:55 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:55 np0005604215.localdomain ceph-mgr[278126]: mgr finish mon failed to return metadata for mon.np0005604213: (22) Invalid argument
Feb 01 09:45:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 01 09:45:56 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mgr.np0005604212.oynhpm 172.18.0.106:0/3809435654; not ready for session (expect reconnect)
Feb 01 09:45:56 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 01 09:45:56 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:45:56 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:56 np0005604215.localdomain ceph-mgr[278126]: mgr finish mon failed to return metadata for mon.np0005604213: (22) Invalid argument
Feb 01 09:45:57 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mgr.np0005604212.oynhpm 172.18.0.106:0/3809435654; not ready for session (expect reconnect)
Feb 01 09:45:57 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 01 09:45:57 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:45:57 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:57 np0005604215.localdomain ceph-mgr[278126]: mgr finish mon failed to return metadata for mon.np0005604213: (22) Invalid argument
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain}] v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604212 calling monitor election
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 0 B/s wr, 16 op/s
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 is new leader, mons np0005604211,np0005604215,np0005604212 in quorum (ranks 0,1,2)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: monmap epoch 12
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:45:52.979956+0000
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: 3: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: osdmap e87: 6 total, 6 up, 6 in
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mgrmap e39: np0005604215.uhhqtv(active, since 9s), standbys: np0005604211.cuflqz, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1/4 mons down, quorum np0005604211,np0005604215,np0005604212 (MON_DOWN)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Standby manager daemon np0005604212.oynhpm started
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: Health detail: HEALTH_WARN 1/4 mons down, quorum np0005604211,np0005604215,np0005604212
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: [WRN] MON_DOWN: 1/4 mons down, quorum np0005604211,np0005604215,np0005604212
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]:     mon.np0005604213 (rank 3) addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] is down (out of quorum)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 0 B/s wr, 13 op/s
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 8b45aa57-831a-4f08-88ca-febe5b017770 (Updating node-proxy deployment (+4 -> 4))
Feb 01 09:45:58 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 8b45aa57-831a-4f08-88ca-febe5b017770 (Updating node-proxy deployment (+4 -> 4))
Feb 01 09:45:58 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 8b45aa57-831a-4f08-88ca-febe5b017770 (Updating node-proxy deployment (+4 -> 4)) in 0 seconds
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: paxos.1).electionLogic(54) init, last seen epoch 54
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(electing) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:45:58 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:45:58 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mgr[278126]: mgr finish mon failed to return metadata for mon.np0005604213: (22) Invalid argument
Feb 01 09:45:58 np0005604215.localdomain sudo[292618]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:45:58 np0005604215.localdomain sudo[292618]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:45:58 np0005604215.localdomain sudo[292618]: pam_unix(sudo:session): session closed for user root
Feb 01 09:45:58 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 01 09:45:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:45:58 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:58 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 01 09:45:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604213 calling monitor election
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 0 B/s wr, 13 op/s
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604213 calling monitor election
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 calling monitor election
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604211 is new leader, mons np0005604211,np0005604215,np0005604212,np0005604213 in quorum (ranks 0,1,2,3)
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: monmap epoch 12
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:45:52.979956+0000
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604211
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: 3: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: osdmap e87: 6 total, 6 up, 6 in
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mgrmap e40: np0005604215.uhhqtv(active, since 10s), standbys: np0005604211.cuflqz, np0005604209.isqrps, np0005604213.caiaeh, np0005604212.oynhpm
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: Health check cleared: MON_DOWN (was: 1/4 mons down, quorum np0005604211,np0005604215,np0005604212)
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: Cluster is now healthy
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: overall HEALTH_OK
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain.devices.0}] v 0)
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain}] v 0)
Feb 01 09:45:59 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mon.np0005604213 172.18.0.107:0/1601224003; not ready for session (expect reconnect)
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:45:59 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005604211 (monmap changed)...
Feb 01 09:45:59 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005604211 (monmap changed)...
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:45:59 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:45:59 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 01 09:45:59 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 01 09:46:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.34525 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:46:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:46:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:46:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:46:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:46:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:46:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17793 "" "Go-http-client/1.1"
Feb 01 09:46:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain.devices.0}] v 0)
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain}] v 0)
Feb 01 09:46:00 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005604212 (monmap changed)...
Feb 01 09:46:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005604212 (monmap changed)...
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:00 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:00 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 01 09:46:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 01 09:46:00 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_report got status from non-daemon mon.np0005604213
Feb 01 09:46:00 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:00.549+0000 7fcea83d3640 -1 mgr.server handle_report got status from non-daemon mon.np0005604213
Feb 01 09:46:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:46:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:46:00 np0005604215.localdomain podman[292636]: 2026-02-01 09:46:00.863060971 +0000 UTC m=+0.076117593 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:46:00 np0005604215.localdomain podman[292636]: 2026-02-01 09:46:00.927644933 +0000 UTC m=+0.140701495 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:46:00 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:46:00 np0005604215.localdomain podman[292637]: 2026-02-01 09:46:00.929405239 +0000 UTC m=+0.136202996 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:46:01 np0005604215.localdomain podman[292637]: 2026-02-01 09:46:01.009494155 +0000 UTC m=+0.216291942 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:46:01 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:01 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring osd.1 (monmap changed)...
Feb 01 09:46:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring osd.1 (monmap changed)...
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "osd.1"} v 0)
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:01 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:46:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: from='client.34525 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:46:01 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:46:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:46:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:46:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:46:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:46:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:46:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:46:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.34533 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:46:02 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO root] Saving service mon spec with placement label:mon
Feb 01 09:46:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Saving service mon spec with placement label:mon
Feb 01 09:46:02 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 01 09:46:02 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:02 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.1 (monmap changed)...
Feb 01 09:46:02 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:46:02 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:02 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring osd.4 (monmap changed)...
Feb 01 09:46:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring osd.4 (monmap changed)...
Feb 01 09:46:02 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "osd.4"} v 0)
Feb 01 09:46:02 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:46:02 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:02 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:02 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:46:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: from='client.34533 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: Saving service mon spec with placement label:mon
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.4 (monmap changed)...
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:03 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 01 09:46:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:03 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:03 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 01 09:46:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 01 09:46:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.54103 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604213", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:46:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:04 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 01 09:46:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:46:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='client.54103 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604213", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:04 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:46:05 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:05 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:05 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005604212 (monmap changed)...
Feb 01 09:46:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005604212 (monmap changed)...
Feb 01 09:46:05 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 01 09:46:05 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:46:05 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 01 09:46:05 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:46:05 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:05 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:05 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain
Feb 01 09:46:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:06 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring crash.np0005604213 (monmap changed)...
Feb 01 09:46:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring crash.np0005604213 (monmap changed)...
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:06 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 01 09:46:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 01 09:46:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604212 (monmap changed)...
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/1728698498' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:06 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:46:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:46:06 np0005604215.localdomain podman[292683]: 2026-02-01 09:46:06.826587048 +0000 UTC m=+0.083092163 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, release=1769056855, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, version=9.7, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal)
Feb 01 09:46:06 np0005604215.localdomain podman[292683]: 2026-02-01 09:46:06.84168197 +0000 UTC m=+0.098187085 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, com.redhat.component=ubi9-minimal-container, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7)
Feb 01 09:46:06 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:46:06 np0005604215.localdomain podman[292684]: 2026-02-01 09:46:06.929780718 +0000 UTC m=+0.183174356 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:46:06 np0005604215.localdomain podman[292684]: 2026-02-01 09:46:06.935053893 +0000 UTC m=+0.188447551 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Feb 01 09:46:06 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:07 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring osd.0 (monmap changed)...
Feb 01 09:46:07 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring osd.0 (monmap changed)...
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "osd.0"} v 0)
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:07 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:46:07 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 0 B/s wr, 9 op/s
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 01 09:46:07 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:08 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring osd.3 (monmap changed)...
Feb 01 09:46:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring osd.3 (monmap changed)...
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "osd.3"} v 0)
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:08 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:46:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:46:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.0 (monmap changed)...
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 01 09:46:08 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e87 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:09 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 01 09:46:09 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:09 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 01 09:46:09 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.3 (monmap changed)...
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:09 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:10 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 01 09:46:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:10 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:46:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:46:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/2918964831' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json"} : dispatch
Feb 01 09:46:10 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:11 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring mon.np0005604213 (monmap changed)...
Feb 01 09:46:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring mon.np0005604213 (monmap changed)...
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "auth get", "entity": "mon."} v 0)
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config get", "who": "mon", "key": "public_network"} v 0)
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e12 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:11 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 01 09:46:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: from='mgr.26672 172.18.0.108:0/2709119860' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:11 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e88 e88: 6 total, 6 up, 6 in
Feb 01 09:46:12 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:12.014+0000 7fcf04926640 -1 mgr handle_mgr_map I was active but no longer am
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr handle_mgr_map I was active but no longer am
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  e: '/usr/bin/ceph-mgr'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  0: '/usr/bin/ceph-mgr'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  1: '-n'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  2: 'mgr.np0005604215.uhhqtv'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  3: '-f'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  4: '--setuser'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  5: 'ceph'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  6: '--setgroup'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  7: 'ceph'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  8: '--default-log-to-file=false'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  9: '--default-log-to-journald=true'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  10: '--default-log-to-stderr=false'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn respawning with exe /usr/bin/ceph-mgr
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr respawn  exe_path /proc/self/exe
Feb 01 09:46:12 np0005604215.localdomain sshd[291564]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:46:12 np0005604215.localdomain systemd[1]: session-69.scope: Deactivated successfully.
Feb 01 09:46:12 np0005604215.localdomain systemd[1]: session-69.scope: Consumed 5.763s CPU time.
Feb 01 09:46:12 np0005604215.localdomain systemd-logind[761]: Session 69 logged out. Waiting for processes to exit.
Feb 01 09:46:12 np0005604215.localdomain systemd-logind[761]: Removed session 69.
Feb 01 09:46:12 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: ignoring --setuser ceph since I am not root
Feb 01 09:46:12 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: ignoring --setgroup ceph since I am not root
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mgr, pid 2
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: pidfile_write: ignore empty --pid-file
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'alerts'
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'balancer'
Feb 01 09:46:12 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:12.205+0000 7f947e9a3140 -1 mgr[py] Module alerts has missing NOTIFY_TYPES member
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'cephadm'
Feb 01 09:46:12 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:12.271+0000 7f947e9a3140 -1 mgr[py] Module balancer has missing NOTIFY_TYPES member
Feb 01 09:46:12 np0005604215.localdomain sshd[292745]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604213 (monmap changed)...
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.200:0/1843935985' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: Activating manager daemon np0005604211.cuflqz
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: osdmap e88: 6 total, 6 up, 6 in
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: mgrmap e41: np0005604211.cuflqz(active, starting, since 0.0491842s), standbys: np0005604209.isqrps, np0005604213.caiaeh, np0005604212.oynhpm
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604211"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr metadata", "who": "np0005604211.cuflqz", "id": "np0005604211.cuflqz"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: Manager daemon np0005604211.cuflqz is now available
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/mirror_snapshot_schedule"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604211.cuflqz/trash_purge_schedule"} : dispatch
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.657685) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939172657727, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 1766, "num_deletes": 253, "total_data_size": 7165358, "memory_usage": 7353568, "flush_reason": "Manual Compaction"}
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939172683856, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 4271920, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19752, "largest_seqno": 21513, "table_properties": {"data_size": 4264410, "index_size": 4207, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2245, "raw_key_size": 19590, "raw_average_key_size": 21, "raw_value_size": 4247752, "raw_average_value_size": 4740, "num_data_blocks": 177, "num_entries": 896, "num_filter_entries": 896, "num_deletions": 252, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939137, "oldest_key_time": 1769939137, "file_creation_time": 1769939172, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 26298 microseconds, and 9333 cpu microseconds.
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.683979) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 4271920 bytes OK
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.684033) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.686059) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.686085) EVENT_LOG_v1 {"time_micros": 1769939172686078, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.686109) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 7156247, prev total WAL file size 7156247, number of live WAL files 2.
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.688369) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031323630' seq:72057594037927935, type:22 .. '6B760031353132' seq:0, type:0; will stop at (end)
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(4171KB)], [27(18MB)]
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939172688426, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 23715200, "oldest_snapshot_seqno": -1}
Feb 01 09:46:12 np0005604215.localdomain sshd[292745]: Accepted publickey for ceph-admin from 192.168.122.105 port 38658 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 09:46:12 np0005604215.localdomain systemd-logind[761]: New session 70 of user ceph-admin.
Feb 01 09:46:12 np0005604215.localdomain systemd[1]: Started Session 70 of User ceph-admin.
Feb 01 09:46:12 np0005604215.localdomain sshd[292745]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 11111 keys, 22775032 bytes, temperature: kUnknown
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939172811470, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 22775032, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 22711025, "index_size": 35106, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27845, "raw_key_size": 297350, "raw_average_key_size": 26, "raw_value_size": 22520845, "raw_average_value_size": 2026, "num_data_blocks": 1333, "num_entries": 11111, "num_filter_entries": 11111, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938864, "oldest_key_time": 0, "file_creation_time": 1769939172, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.811722) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 22775032 bytes
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.815981) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 192.6 rd, 185.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.1, 18.5 +0.0 blob) out(21.7 +0.0 blob), read-write-amplify(10.9) write-amplify(5.3) OK, records in: 11623, records dropped: 512 output_compression: NoCompression
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.815998) EVENT_LOG_v1 {"time_micros": 1769939172815991, "job": 14, "event": "compaction_finished", "compaction_time_micros": 123121, "compaction_time_cpu_micros": 27234, "output_level": 6, "num_output_files": 1, "total_output_size": 22775032, "num_input_records": 11623, "num_output_records": 11111, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939172816370, "job": 14, "event": "table_file_deletion", "file_number": 29}
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939172817695, "job": 14, "event": "table_file_deletion", "file_number": 27}
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.687783) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.817760) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.817766) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.817768) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.817770) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:12 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:12.817772) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:12 np0005604215.localdomain sudo[292754]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:12 np0005604215.localdomain sudo[292754]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:12 np0005604215.localdomain sudo[292754]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'crash'
Feb 01 09:46:12 np0005604215.localdomain sudo[292772]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:46:12 np0005604215.localdomain sudo[292772]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 01 09:46:12 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'dashboard'
Feb 01 09:46:12 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:12.945+0000 7f947e9a3140 -1 mgr[py] Module crash has missing NOTIFY_TYPES member
Feb 01 09:46:13 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'devicehealth'
Feb 01 09:46:13 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 01 09:46:13 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'diskprediction_local'
Feb 01 09:46:13 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:13.539+0000 7f947e9a3140 -1 mgr[py] Module devicehealth has missing NOTIFY_TYPES member
Feb 01 09:46:13 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: /lib64/python3.9/site-packages/scipy/__init__.py:73: UserWarning: NumPy was imported from a Python sub-interpreter but NumPy does not properly support sub-interpreters. This will likely work for most users but might cause hard to track down issues or subtle bugs. A common user of the rare sub-interpreter feature is wsgi which also allows single-interpreter mode.
Feb 01 09:46:13 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Improvements in the case of bugs are welcome, but is not on the NumPy roadmap, and full support may require significant effort to achieve.
Feb 01 09:46:13 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]:   from numpy import show_config as show_numpy_config
Feb 01 09:46:13 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 01 09:46:13 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:13.672+0000 7f947e9a3140 -1 mgr[py] Module diskprediction_local has missing NOTIFY_TYPES member
Feb 01 09:46:13 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'influx'
Feb 01 09:46:13 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 01 09:46:13 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'insights'
Feb 01 09:46:13 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:13.730+0000 7f947e9a3140 -1 mgr[py] Module influx has missing NOTIFY_TYPES member
Feb 01 09:46:13 np0005604215.localdomain podman[292863]: 2026-02-01 09:46:13.757552768 +0000 UTC m=+0.100613361 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, com.redhat.component=rhceph-container, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, GIT_CLEAN=True, ceph=True, io.buildah.version=1.41.4, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:46:13 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'iostat'
Feb 01 09:46:13 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 01 09:46:13 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'k8sevents'
Feb 01 09:46:13 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:13.841+0000 7f947e9a3140 -1 mgr[py] Module iostat has missing NOTIFY_TYPES member
Feb 01 09:46:13 np0005604215.localdomain podman[292863]: 2026-02-01 09:46:13.858756416 +0000 UTC m=+0.201817009 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, version=7, ceph=True, CEPH_POINT_RELEASE=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, vendor=Red Hat, Inc., vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, architecture=x86_64)
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'localpool'
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: mgrmap e42: np0005604211.cuflqz(active, since 1.2233s), standbys: np0005604209.isqrps, np0005604213.caiaeh, np0005604212.oynhpm
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:46:13] ENGINE Bus STARTING
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:46:13] ENGINE Serving on http://172.18.0.105:8765
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:46:13] ENGINE Serving on https://172.18.0.105:7150
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:46:13] ENGINE Bus STARTED
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:46:13] ENGINE Client ('172.18.0.105', 33394) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'mds_autoscaler'
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.292512) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174292933, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 297, "num_deletes": 251, "total_data_size": 935403, "memory_usage": 953808, "flush_reason": "Manual Compaction"}
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174299586, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 622124, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 21518, "largest_seqno": 21810, "table_properties": {"data_size": 620159, "index_size": 204, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5516, "raw_average_key_size": 19, "raw_value_size": 616084, "raw_average_value_size": 2200, "num_data_blocks": 10, "num_entries": 280, "num_filter_entries": 280, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939173, "oldest_key_time": 1769939173, "file_creation_time": 1769939174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 7392 microseconds, and 3615 cpu microseconds.
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.299905) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 622124 bytes OK
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.300068) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.302223) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.302249) EVENT_LOG_v1 {"time_micros": 1769939174302243, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.302272) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 933193, prev total WAL file size 949465, number of live WAL files 2.
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.306483) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003130353432' seq:72057594037927935, type:22 .. '7061786F73003130373934' seq:0, type:0; will stop at (end)
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(607KB)], [30(21MB)]
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174306536, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 23397156, "oldest_snapshot_seqno": -1}
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'mirroring'
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 10876 keys, 19748803 bytes, temperature: kUnknown
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174405619, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 19748803, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19688809, "index_size": 31733, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 27205, "raw_key_size": 292945, "raw_average_key_size": 26, "raw_value_size": 19504975, "raw_average_value_size": 1793, "num_data_blocks": 1188, "num_entries": 10876, "num_filter_entries": 10876, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769938864, "oldest_key_time": 0, "file_creation_time": 1769939174, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c9a40fa3-7e53-4325-8a76-a86e4a0fff5d", "db_session_id": "7PKSWXLLH9M8NB5FULPW", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.406054) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 19748803 bytes
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.411411) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 235.9 rd, 199.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 21.7 +0.0 blob) out(18.8 +0.0 blob), read-write-amplify(69.4) write-amplify(31.7) OK, records in: 11391, records dropped: 515 output_compression: NoCompression
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.411452) EVENT_LOG_v1 {"time_micros": 1769939174411434, "job": 16, "event": "compaction_finished", "compaction_time_micros": 99201, "compaction_time_cpu_micros": 38956, "output_level": 6, "num_output_files": 1, "total_output_size": 19748803, "num_input_records": 11391, "num_output_records": 10876, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174412435, "job": 16, "event": "table_file_deletion", "file_number": 32}
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939174417327, "job": 16, "event": "table_file_deletion", "file_number": 30}
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.306413) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.417578) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.417585) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.417587) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.417589) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:14 np0005604215.localdomain ceph-mon[278949]: rocksdb: (Original Log Time 2026/02/01-09:46:14.417591) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'nfs'
Feb 01 09:46:14 np0005604215.localdomain sudo[292772]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:14 np0005604215.localdomain sudo[292985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:14 np0005604215.localdomain sudo[292985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:14 np0005604215.localdomain sudo[292985]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'orchestrator'
Feb 01 09:46:14 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:14.584+0000 7f947e9a3140 -1 mgr[py] Module nfs has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain sudo[293003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:46:14 np0005604215.localdomain sudo[293003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'osd_perf_query'
Feb 01 09:46:14 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:14.725+0000 7f947e9a3140 -1 mgr[py] Module orchestrator has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'osd_support'
Feb 01 09:46:14 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:14.788+0000 7f947e9a3140 -1 mgr[py] Module osd_perf_query has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'pg_autoscaler'
Feb 01 09:46:14 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:14.842+0000 7f947e9a3140 -1 mgr[py] Module osd_support has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'progress'
Feb 01 09:46:14 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:14.906+0000 7f947e9a3140 -1 mgr[py] Module pg_autoscaler has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:14.965+0000 7f947e9a3140 -1 mgr[py] Module progress has missing NOTIFY_TYPES member
Feb 01 09:46:14 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'prometheus'
Feb 01 09:46:15 np0005604215.localdomain ceph-mon[278949]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 01 09:46:15 np0005604215.localdomain ceph-mon[278949]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 01 09:46:15 np0005604215.localdomain ceph-mon[278949]: Cluster is now healthy
Feb 01 09:46:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:15 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:15 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 01 09:46:15 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'rbd_support'
Feb 01 09:46:15 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:15.255+0000 7f947e9a3140 -1 mgr[py] Module prometheus has missing NOTIFY_TYPES member
Feb 01 09:46:15 np0005604215.localdomain sudo[293003]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:15 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 01 09:46:15 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'restful'
Feb 01 09:46:15 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:15.336+0000 7f947e9a3140 -1 mgr[py] Module rbd_support has missing NOTIFY_TYPES member
Feb 01 09:46:15 np0005604215.localdomain sudo[293052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:15 np0005604215.localdomain sudo[293052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:15 np0005604215.localdomain sudo[293052]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:15 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'rgw'
Feb 01 09:46:15 np0005604215.localdomain sudo[293070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:46:15 np0005604215.localdomain sudo[293070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:15 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 01 09:46:15 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'rook'
Feb 01 09:46:15 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:15.656+0000 7f947e9a3140 -1 mgr[py] Module rgw has missing NOTIFY_TYPES member
Feb 01 09:46:15 np0005604215.localdomain sudo[293070]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'selftest'
Feb 01 09:46:16 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:16.068+0000 7f947e9a3140 -1 mgr[py] Module rook has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain sudo[293107]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:46:16 np0005604215.localdomain sudo[293107]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:46:16 np0005604215.localdomain sudo[293107]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'snap_schedule'
Feb 01 09:46:16 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:16.127+0000 7f947e9a3140 -1 mgr[py] Module selftest has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'stats'
Feb 01 09:46:16 np0005604215.localdomain sudo[293131]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:46:16 np0005604215.localdomain sudo[293131]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:16 np0005604215.localdomain sudo[293131]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'status'
Feb 01 09:46:16 np0005604215.localdomain podman[293125]: 2026-02-01 09:46:16.265937675 +0000 UTC m=+0.145106853 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:46:16 np0005604215.localdomain podman[293125]: 2026-02-01 09:46:16.301063355 +0000 UTC m=+0.180232563 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:46:16 np0005604215.localdomain sudo[293153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:46:16 np0005604215.localdomain sudo[293153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: mgrmap e43: np0005604211.cuflqz(active, since 3s), standbys: np0005604209.isqrps, np0005604213.caiaeh, np0005604212.oynhpm
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd/host:np0005604211", "name": "osd_memory_target"} : dispatch
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:46:16 np0005604215.localdomain ceph-mon[278949]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:16 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module status has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'telegraf'
Feb 01 09:46:16 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:16.312+0000 7f947e9a3140 -1 mgr[py] Module status has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain sudo[293153]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'telemetry'
Feb 01 09:46:16 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:16.369+0000 7f947e9a3140 -1 mgr[py] Module telegraf has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain sudo[293177]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:16 np0005604215.localdomain sudo[293177]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:16 np0005604215.localdomain sudo[293177]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:16 np0005604215.localdomain sudo[293195]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:46:16 np0005604215.localdomain sudo[293195]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:16 np0005604215.localdomain sudo[293195]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'test_orchestrator'
Feb 01 09:46:16 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:16.498+0000 7f947e9a3140 -1 mgr[py] Module telemetry has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain sudo[293229]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:46:16 np0005604215.localdomain sudo[293229]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:16 np0005604215.localdomain sudo[293229]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'volumes'
Feb 01 09:46:16 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:16.640+0000 7f947e9a3140 -1 mgr[py] Module test_orchestrator has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain sudo[293247]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:46:16 np0005604215.localdomain sudo[293247]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:16 np0005604215.localdomain sudo[293247]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:16 np0005604215.localdomain sudo[293265]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:46:16 np0005604215.localdomain sudo[293265]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:16 np0005604215.localdomain sudo[293265]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:16 np0005604215.localdomain sudo[293283]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:46:16 np0005604215.localdomain sudo[293283]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:16 np0005604215.localdomain sudo[293283]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Loading python module 'zabbix'
Feb 01 09:46:16 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:16.824+0000 7f947e9a3140 -1 mgr[py] Module volumes has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:46:16.881+0000 7f947e9a3140 -1 mgr[py] Module zabbix has missing NOTIFY_TYPES member
Feb 01 09:46:16 np0005604215.localdomain sudo[293301]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55d1775411e0 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Feb 01 09:46:16 np0005604215.localdomain sudo[293301]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:16 np0005604215.localdomain sudo[293301]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:16 np0005604215.localdomain ceph-mgr[278126]: client.0 ms_handle_reset on v2:172.18.0.105:6800/155238379
Feb 01 09:46:16 np0005604215.localdomain sudo[293319]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:46:16 np0005604215.localdomain sudo[293319]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:16 np0005604215.localdomain sudo[293319]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain sudo[293337]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:17 np0005604215.localdomain sudo[293337]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293337]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain sudo[293355]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:46:17 np0005604215.localdomain sudo[293355]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293355]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain sudo[293389]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:46:17 np0005604215.localdomain sudo[293389]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293389]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:46:17 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:46:17 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:46:17 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:46:17 np0005604215.localdomain ceph-mon[278949]: Standby manager daemon np0005604215.uhhqtv started
Feb 01 09:46:17 np0005604215.localdomain sudo[293407]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:46:17 np0005604215.localdomain sudo[293407]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293407]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain sudo[293425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:46:17 np0005604215.localdomain sudo[293425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293425]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain sudo[293443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:46:17 np0005604215.localdomain sudo[293443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293443]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain sudo[293461]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:46:17 np0005604215.localdomain sudo[293461]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293461]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain sudo[293479]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:46:17 np0005604215.localdomain sudo[293479]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293479]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain sudo[293497]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:17 np0005604215.localdomain sudo[293497]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293497]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain sudo[293515]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:46:17 np0005604215.localdomain sudo[293515]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293515]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain sudo[293549]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:46:17 np0005604215.localdomain sudo[293549]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293549]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:17 np0005604215.localdomain sudo[293567]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:46:17 np0005604215.localdomain sudo[293567]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:17 np0005604215.localdomain sudo[293567]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:18 np0005604215.localdomain sudo[293585]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 01 09:46:18 np0005604215.localdomain sudo[293585]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:18 np0005604215.localdomain sudo[293585]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:18 np0005604215.localdomain sudo[293603]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:46:18 np0005604215.localdomain sudo[293603]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:18 np0005604215.localdomain sudo[293603]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:18 np0005604215.localdomain sudo[293621]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:46:18 np0005604215.localdomain sudo[293621]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:18 np0005604215.localdomain sudo[293621]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:18 np0005604215.localdomain sudo[293639]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:46:18 np0005604215.localdomain sudo[293639]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:18 np0005604215.localdomain sudo[293639]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:46:18 np0005604215.localdomain ceph-mon[278949]: mgrmap e44: np0005604211.cuflqz(active, since 5s), standbys: np0005604209.isqrps, np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:46:18 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:46:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:46:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:46:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:46:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:46:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:46:18 np0005604215.localdomain ceph-mon[278949]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:46:18 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:46:18 np0005604215.localdomain sudo[293657]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:18 np0005604215.localdomain sudo[293657]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:18 np0005604215.localdomain sudo[293657]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:18 np0005604215.localdomain sudo[293675]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:46:18 np0005604215.localdomain sudo[293675]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:18 np0005604215.localdomain sudo[293675]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:18 np0005604215.localdomain sudo[293709]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:46:18 np0005604215.localdomain sudo[293709]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:18 np0005604215.localdomain sudo[293709]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:18 np0005604215.localdomain sudo[293727]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:46:18 np0005604215.localdomain sudo[293727]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:18 np0005604215.localdomain sudo[293727]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:18 np0005604215.localdomain sudo[293745]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:46:18 np0005604215.localdomain sudo[293745]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:18 np0005604215.localdomain sudo[293745]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:46:19 np0005604215.localdomain sudo[293763]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:46:19 np0005604215.localdomain sudo[293763]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:19 np0005604215.localdomain sudo[293763]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 0 B/s wr, 20 op/s
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:46:19 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:20 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604211 (monmap changed)...
Feb 01 09:46:20 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604211 on np0005604211.localdomain
Feb 01 09:46:20 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 01 09:46:20 np0005604215.localdomain ceph-mon[278949]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 01 09:46:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:20 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604213 (monmap changed)...
Feb 01 09:46:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:46:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:46:20 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:20 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 01 09:46:21 np0005604215.localdomain sudo[293781]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:21 np0005604215.localdomain sudo[293781]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:21 np0005604215.localdomain sudo[293781]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:21 np0005604215.localdomain sudo[293799]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:21 np0005604215.localdomain sudo[293799]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:21 np0005604215.localdomain podman[293834]: 
Feb 01 09:46:21 np0005604215.localdomain podman[293834]: 2026-02-01 09:46:21.568571589 +0000 UTC m=+0.056952343 container create 98950d09a78752b281c28ebedddf5a4f4059657c22fcc2fdf9fba3eccd192418 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_lumiere, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, vcs-type=git, architecture=x86_64, io.openshift.tags=rhceph ceph, name=rhceph, vendor=Red Hat, Inc., distribution-scope=public, version=7, io.buildah.version=1.41.4, GIT_BRANCH=main, CEPH_POINT_RELEASE=, ceph=True, io.openshift.expose-services=, RELEASE=main, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:46:21 np0005604215.localdomain systemd[1]: Started libpod-conmon-98950d09a78752b281c28ebedddf5a4f4059657c22fcc2fdf9fba3eccd192418.scope.
Feb 01 09:46:21 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:46:21 np0005604215.localdomain podman[293834]: 2026-02-01 09:46:21.632249343 +0000 UTC m=+0.120630087 container init 98950d09a78752b281c28ebedddf5a4f4059657c22fcc2fdf9fba3eccd192418 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_lumiere, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc., distribution-scope=public, name=rhceph, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, RELEASE=main, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:46:21 np0005604215.localdomain podman[293834]: 2026-02-01 09:46:21.538399304 +0000 UTC m=+0.026780078 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:46:21 np0005604215.localdomain podman[293834]: 2026-02-01 09:46:21.642040019 +0000 UTC m=+0.130420773 container start 98950d09a78752b281c28ebedddf5a4f4059657c22fcc2fdf9fba3eccd192418 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_lumiere, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, release=1764794109, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, io.openshift.expose-services=, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vendor=Red Hat, Inc.)
Feb 01 09:46:21 np0005604215.localdomain podman[293834]: 2026-02-01 09:46:21.642281697 +0000 UTC m=+0.130662441 container attach 98950d09a78752b281c28ebedddf5a4f4059657c22fcc2fdf9fba3eccd192418 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_lumiere, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_CLEAN=True, version=7, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, io.openshift.expose-services=, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, io.openshift.tags=rhceph ceph, distribution-scope=public, vcs-type=git)
Feb 01 09:46:21 np0005604215.localdomain sad_lumiere[293850]: 167 167
Feb 01 09:46:21 np0005604215.localdomain systemd[1]: libpod-98950d09a78752b281c28ebedddf5a4f4059657c22fcc2fdf9fba3eccd192418.scope: Deactivated successfully.
Feb 01 09:46:21 np0005604215.localdomain podman[293834]: 2026-02-01 09:46:21.645979813 +0000 UTC m=+0.134360587 container died 98950d09a78752b281c28ebedddf5a4f4059657c22fcc2fdf9fba3eccd192418 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_lumiere, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, com.redhat.component=rhceph-container, release=1764794109, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, version=7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 09:46:21 np0005604215.localdomain podman[293855]: 2026-02-01 09:46:21.739421778 +0000 UTC m=+0.085246569 container remove 98950d09a78752b281c28ebedddf5a4f4059657c22fcc2fdf9fba3eccd192418 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sad_lumiere, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., release=1764794109, name=rhceph, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.expose-services=, version=7, vcs-type=git, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:46:21 np0005604215.localdomain systemd[1]: libpod-conmon-98950d09a78752b281c28ebedddf5a4f4059657c22fcc2fdf9fba3eccd192418.scope: Deactivated successfully.
Feb 01 09:46:21 np0005604215.localdomain sudo[293799]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:21 np0005604215.localdomain sudo[293873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:21 np0005604215.localdomain sudo[293873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:46:21 np0005604215.localdomain sudo[293873]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:21 np0005604215.localdomain sudo[293893]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:21 np0005604215.localdomain sudo[293893]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:22 np0005604215.localdomain ceph-mon[278949]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 15 op/s
Feb 01 09:46:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:22 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 01 09:46:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:22 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 01 09:46:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 01 09:46:22 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:22 np0005604215.localdomain podman[293891]: 2026-02-01 09:46:22.029985534 +0000 UTC m=+0.085318892 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:46:22 np0005604215.localdomain podman[293891]: 2026-02-01 09:46:22.044712655 +0000 UTC m=+0.100045993 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:46:22 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:46:22 np0005604215.localdomain podman[293950]: 
Feb 01 09:46:22 np0005604215.localdomain podman[293950]: 2026-02-01 09:46:22.437811772 +0000 UTC m=+0.072324495 container create 9057ef178b0eaf3fb98e297138d954d5b409604b998470d3bb090718061315e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_archimedes, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, ceph=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, name=rhceph, architecture=x86_64, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, description=Red Hat Ceph Storage 7)
Feb 01 09:46:22 np0005604215.localdomain systemd[1]: Started libpod-conmon-9057ef178b0eaf3fb98e297138d954d5b409604b998470d3bb090718061315e1.scope.
Feb 01 09:46:22 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:46:22 np0005604215.localdomain podman[293950]: 2026-02-01 09:46:22.499901686 +0000 UTC m=+0.134414369 container init 9057ef178b0eaf3fb98e297138d954d5b409604b998470d3bb090718061315e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_archimedes, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.openshift.expose-services=, RELEASE=main, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, ceph=True, vendor=Red Hat, Inc., vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:46:22 np0005604215.localdomain podman[293950]: 2026-02-01 09:46:22.508428863 +0000 UTC m=+0.142941556 container start 9057ef178b0eaf3fb98e297138d954d5b409604b998470d3bb090718061315e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_archimedes, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4)
Feb 01 09:46:22 np0005604215.localdomain podman[293950]: 2026-02-01 09:46:22.409520476 +0000 UTC m=+0.044033209 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:46:22 np0005604215.localdomain podman[293950]: 2026-02-01 09:46:22.508736222 +0000 UTC m=+0.143248945 container attach 9057ef178b0eaf3fb98e297138d954d5b409604b998470d3bb090718061315e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_archimedes, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, version=7, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, distribution-scope=public, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, ceph=True, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_BRANCH=main)
Feb 01 09:46:22 np0005604215.localdomain serene_archimedes[293965]: 167 167
Feb 01 09:46:22 np0005604215.localdomain systemd[1]: libpod-9057ef178b0eaf3fb98e297138d954d5b409604b998470d3bb090718061315e1.scope: Deactivated successfully.
Feb 01 09:46:22 np0005604215.localdomain podman[293950]: 2026-02-01 09:46:22.511691235 +0000 UTC m=+0.146203918 container died 9057ef178b0eaf3fb98e297138d954d5b409604b998470d3bb090718061315e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_archimedes, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, ceph=True, RELEASE=main, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, distribution-scope=public, name=rhceph, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, vcs-type=git, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:46:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-55655cd690fecfa1d4e0404d797cbebf575f77fd9568a992fbad2b73b46c62e0-merged.mount: Deactivated successfully.
Feb 01 09:46:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-7355f4304d8145a5ffd2a4c7ab5e9c091c729df4505fb0e16731f5c869b96903-merged.mount: Deactivated successfully.
Feb 01 09:46:22 np0005604215.localdomain podman[293970]: 2026-02-01 09:46:22.606889205 +0000 UTC m=+0.086275622 container remove 9057ef178b0eaf3fb98e297138d954d5b409604b998470d3bb090718061315e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=serene_archimedes, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, version=7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, ceph=True, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=)
Feb 01 09:46:22 np0005604215.localdomain systemd[1]: libpod-conmon-9057ef178b0eaf3fb98e297138d954d5b409604b998470d3bb090718061315e1.scope: Deactivated successfully.
Feb 01 09:46:22 np0005604215.localdomain sudo[293893]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:22 np0005604215.localdomain sudo[293993]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:22 np0005604215.localdomain sudo[293993]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:22 np0005604215.localdomain sudo[293993]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:22 np0005604215.localdomain sudo[294011]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:22 np0005604215.localdomain sudo[294011]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:23 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.2 (monmap changed)...
Feb 01 09:46:23 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 01 09:46:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 01 09:46:23 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:23 np0005604215.localdomain podman[294047]: 
Feb 01 09:46:23 np0005604215.localdomain podman[294047]: 2026-02-01 09:46:23.433918976 +0000 UTC m=+0.075704641 container create 22f6591f5e1ab28a726996327dac521bc09600b560b999ced150ae556b94e112 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_maxwell, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, vcs-type=git, distribution-scope=public, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, name=rhceph, version=7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, ceph=True, release=1764794109)
Feb 01 09:46:23 np0005604215.localdomain systemd[1]: Started libpod-conmon-22f6591f5e1ab28a726996327dac521bc09600b560b999ced150ae556b94e112.scope.
Feb 01 09:46:23 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:46:23 np0005604215.localdomain podman[294047]: 2026-02-01 09:46:23.492480649 +0000 UTC m=+0.134266324 container init 22f6591f5e1ab28a726996327dac521bc09600b560b999ced150ae556b94e112 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_maxwell, GIT_BRANCH=main, GIT_CLEAN=True, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, architecture=x86_64, build-date=2025-12-08T17:28:53Z, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, version=7, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, RELEASE=main, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:46:23 np0005604215.localdomain podman[294047]: 2026-02-01 09:46:23.501562793 +0000 UTC m=+0.143348468 container start 22f6591f5e1ab28a726996327dac521bc09600b560b999ced150ae556b94e112 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_maxwell, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, distribution-scope=public, name=rhceph, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., release=1764794109)
Feb 01 09:46:23 np0005604215.localdomain podman[294047]: 2026-02-01 09:46:23.50177286 +0000 UTC m=+0.143558525 container attach 22f6591f5e1ab28a726996327dac521bc09600b560b999ced150ae556b94e112 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_maxwell, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, version=7, distribution-scope=public, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, RELEASE=main, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=)
Feb 01 09:46:23 np0005604215.localdomain stoic_maxwell[294062]: 167 167
Feb 01 09:46:23 np0005604215.localdomain systemd[1]: libpod-22f6591f5e1ab28a726996327dac521bc09600b560b999ced150ae556b94e112.scope: Deactivated successfully.
Feb 01 09:46:23 np0005604215.localdomain podman[294047]: 2026-02-01 09:46:23.404389772 +0000 UTC m=+0.046175437 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:46:23 np0005604215.localdomain podman[294047]: 2026-02-01 09:46:23.504432193 +0000 UTC m=+0.146217868 container died 22f6591f5e1ab28a726996327dac521bc09600b560b999ced150ae556b94e112 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_maxwell, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main)
Feb 01 09:46:23 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c6e646e99461bd97fd31ba8fe7b9c7950b64b59f9a3df4dd987b2f788144cea4-merged.mount: Deactivated successfully.
Feb 01 09:46:23 np0005604215.localdomain podman[294067]: 2026-02-01 09:46:23.600747109 +0000 UTC m=+0.088079629 container remove 22f6591f5e1ab28a726996327dac521bc09600b560b999ced150ae556b94e112 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=stoic_maxwell, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, distribution-scope=public, name=rhceph, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, CEPH_POINT_RELEASE=, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, vendor=Red Hat, Inc., GIT_BRANCH=main, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Feb 01 09:46:23 np0005604215.localdomain systemd[1]: libpod-conmon-22f6591f5e1ab28a726996327dac521bc09600b560b999ced150ae556b94e112.scope: Deactivated successfully.
Feb 01 09:46:23 np0005604215.localdomain sudo[294011]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:23 np0005604215.localdomain sudo[294090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:23 np0005604215.localdomain sudo[294090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:23 np0005604215.localdomain sudo[294090]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:23 np0005604215.localdomain sudo[294108]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:23 np0005604215.localdomain sudo[294108]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:24 np0005604215.localdomain ceph-mon[278949]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 12 op/s
Feb 01 09:46:24 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.5 (monmap changed)...
Feb 01 09:46:24 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 01 09:46:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:46:24 np0005604215.localdomain podman[294142]: 
Feb 01 09:46:24 np0005604215.localdomain podman[294142]: 2026-02-01 09:46:24.434142749 +0000 UTC m=+0.077301380 container create 1de7e6cf76dcd093f1b04eb9b4aa9099da292a89196c98eacf4e07b60ad8e7c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_einstein, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, ceph=True, release=1764794109, architecture=x86_64, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git)
Feb 01 09:46:24 np0005604215.localdomain systemd[1]: Started libpod-conmon-1de7e6cf76dcd093f1b04eb9b4aa9099da292a89196c98eacf4e07b60ad8e7c5.scope.
Feb 01 09:46:24 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:46:24 np0005604215.localdomain podman[294142]: 2026-02-01 09:46:24.492890348 +0000 UTC m=+0.136048919 container init 1de7e6cf76dcd093f1b04eb9b4aa9099da292a89196c98eacf4e07b60ad8e7c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_einstein, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., architecture=x86_64, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, distribution-scope=public, release=1764794109, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True)
Feb 01 09:46:24 np0005604215.localdomain podman[294142]: 2026-02-01 09:46:24.501222819 +0000 UTC m=+0.144381390 container start 1de7e6cf76dcd093f1b04eb9b4aa9099da292a89196c98eacf4e07b60ad8e7c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_einstein, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, version=7, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, GIT_CLEAN=True, name=rhceph, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, architecture=x86_64)
Feb 01 09:46:24 np0005604215.localdomain podman[294142]: 2026-02-01 09:46:24.50157031 +0000 UTC m=+0.144728951 container attach 1de7e6cf76dcd093f1b04eb9b4aa9099da292a89196c98eacf4e07b60ad8e7c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_einstein, release=1764794109, vcs-type=git, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, vendor=Red Hat, Inc., architecture=x86_64, CEPH_POINT_RELEASE=, GIT_BRANCH=main, distribution-scope=public, io.openshift.expose-services=, ceph=True, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 09:46:24 np0005604215.localdomain frosty_einstein[294158]: 167 167
Feb 01 09:46:24 np0005604215.localdomain systemd[1]: libpod-1de7e6cf76dcd093f1b04eb9b4aa9099da292a89196c98eacf4e07b60ad8e7c5.scope: Deactivated successfully.
Feb 01 09:46:24 np0005604215.localdomain podman[294142]: 2026-02-01 09:46:24.403798229 +0000 UTC m=+0.046956850 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:46:24 np0005604215.localdomain podman[294142]: 2026-02-01 09:46:24.50413546 +0000 UTC m=+0.147294051 container died 1de7e6cf76dcd093f1b04eb9b4aa9099da292a89196c98eacf4e07b60ad8e7c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_einstein, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, architecture=x86_64, name=rhceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, description=Red Hat Ceph Storage 7, distribution-scope=public, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, GIT_BRANCH=main, RELEASE=main)
Feb 01 09:46:24 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-182c22d55c8d3da1a484822226743b2b5da0237dc7a0e9f0b6834c12333edfc4-merged.mount: Deactivated successfully.
Feb 01 09:46:24 np0005604215.localdomain podman[294163]: 2026-02-01 09:46:24.603428679 +0000 UTC m=+0.089208934 container remove 1de7e6cf76dcd093f1b04eb9b4aa9099da292a89196c98eacf4e07b60ad8e7c5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=frosty_einstein, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, GIT_BRANCH=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_CLEAN=True, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:46:24 np0005604215.localdomain systemd[1]: libpod-conmon-1de7e6cf76dcd093f1b04eb9b4aa9099da292a89196c98eacf4e07b60ad8e7c5.scope: Deactivated successfully.
Feb 01 09:46:24 np0005604215.localdomain sudo[294108]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:24 np0005604215.localdomain sudo[294180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:24 np0005604215.localdomain sudo[294180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:24 np0005604215.localdomain sudo[294180]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:24 np0005604215.localdomain sudo[294198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:24 np0005604215.localdomain sudo[294198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:25 np0005604215.localdomain ceph-mon[278949]: from='client.54143 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:46:25 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 01 09:46:25 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 01 09:46:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:46:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:25 np0005604215.localdomain podman[294233]: 
Feb 01 09:46:25 np0005604215.localdomain podman[294233]: 2026-02-01 09:46:25.26052094 +0000 UTC m=+0.066570376 container create 3c92c8dbf22e88f763f483ea9e812282bbdd295d298cee569076156f37048eea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_chandrasekhar, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, distribution-scope=public, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.openshift.tags=rhceph ceph, vcs-type=git, GIT_CLEAN=True, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, ceph=True, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:46:25 np0005604215.localdomain systemd[1]: Started libpod-conmon-3c92c8dbf22e88f763f483ea9e812282bbdd295d298cee569076156f37048eea.scope.
Feb 01 09:46:25 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:46:25 np0005604215.localdomain podman[294233]: 2026-02-01 09:46:25.314464308 +0000 UTC m=+0.120513774 container init 3c92c8dbf22e88f763f483ea9e812282bbdd295d298cee569076156f37048eea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_chandrasekhar, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, release=1764794109, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, name=rhceph, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, architecture=x86_64, vcs-type=git, ceph=True)
Feb 01 09:46:25 np0005604215.localdomain podman[294233]: 2026-02-01 09:46:25.322014575 +0000 UTC m=+0.128064001 container start 3c92c8dbf22e88f763f483ea9e812282bbdd295d298cee569076156f37048eea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_chandrasekhar, vendor=Red Hat, Inc., vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, release=1764794109, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_BRANCH=main, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, io.openshift.expose-services=, com.redhat.component=rhceph-container)
Feb 01 09:46:25 np0005604215.localdomain podman[294233]: 2026-02-01 09:46:25.32216612 +0000 UTC m=+0.128215586 container attach 3c92c8dbf22e88f763f483ea9e812282bbdd295d298cee569076156f37048eea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_chandrasekhar, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, release=1764794109, io.openshift.tags=rhceph ceph, name=rhceph, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, vcs-type=git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main)
Feb 01 09:46:25 np0005604215.localdomain pensive_chandrasekhar[294248]: 167 167
Feb 01 09:46:25 np0005604215.localdomain systemd[1]: libpod-3c92c8dbf22e88f763f483ea9e812282bbdd295d298cee569076156f37048eea.scope: Deactivated successfully.
Feb 01 09:46:25 np0005604215.localdomain podman[294233]: 2026-02-01 09:46:25.325799523 +0000 UTC m=+0.131848999 container died 3c92c8dbf22e88f763f483ea9e812282bbdd295d298cee569076156f37048eea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_chandrasekhar, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_BRANCH=main, architecture=x86_64, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, vendor=Red Hat, Inc., ceph=True, name=rhceph, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:46:25 np0005604215.localdomain podman[294233]: 2026-02-01 09:46:25.235125615 +0000 UTC m=+0.041175121 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:46:25 np0005604215.localdomain podman[294253]: 2026-02-01 09:46:25.41031064 +0000 UTC m=+0.072228673 container remove 3c92c8dbf22e88f763f483ea9e812282bbdd295d298cee569076156f37048eea (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=pensive_chandrasekhar, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, version=7, distribution-scope=public, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., release=1764794109, architecture=x86_64, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:46:25 np0005604215.localdomain systemd[1]: libpod-conmon-3c92c8dbf22e88f763f483ea9e812282bbdd295d298cee569076156f37048eea.scope: Deactivated successfully.
Feb 01 09:46:25 np0005604215.localdomain sudo[294198]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:25 np0005604215.localdomain sudo[294269]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:25 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-2b1b8a8c9e98b7aa8b652aab237c92cd4c11beb3559288e18608219177f3cdd5-merged.mount: Deactivated successfully.
Feb 01 09:46:25 np0005604215.localdomain sudo[294269]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:25 np0005604215.localdomain sudo[294269]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:25 np0005604215.localdomain sudo[294287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:25 np0005604215.localdomain sudo[294287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:25.972 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:46:26 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)...
Feb 01 09:46:26 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain
Feb 01 09:46:26 np0005604215.localdomain ceph-mon[278949]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:46:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:46:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:46:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:26 np0005604215.localdomain podman[294322]: 
Feb 01 09:46:26 np0005604215.localdomain podman[294322]: 2026-02-01 09:46:26.139933291 +0000 UTC m=+0.087573293 container create dcbca3df31049aa99a02d838ee0849fde295b92c7c6c69ad04e2b6252e50a22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, architecture=x86_64, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7)
Feb 01 09:46:26 np0005604215.localdomain systemd[1]: Started libpod-conmon-dcbca3df31049aa99a02d838ee0849fde295b92c7c6c69ad04e2b6252e50a22f.scope.
Feb 01 09:46:26 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:46:26 np0005604215.localdomain podman[294322]: 2026-02-01 09:46:26.196241853 +0000 UTC m=+0.143881855 container init dcbca3df31049aa99a02d838ee0849fde295b92c7c6c69ad04e2b6252e50a22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, release=1764794109, com.redhat.component=rhceph-container, GIT_BRANCH=main, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., io.openshift.expose-services=, io.buildah.version=1.41.4)
Feb 01 09:46:26 np0005604215.localdomain podman[294322]: 2026-02-01 09:46:26.20540152 +0000 UTC m=+0.153041532 container start dcbca3df31049aa99a02d838ee0849fde295b92c7c6c69ad04e2b6252e50a22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, vcs-type=git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, ceph=True, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, distribution-scope=public, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, io.buildah.version=1.41.4)
Feb 01 09:46:26 np0005604215.localdomain quizzical_albattani[294337]: 167 167
Feb 01 09:46:26 np0005604215.localdomain podman[294322]: 2026-02-01 09:46:26.108453175 +0000 UTC m=+0.056093207 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:46:26 np0005604215.localdomain podman[294322]: 2026-02-01 09:46:26.20763589 +0000 UTC m=+0.155275902 container attach dcbca3df31049aa99a02d838ee0849fde295b92c7c6c69ad04e2b6252e50a22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, GIT_BRANCH=main, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, name=rhceph, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, distribution-scope=public, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True)
Feb 01 09:46:26 np0005604215.localdomain systemd[1]: libpod-dcbca3df31049aa99a02d838ee0849fde295b92c7c6c69ad04e2b6252e50a22f.scope: Deactivated successfully.
Feb 01 09:46:26 np0005604215.localdomain podman[294322]: 2026-02-01 09:46:26.209650073 +0000 UTC m=+0.157290105 container died dcbca3df31049aa99a02d838ee0849fde295b92c7c6c69ad04e2b6252e50a22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, name=rhceph, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, version=7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, vendor=Red Hat, Inc., io.buildah.version=1.41.4, release=1764794109, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:46:26 np0005604215.localdomain podman[294342]: 2026-02-01 09:46:26.30121078 +0000 UTC m=+0.079448039 container remove dcbca3df31049aa99a02d838ee0849fde295b92c7c6c69ad04e2b6252e50a22f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=quizzical_albattani, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, version=7, GIT_CLEAN=True, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, RELEASE=main, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, vcs-type=git, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, release=1764794109, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:46:26 np0005604215.localdomain systemd[1]: libpod-conmon-dcbca3df31049aa99a02d838ee0849fde295b92c7c6c69ad04e2b6252e50a22f.scope: Deactivated successfully.
Feb 01 09:46:26 np0005604215.localdomain sudo[294287]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:26 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-394099c3ffdb20646fdcc5a07e5886b7845fcf544a535cece1bf07640258b255-merged.mount: Deactivated successfully.
Feb 01 09:46:26 np0005604215.localdomain sudo[294359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:46:26 np0005604215.localdomain sudo[294359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:26 np0005604215.localdomain sudo[294359]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604215 (monmap changed)...
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604215 on np0005604215.localdomain
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:46:27 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55d1775411e0 mon_map magic: 0 from mon.1 v2:172.18.0.108:3300/0
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@1(peon) e13  my rank is now 0 (was 1)
Feb 01 09:46:27 np0005604215.localdomain ceph-mgr[278126]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Feb 01 09:46:27 np0005604215.localdomain ceph-mgr[278126]: client.0 ms_handle_reset on v2:172.18.0.108:3300/0
Feb 01 09:46:27 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55d17e178000 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: paxos.0).electionLogic(56) init, last seen epoch 56
Feb 01 09:46:27 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:46:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:46:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:46:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:46:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:46:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:46:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17790 "" "Go-http-client/1.1"
Feb 01 09:46:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:46:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:46:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:46:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:46:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:46:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:46:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:46:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:46:31 np0005604215.localdomain podman[294378]: 2026-02-01 09:46:31.889985644 +0000 UTC m=+0.094044946 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:46:31 np0005604215.localdomain systemd[1]: tmp-crun.YNiQAV.mount: Deactivated successfully.
Feb 01 09:46:31 np0005604215.localdomain podman[294377]: 2026-02-01 09:46:31.951464508 +0000 UTC m=+0.155688365 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127)
Feb 01 09:46:31 np0005604215.localdomain podman[294378]: 2026-02-01 09:46:31.957796866 +0000 UTC m=+0.161856148 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:46:31 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:46:31 np0005604215.localdomain podman[294377]: 2026-02-01 09:46:31.995739104 +0000 UTC m=+0.199963021 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 01 09:46:32 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 is new leader, mons np0005604215,np0005604213 in quorum (ranks 0,2)
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : monmap epoch 13
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : last_changed 2026-02-01T09:46:27.712705+0000
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : created 2026-02-01T07:37:52.883666+0000
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : election_strategy: 1
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : osdmap e88: 6 total, 6 up, 6 in
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : mgrmap e44: np0005604211.cuflqz(active, since 20s), standbys: np0005604209.isqrps, np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] : Health check failed: 1/3 mons down, quorum np0005604215,np0005604213 (MON_DOWN)
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005604215,np0005604213
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] :     stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] :     stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps']
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] : [WRN] MON_DOWN: 1/3 mons down, quorum np0005604215,np0005604213
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] :     mon.np0005604212 (rank 1) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum)
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: paxos.0).electionLogic(59) init, last seen epoch 59, mid-election, bumping
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(electing) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : mon.np0005604215 is new leader, mons np0005604215,np0005604212,np0005604213 in quorum (ranks 0,1,2)
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : monmap epoch 13
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : last_changed 2026-02-01T09:46:27.712705+0000
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : created 2026-02-01T07:37:52.883666+0000
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : min_mon_release 18 (reef)
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : election_strategy: 1
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : osdmap e88: 6 total, 6 up, 6 in
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : mgrmap e44: np0005604211.cuflqz(active, since 20s), standbys: np0005604209.isqrps, np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005604215,np0005604213)
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] : Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] :     stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] : [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:46:32 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [WRN] :     stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps']
Feb 01 09:46:32 np0005604215.localdomain sudo[294427]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:46:32 np0005604215.localdomain sudo[294427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:32 np0005604215.localdomain sudo[294427]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:32 np0005604215.localdomain sudo[294445]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:46:32 np0005604215.localdomain sudo[294445]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:32 np0005604215.localdomain sudo[294445]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain sudo[294463]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:46:33 np0005604215.localdomain sudo[294463]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294463]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:33 np0005604215.localdomain sudo[294481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:33 np0005604215.localdomain sudo[294481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294481]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain sudo[294499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:46:33 np0005604215.localdomain sudo[294499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294499]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain sudo[294533]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:46:33 np0005604215.localdomain sudo[294533]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294533]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain sudo[294551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:46:33 np0005604215.localdomain sudo[294551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294551]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain sudo[294569]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:46:33 np0005604215.localdomain sudo[294569]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294569]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain sudo[294587]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:46:33 np0005604215.localdomain sudo[294587]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294587]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain sudo[294605]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:46:33 np0005604215.localdomain sudo[294605]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294605]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain sudo[294623]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:46:33 np0005604215.localdomain sudo[294623]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294623]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain sudo[294641]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:33 np0005604215.localdomain sudo[294641]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294641]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: mon.np0005604212 calling monitor election
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005604215,np0005604213
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]:     stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]:     stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps']
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005604215,np0005604213
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]:     mon.np0005604212 (rank 1) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum)
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 calling monitor election
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215 is new leader, mons np0005604215,np0005604212,np0005604213 in quorum (ranks 0,1,2)
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: monmap epoch 13
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: last_changed 2026-02-01T09:46:27.712705+0000
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: min_mon_release 18 (reef)
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: election_strategy: 1
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: osdmap e88: 6 total, 6 up, 6 in
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: mgrmap e44: np0005604211.cuflqz(active, since 20s), standbys: np0005604209.isqrps, np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005604215,np0005604213)
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]:     stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]:     stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps']
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: from='client.54158 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604211.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:33 np0005604215.localdomain ceph-mon[278949]: Removed label mon from host np0005604211.localdomain
Feb 01 09:46:33 np0005604215.localdomain sudo[294659]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:46:33 np0005604215.localdomain sudo[294659]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294659]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:33 np0005604215.localdomain sudo[294693]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:46:33 np0005604215.localdomain sudo[294693]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:33 np0005604215.localdomain sudo[294693]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:34 np0005604215.localdomain sudo[294711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:46:34 np0005604215.localdomain sudo[294711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:34 np0005604215.localdomain sudo[294711]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain.devices.0}] v 0)
Feb 01 09:46:34 np0005604215.localdomain sudo[294729]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:46:34 np0005604215.localdomain sudo[294729]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:34 np0005604215.localdomain sudo[294729]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain}] v 0)
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:34 np0005604215.localdomain sudo[294747]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:46:34 np0005604215.localdomain sudo[294747]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:34 np0005604215.localdomain sudo[294747]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 01 09:46:34 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/4277238030' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.32:0/4277238030' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain.devices.0}] v 0)
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain}] v 0)
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 01 09:46:35 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: from='client.34578 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604211.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: Removed label mgr from host np0005604211.localdomain
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain.devices.0}] v 0)
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604211.localdomain}] v 0)
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 01 09:46:36 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: from='client.34587 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604211.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: Removed label _admin from host np0005604211.localdomain
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:37 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:46:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:46:37 np0005604215.localdomain systemd[1]: tmp-crun.J2UGCZ.mount: Deactivated successfully.
Feb 01 09:46:37 np0005604215.localdomain podman[294765]: 2026-02-01 09:46:37.879399359 +0000 UTC m=+0.088824722 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, name=ubi9/ubi-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.openshift.expose-services=, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, container_name=openstack_network_exporter, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc.)
Feb 01 09:46:37 np0005604215.localdomain podman[294766]: 2026-02-01 09:46:37.925388728 +0000 UTC m=+0.131807887 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:46:37 np0005604215.localdomain podman[294765]: 2026-02-01 09:46:37.944341271 +0000 UTC m=+0.153766644 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9/ubi-minimal, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container)
Feb 01 09:46:37 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:46:37 np0005604215.localdomain podman[294766]: 2026-02-01 09:46:37.961729646 +0000 UTC m=+0.168148825 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 01 09:46:37 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:38 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.1 (monmap changed)...
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 01 09:46:39 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.4 (monmap changed)...
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 01 09:46:40 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:46:41 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:46:41.763 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:46:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:46:41.764 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:46:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:46:41.764 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:46:42 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:46:42 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:42 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:46:42 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:42 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 01 09:46:42 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:42.195 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:46:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:42.196 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:46:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:42.196 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:46:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:42.253 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:46:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:42.253 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:46:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:42.254 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:46:42 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:42 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:42 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604212 (monmap changed)...
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:43.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:46:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:43.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:46:43 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:44.097 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:46:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:44.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.0 (monmap changed)...
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:44 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.3 (monmap changed)...
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/3190989409' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 01 09:46:45 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.126 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.126 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.126 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.127 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.127 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.106:0/992751241' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/329388596' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.600 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.473s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:46:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.818 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.819 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12351MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.820 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.820 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:46 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:46 np0005604215.localdomain podman[294826]: 2026-02-01 09:46:46.873834008 +0000 UTC m=+0.083655490 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.880 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.881 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:46:46 np0005604215.localdomain podman[294826]: 2026-02-01 09:46:46.888719725 +0000 UTC m=+0.098541217 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 01 09:46:46 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:46:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:46.900 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/766827414' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/329388596' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mon.np0005604213 (monmap changed)...
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/728333592' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:46:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:47.340 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:46:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:47.345 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:46:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:47.382 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:46:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:47.384 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:46:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:47.384 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} v 0)
Feb 01 09:46:47 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:47 np0005604215.localdomain sudo[294867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:47 np0005604215.localdomain sudo[294867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:47 np0005604215.localdomain sudo[294867]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:47 np0005604215.localdomain sudo[294885]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:47 np0005604215.localdomain sudo[294885]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: from='client.44647 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005604211.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: Added label _no_schedule to host np0005604211.localdomain
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005604211.localdomain
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.108:0/728333592' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: from='client.? 172.18.0.107:0/2418149454' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 01 09:46:48 np0005604215.localdomain podman[294921]: 
Feb 01 09:46:48 np0005604215.localdomain podman[294921]: 2026-02-01 09:46:48.319831817 +0000 UTC m=+0.076414214 container create a54934a68e008827a9a722fc7212de528d14b5e1624e1edfbe0ad44b56871f2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_brown, GIT_CLEAN=True, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, architecture=x86_64, distribution-scope=public, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:46:48 np0005604215.localdomain systemd[1]: Started libpod-conmon-a54934a68e008827a9a722fc7212de528d14b5e1624e1edfbe0ad44b56871f2d.scope.
Feb 01 09:46:48 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:46:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:46:48.385 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:46:48 np0005604215.localdomain podman[294921]: 2026-02-01 09:46:48.38671 +0000 UTC m=+0.143292447 container init a54934a68e008827a9a722fc7212de528d14b5e1624e1edfbe0ad44b56871f2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_brown, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, version=7, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, architecture=x86_64, distribution-scope=public, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, ceph=True, vendor=Red Hat, Inc., name=rhceph, description=Red Hat Ceph Storage 7)
Feb 01 09:46:48 np0005604215.localdomain podman[294921]: 2026-02-01 09:46:48.290339674 +0000 UTC m=+0.046922121 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:46:48 np0005604215.localdomain podman[294921]: 2026-02-01 09:46:48.402198205 +0000 UTC m=+0.158780602 container start a54934a68e008827a9a722fc7212de528d14b5e1624e1edfbe0ad44b56871f2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_brown, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, version=7, ceph=True, name=rhceph, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:46:48 np0005604215.localdomain podman[294921]: 2026-02-01 09:46:48.403345721 +0000 UTC m=+0.159928158 container attach a54934a68e008827a9a722fc7212de528d14b5e1624e1edfbe0ad44b56871f2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_brown, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, ceph=True, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, name=rhceph, io.buildah.version=1.41.4, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:46:48 np0005604215.localdomain suspicious_brown[294936]: 167 167
Feb 01 09:46:48 np0005604215.localdomain systemd[1]: libpod-a54934a68e008827a9a722fc7212de528d14b5e1624e1edfbe0ad44b56871f2d.scope: Deactivated successfully.
Feb 01 09:46:48 np0005604215.localdomain podman[294921]: 2026-02-01 09:46:48.406654504 +0000 UTC m=+0.163236961 container died a54934a68e008827a9a722fc7212de528d14b5e1624e1edfbe0ad44b56871f2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_brown, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, release=1764794109, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, GIT_CLEAN=True, name=rhceph, io.openshift.tags=rhceph ceph, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, io.buildah.version=1.41.4, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:46:48 np0005604215.localdomain podman[294941]: 2026-02-01 09:46:48.499884204 +0000 UTC m=+0.080139570 container remove a54934a68e008827a9a722fc7212de528d14b5e1624e1edfbe0ad44b56871f2d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=suspicious_brown, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, vcs-type=git, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, RELEASE=main, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, distribution-scope=public, GIT_BRANCH=main, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:46:48 np0005604215.localdomain systemd[1]: libpod-conmon-a54934a68e008827a9a722fc7212de528d14b5e1624e1edfbe0ad44b56871f2d.scope: Deactivated successfully.
Feb 01 09:46:48 np0005604215.localdomain sudo[294885]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:46:48 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:48 np0005604215.localdomain sudo[294958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:48 np0005604215.localdomain sudo[294958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:48 np0005604215.localdomain sudo[294958]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:48 np0005604215.localdomain sudo[294976]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:48 np0005604215.localdomain sudo[294976]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:46:49 np0005604215.localdomain podman[295011]: 
Feb 01 09:46:49 np0005604215.localdomain podman[295011]: 2026-02-01 09:46:49.200770646 +0000 UTC m=+0.073212643 container create 0efa30b9489ec26262c948305f9cd42c16bcf7818e570a8de554aaf38f2b1d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_cerf, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, GIT_BRANCH=main, ceph=True, name=rhceph, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., vcs-type=git, release=1764794109, distribution-scope=public, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:46:49 np0005604215.localdomain systemd[1]: Started libpod-conmon-0efa30b9489ec26262c948305f9cd42c16bcf7818e570a8de554aaf38f2b1d3d.scope.
Feb 01 09:46:49 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:46:49 np0005604215.localdomain podman[295011]: 2026-02-01 09:46:49.262412585 +0000 UTC m=+0.134854582 container init 0efa30b9489ec26262c948305f9cd42c16bcf7818e570a8de554aaf38f2b1d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_cerf, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, RELEASE=main, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, architecture=x86_64, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, GIT_CLEAN=True, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:46:49 np0005604215.localdomain podman[295011]: 2026-02-01 09:46:49.171614353 +0000 UTC m=+0.044056350 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:46:49 np0005604215.localdomain podman[295011]: 2026-02-01 09:46:49.273062759 +0000 UTC m=+0.145504746 container start 0efa30b9489ec26262c948305f9cd42c16bcf7818e570a8de554aaf38f2b1d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_cerf, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, ceph=True, io.openshift.tags=rhceph ceph, version=7, distribution-scope=public, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:46:49 np0005604215.localdomain busy_cerf[295026]: 167 167
Feb 01 09:46:49 np0005604215.localdomain podman[295011]: 2026-02-01 09:46:49.273381289 +0000 UTC m=+0.145823286 container attach 0efa30b9489ec26262c948305f9cd42c16bcf7818e570a8de554aaf38f2b1d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_cerf, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, io.openshift.expose-services=, com.redhat.component=rhceph-container, distribution-scope=public, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, vendor=Red Hat, Inc., version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:46:49 np0005604215.localdomain systemd[1]: libpod-0efa30b9489ec26262c948305f9cd42c16bcf7818e570a8de554aaf38f2b1d3d.scope: Deactivated successfully.
Feb 01 09:46:49 np0005604215.localdomain podman[295011]: 2026-02-01 09:46:49.278688815 +0000 UTC m=+0.151130832 container died 0efa30b9489ec26262c948305f9cd42c16bcf7818e570a8de554aaf38f2b1d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_cerf, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, CEPH_POINT_RELEASE=, architecture=x86_64, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, release=1764794109, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, ceph=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, GIT_CLEAN=True, distribution-scope=public, build-date=2025-12-08T17:28:53Z)
Feb 01 09:46:49 np0005604215.localdomain systemd[1]: tmp-crun.boHbY4.mount: Deactivated successfully.
Feb 01 09:46:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e11f52e1d3bcd452f9d900af7111a68cb0f04ee9b60ee4323eb0c70b3eec4005-merged.mount: Deactivated successfully.
Feb 01 09:46:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-f41329cc93b1e237145818483abb0af9dd83da1b15a02825f45bfa12b653db5c-merged.mount: Deactivated successfully.
Feb 01 09:46:49 np0005604215.localdomain podman[295031]: 2026-02-01 09:46:49.367418342 +0000 UTC m=+0.082745471 container remove 0efa30b9489ec26262c948305f9cd42c16bcf7818e570a8de554aaf38f2b1d3d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=busy_cerf, RELEASE=main, name=rhceph, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., io.openshift.expose-services=, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, architecture=x86_64)
Feb 01 09:46:49 np0005604215.localdomain systemd[1]: libpod-conmon-0efa30b9489ec26262c948305f9cd42c16bcf7818e570a8de554aaf38f2b1d3d.scope: Deactivated successfully.
Feb 01 09:46:49 np0005604215.localdomain sudo[294976]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.2 (monmap changed)...
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: from='client.34605 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005604211.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:49 np0005604215.localdomain sudo[295052]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:49 np0005604215.localdomain sudo[295052]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:49 np0005604215.localdomain sudo[295052]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:49 np0005604215.localdomain sudo[295070]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:49 np0005604215.localdomain sudo[295070]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/inventory}] v 0)
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"} v 0)
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"} : dispatch
Feb 01 09:46:49 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"}]': finished
Feb 01 09:46:50 np0005604215.localdomain podman[295106]: 
Feb 01 09:46:50 np0005604215.localdomain podman[295106]: 2026-02-01 09:46:50.157515577 +0000 UTC m=+0.103746219 container create 5097327bb2f8f0b79b6e22a981dc9508f1ae6bb9d2051b3685585b0c11763779 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_gagarin, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, release=1764794109, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, architecture=x86_64, distribution-scope=public)
Feb 01 09:46:50 np0005604215.localdomain systemd[1]: Started libpod-conmon-5097327bb2f8f0b79b6e22a981dc9508f1ae6bb9d2051b3685585b0c11763779.scope.
Feb 01 09:46:50 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:46:50 np0005604215.localdomain podman[295106]: 2026-02-01 09:46:50.221121819 +0000 UTC m=+0.167352461 container init 5097327bb2f8f0b79b6e22a981dc9508f1ae6bb9d2051b3685585b0c11763779 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_gagarin, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, release=1764794109, ceph=True, name=rhceph, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, io.openshift.expose-services=, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:46:50 np0005604215.localdomain podman[295106]: 2026-02-01 09:46:50.129659886 +0000 UTC m=+0.075890578 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:46:50 np0005604215.localdomain podman[295106]: 2026-02-01 09:46:50.229853732 +0000 UTC m=+0.176084374 container start 5097327bb2f8f0b79b6e22a981dc9508f1ae6bb9d2051b3685585b0c11763779 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_gagarin, GIT_CLEAN=True, architecture=x86_64, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, release=1764794109, ceph=True, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, version=7, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main)
Feb 01 09:46:50 np0005604215.localdomain podman[295106]: 2026-02-01 09:46:50.23012396 +0000 UTC m=+0.176354602 container attach 5097327bb2f8f0b79b6e22a981dc9508f1ae6bb9d2051b3685585b0c11763779 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_gagarin, ceph=True, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vendor=Red Hat, Inc., name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1764794109, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, GIT_CLEAN=True, RELEASE=main, architecture=x86_64)
Feb 01 09:46:50 np0005604215.localdomain funny_gagarin[295122]: 167 167
Feb 01 09:46:50 np0005604215.localdomain systemd[1]: libpod-5097327bb2f8f0b79b6e22a981dc9508f1ae6bb9d2051b3685585b0c11763779.scope: Deactivated successfully.
Feb 01 09:46:50 np0005604215.localdomain podman[295106]: 2026-02-01 09:46:50.234097395 +0000 UTC m=+0.180328037 container died 5097327bb2f8f0b79b6e22a981dc9508f1ae6bb9d2051b3685585b0c11763779 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_gagarin, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, build-date=2025-12-08T17:28:53Z, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., io.openshift.expose-services=, architecture=x86_64, release=1764794109, name=rhceph, io.buildah.version=1.41.4)
Feb 01 09:46:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-5095ae3d1f1875cf72808787b0464c8d94a8ff08d48db63c17468128e829c2ee-merged.mount: Deactivated successfully.
Feb 01 09:46:50 np0005604215.localdomain podman[295127]: 2026-02-01 09:46:50.329619856 +0000 UTC m=+0.086571472 container remove 5097327bb2f8f0b79b6e22a981dc9508f1ae6bb9d2051b3685585b0c11763779 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=funny_gagarin, RELEASE=main, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, architecture=x86_64, ceph=True, com.redhat.component=rhceph-container, release=1764794109, distribution-scope=public, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_BRANCH=main, io.openshift.expose-services=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:46:50 np0005604215.localdomain systemd[1]: libpod-conmon-5097327bb2f8f0b79b6e22a981dc9508f1ae6bb9d2051b3685585b0c11763779.scope: Deactivated successfully.
Feb 01 09:46:50 np0005604215.localdomain sudo[295070]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} v 0)
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:50 np0005604215.localdomain sudo[295151]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:50 np0005604215.localdomain sudo[295151]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:50 np0005604215.localdomain sudo[295151]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:50 np0005604215.localdomain sudo[295169]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:50 np0005604215.localdomain sudo[295169]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: Reconfiguring osd.5 (monmap changed)...
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: from='client.44659 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005604211.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"} : dispatch
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"} : dispatch
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"}]': finished
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: Removed host np0005604211.localdomain
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:46:50 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:51 np0005604215.localdomain podman[295204]: 
Feb 01 09:46:51 np0005604215.localdomain podman[295204]: 2026-02-01 09:46:51.138198309 +0000 UTC m=+0.073076999 container create 4bbde1a943942ea234759b6d4a8cf23823f1a309cbec76566bfc46f066fb450d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_austin, io.openshift.tags=rhceph ceph, release=1764794109, vendor=Red Hat, Inc., name=rhceph, build-date=2025-12-08T17:28:53Z, ceph=True, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=)
Feb 01 09:46:51 np0005604215.localdomain systemd[1]: Started libpod-conmon-4bbde1a943942ea234759b6d4a8cf23823f1a309cbec76566bfc46f066fb450d.scope.
Feb 01 09:46:51 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:46:51 np0005604215.localdomain podman[295204]: 2026-02-01 09:46:51.1085298 +0000 UTC m=+0.043408510 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:46:51 np0005604215.localdomain podman[295204]: 2026-02-01 09:46:51.218113811 +0000 UTC m=+0.152992511 container init 4bbde1a943942ea234759b6d4a8cf23823f1a309cbec76566bfc46f066fb450d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_austin, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, distribution-scope=public, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, architecture=x86_64, RELEASE=main, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.41.4, vcs-type=git, io.openshift.tags=rhceph ceph)
Feb 01 09:46:51 np0005604215.localdomain podman[295204]: 2026-02-01 09:46:51.227677021 +0000 UTC m=+0.162555711 container start 4bbde1a943942ea234759b6d4a8cf23823f1a309cbec76566bfc46f066fb450d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_austin, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.tags=rhceph ceph, version=7, release=1764794109, GIT_CLEAN=True)
Feb 01 09:46:51 np0005604215.localdomain podman[295204]: 2026-02-01 09:46:51.228071683 +0000 UTC m=+0.162950403 container attach 4bbde1a943942ea234759b6d4a8cf23823f1a309cbec76566bfc46f066fb450d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_austin, release=1764794109, distribution-scope=public, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, org.opencontainers.image.created=2025-12-08T17:28:53Z, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, architecture=x86_64, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph)
Feb 01 09:46:51 np0005604215.localdomain blissful_austin[295219]: 167 167
Feb 01 09:46:51 np0005604215.localdomain systemd[1]: libpod-4bbde1a943942ea234759b6d4a8cf23823f1a309cbec76566bfc46f066fb450d.scope: Deactivated successfully.
Feb 01 09:46:51 np0005604215.localdomain podman[295204]: 2026-02-01 09:46:51.230823249 +0000 UTC m=+0.165701949 container died 4bbde1a943942ea234759b6d4a8cf23823f1a309cbec76566bfc46f066fb450d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_austin, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, vcs-type=git, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, io.openshift.expose-services=, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109)
Feb 01 09:46:51 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-30ab57fa4e60f5c9ca36354f9d3d7e15d27c733f3ff3b7ceb7d3c889f265c436-merged.mount: Deactivated successfully.
Feb 01 09:46:51 np0005604215.localdomain podman[295224]: 2026-02-01 09:46:51.327170265 +0000 UTC m=+0.083443424 container remove 4bbde1a943942ea234759b6d4a8cf23823f1a309cbec76566bfc46f066fb450d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=blissful_austin, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, architecture=x86_64, description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, version=7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc.)
Feb 01 09:46:51 np0005604215.localdomain systemd[1]: libpod-conmon-4bbde1a943942ea234759b6d4a8cf23823f1a309cbec76566bfc46f066fb450d.scope: Deactivated successfully.
Feb 01 09:46:51 np0005604215.localdomain sudo[295169]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} v 0)
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:51 np0005604215.localdomain sudo[295240]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:46:51 np0005604215.localdomain sudo[295240]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:51 np0005604215.localdomain sudo[295240]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:51 np0005604215.localdomain sudo[295258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:46:51 np0005604215.localdomain sudo[295258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:46:51 np0005604215.localdomain ceph-mon[278949]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:46:52 np0005604215.localdomain podman[295294]: 
Feb 01 09:46:52 np0005604215.localdomain podman[295294]: 2026-02-01 09:46:52.069496344 +0000 UTC m=+0.072613314 container create b07e8d9c31d2c18b4e8772dc855f8241f248ecf7cc75e90f5f1f179d84ca4224 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hofstadter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, version=7, release=1764794109, io.buildah.version=1.41.4, distribution-scope=public, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, io.openshift.tags=rhceph ceph, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: Started libpod-conmon-b07e8d9c31d2c18b4e8772dc855f8241f248ecf7cc75e90f5f1f179d84ca4224.scope.
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:46:52 np0005604215.localdomain podman[295294]: 2026-02-01 09:46:52.128905514 +0000 UTC m=+0.132022474 container init b07e8d9c31d2c18b4e8772dc855f8241f248ecf7cc75e90f5f1f179d84ca4224 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hofstadter, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, release=1764794109, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, version=7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: tmp-crun.7CYf8s.mount: Deactivated successfully.
Feb 01 09:46:52 np0005604215.localdomain podman[295294]: 2026-02-01 09:46:52.041832978 +0000 UTC m=+0.044949978 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:46:52 np0005604215.localdomain podman[295294]: 2026-02-01 09:46:52.145840414 +0000 UTC m=+0.148957374 container start b07e8d9c31d2c18b4e8772dc855f8241f248ecf7cc75e90f5f1f179d84ca4224 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hofstadter, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, release=1764794109, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, CEPH_POINT_RELEASE=, architecture=x86_64, com.redhat.component=rhceph-container, name=rhceph, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:46:52 np0005604215.localdomain lucid_hofstadter[295310]: 167 167
Feb 01 09:46:52 np0005604215.localdomain podman[295294]: 2026-02-01 09:46:52.146125303 +0000 UTC m=+0.149242313 container attach b07e8d9c31d2c18b4e8772dc855f8241f248ecf7cc75e90f5f1f179d84ca4224 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hofstadter, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, ceph=True, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, distribution-scope=public, GIT_BRANCH=main, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, release=1764794109, io.openshift.expose-services=, GIT_CLEAN=True)
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: libpod-b07e8d9c31d2c18b4e8772dc855f8241f248ecf7cc75e90f5f1f179d84ca4224.scope: Deactivated successfully.
Feb 01 09:46:52 np0005604215.localdomain podman[295294]: 2026-02-01 09:46:52.151195822 +0000 UTC m=+0.154312812 container died b07e8d9c31d2c18b4e8772dc855f8241f248ecf7cc75e90f5f1f179d84ca4224 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hofstadter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, release=1764794109, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:46:52 np0005604215.localdomain sshd[295338]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:46:52 np0005604215.localdomain podman[295309]: 2026-02-01 09:46:52.203797429 +0000 UTC m=+0.089339568 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:46:52 np0005604215.localdomain podman[295323]: 2026-02-01 09:46:52.276911037 +0000 UTC m=+0.124207970 container remove b07e8d9c31d2c18b4e8772dc855f8241f248ecf7cc75e90f5f1f179d84ca4224 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=lucid_hofstadter, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, version=7, distribution-scope=public, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: libpod-conmon-b07e8d9c31d2c18b4e8772dc855f8241f248ecf7cc75e90f5f1f179d84ca4224.scope: Deactivated successfully.
Feb 01 09:46:52 np0005604215.localdomain podman[295309]: 2026-02-01 09:46:52.291879446 +0000 UTC m=+0.177421575 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:46:52 np0005604215.localdomain sshd[295338]: Accepted publickey for tripleo-admin from 192.168.122.11 port 33144 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:46:52 np0005604215.localdomain systemd-logind[761]: New session 71 of user tripleo-admin.
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: Created slice User Slice of UID 1003.
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: Starting User Runtime Directory /run/user/1003...
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-8eccd5196d661e5de1cbd620e5a8927c8d0bbc2e9c327afd3a9f0f5f8ca48b22-merged.mount: Deactivated successfully.
Feb 01 09:46:52 np0005604215.localdomain sudo[295258]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: Finished User Runtime Directory /run/user/1003.
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: Starting User Manager for UID 1003...
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: pam_unix(systemd-user:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Queued start job for default target Main User Target.
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Created slice User Application Slice.
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Started Daily Cleanup of User's Temporary Directories.
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Reached target Paths.
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Reached target Timers.
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Starting D-Bus User Message Bus Socket...
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Starting Create User's Volatile Files and Directories...
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Listening on D-Bus User Message Bus Socket.
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Reached target Sockets.
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Finished Create User's Volatile Files and Directories.
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Reached target Basic System.
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Reached target Main User Target.
Feb 01 09:46:52 np0005604215.localdomain systemd[295356]: Startup finished in 144ms.
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: Started User Manager for UID 1003.
Feb 01 09:46:52 np0005604215.localdomain systemd[1]: Started Session 71 of User tripleo-admin.
Feb 01 09:46:52 np0005604215.localdomain sshd[295338]: pam_unix(sshd:session): session opened for user tripleo-admin(uid=1003) by (uid=0)
Feb 01 09:46:53 np0005604215.localdomain sudo[295496]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-qxckfirqygcaifcdjdisrkruajawezar ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769939212.6607316-61965-48496957398892/AnsiballZ_lineinfile.py
Feb 01 09:46:53 np0005604215.localdomain sudo[295496]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 01 09:46:53 np0005604215.localdomain python3[295498]: ansible-ansible.builtin.lineinfile Invoked with dest=/etc/os-net-config/tripleo_config.yaml insertafter=172.18.0 line=    - ip_netmask: 172.18.0.105/24 backup=True path=/etc/os-net-config/tripleo_config.yaml state=present backrefs=False create=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Feb 01 09:46:53 np0005604215.localdomain sudo[295496]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:53 np0005604215.localdomain sudo[295642]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-xlcyzmuuohpfyyqvumuddfmmmaszyael ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769939213.431876-61981-35549812643927/AnsiballZ_command.py
Feb 01 09:46:53 np0005604215.localdomain sudo[295642]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 01 09:46:53 np0005604215.localdomain python3[295644]: ansible-ansible.legacy.command Invoked with _raw_params=ip a add 172.18.0.105/24 dev vlan21 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:46:53 np0005604215.localdomain sudo[295642]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:54 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:46:54 np0005604215.localdomain sudo[295787]: tripleo-admin : TTY=pts/0 ; PWD=/home/tripleo-admin ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-ludpfdxrtmweueserirgqwulmwrhijxy ; /usr/bin/python3 /home/tripleo-admin/.ansible/tmp/ansible-tmp-1769939214.1300766-61992-259729372492936/AnsiballZ_command.py
Feb 01 09:46:54 np0005604215.localdomain sudo[295787]: pam_unix(sudo:session): session opened for user root(uid=0) by tripleo-admin(uid=1003)
Feb 01 09:46:54 np0005604215.localdomain ceph-mgr[278126]: client.0 ms_handle_reset on v2:172.18.0.105:6800/155238379
Feb 01 09:46:54 np0005604215.localdomain python3[295789]: ansible-ansible.legacy.command Invoked with _raw_params=ping -W1 -c 3 172.18.0.105 _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 09:46:56 np0005604215.localdomain sudo[295787]: pam_unix(sudo:session): session closed for user root
Feb 01 09:46:59 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:47:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:47:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:47:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:47:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:47:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:47:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17794 "" "Go-http-client/1.1"
Feb 01 09:47:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:47:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:47:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:47:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:47:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:47:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:47:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:47:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:47:02 np0005604215.localdomain podman[295808]: 2026-02-01 09:47:02.871245064 +0000 UTC m=+0.086336634 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 01 09:47:02 np0005604215.localdomain podman[295809]: 2026-02-01 09:47:02.947623805 +0000 UTC m=+0.160984731 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:47:02 np0005604215.localdomain podman[295809]: 2026-02-01 09:47:02.959980202 +0000 UTC m=+0.173341178 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:47:02 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:47:02 np0005604215.localdomain podman[295808]: 2026-02-01 09:47:02.977677996 +0000 UTC m=+0.192769556 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Feb 01 09:47:02 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.405 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:47:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:47:04 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:47:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:47:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:47:08 np0005604215.localdomain podman[295856]: 2026-02-01 09:47:08.8644615 +0000 UTC m=+0.077855479 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.openshift.expose-services=, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, config_id=openstack_network_exporter, vendor=Red Hat, Inc., vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 01 09:47:08 np0005604215.localdomain podman[295856]: 2026-02-01 09:47:08.878740277 +0000 UTC m=+0.092134246 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, architecture=x86_64, release=1769056855, io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 09:47:08 np0005604215.localdomain systemd[1]: tmp-crun.8Y3E6J.mount: Deactivated successfully.
Feb 01 09:47:08 np0005604215.localdomain podman[295857]: 2026-02-01 09:47:08.920128873 +0000 UTC m=+0.129604568 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 01 09:47:08 np0005604215.localdomain podman[295857]: 2026-02-01 09:47:08.929780516 +0000 UTC m=+0.139256201 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:47:08 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:47:08 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:47:09 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:47:14 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:47:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:47:17 np0005604215.localdomain podman[295894]: 2026-02-01 09:47:17.857910938 +0000 UTC m=+0.077051773 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 01 09:47:17 np0005604215.localdomain podman[295894]: 2026-02-01 09:47:17.868366955 +0000 UTC m=+0.087507810 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute)
Feb 01 09:47:17 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:47:19 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:47:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:47:22 np0005604215.localdomain podman[295913]: 2026-02-01 09:47:22.861978916 +0000 UTC m=+0.076754974 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:47:22 np0005604215.localdomain podman[295913]: 2026-02-01 09:47:22.87263987 +0000 UTC m=+0.087415958 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:47:22 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e88 do_prune osdmap full prune enabled
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : Activating manager daemon np0005604209.isqrps
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : Manager daemon np0005604211.cuflqz is unresponsive, replacing it with standby daemon np0005604209.isqrps
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e89 e89: 6 total, 6 up, 6 in
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : osdmap e89: 6 total, 6 up, 6 in
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : mgrmap e45: np0005604209.isqrps(active, starting, since 0.0467249s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604212"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).mds e16 all = 0
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).mds e16 all = 0
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).mds e16 all = 0
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mds metadata"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).mds e16 all = 1
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mon metadata"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: Activating manager daemon np0005604209.isqrps
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: Manager daemon np0005604211.cuflqz is unresponsive, replacing it with standby daemon np0005604209.isqrps
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: osdmap e89: 6 total, 6 up, 6 in
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mgrmap e45: np0005604209.isqrps(active, starting, since 0.0467249s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : Manager daemon np0005604209.isqrps is now available
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"}]': finished
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"}]': finished
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/mirror_snapshot_schedule"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/mirror_snapshot_schedule"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/trash_purge_schedule"} v 0)
Feb 01 09:47:24 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/trash_purge_schedule"} : dispatch
Feb 01 09:47:24 np0005604215.localdomain sshd[295936]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:47:24 np0005604215.localdomain sshd[295936]: Accepted publickey for ceph-admin from 192.168.122.103 port 40896 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 09:47:24 np0005604215.localdomain systemd-logind[761]: New session 73 of user ceph-admin.
Feb 01 09:47:24 np0005604215.localdomain systemd[1]: Started Session 73 of User ceph-admin.
Feb 01 09:47:24 np0005604215.localdomain sshd[295936]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 09:47:24 np0005604215.localdomain sudo[295940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:24 np0005604215.localdomain sudo[295940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:24 np0005604215.localdomain sudo[295940]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:24 np0005604215.localdomain sudo[295958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:47:24 np0005604215.localdomain sudo[295958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: Manager daemon np0005604209.isqrps is now available
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"}]': finished
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"}]': finished
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/mirror_snapshot_schedule"} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/trash_purge_schedule"} : dispatch
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : mgrmap e46: np0005604209.isqrps(active, since 1.16558s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/spec.mon}] v 0)
Feb 01 09:47:25 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:25 np0005604215.localdomain podman[296047]: 2026-02-01 09:47:25.748476981 +0000 UTC m=+0.087049966 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, build-date=2025-12-08T17:28:53Z, ceph=True, release=1764794109, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, io.buildah.version=1.41.4, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:47:25 np0005604215.localdomain podman[296047]: 2026-02-01 09:47:25.882905099 +0000 UTC m=+0.221478104 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, com.redhat.component=rhceph-container, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, CEPH_POINT_RELEASE=, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, io.openshift.expose-services=, ceph=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True)
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: removing stray HostCache host record np0005604211.localdomain.devices.0
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: mgrmap e46: np0005604209.isqrps(active, since 1.16558s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [INF] : Cluster is now healthy
Feb 01 09:47:26 np0005604215.localdomain sudo[295958]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:26 np0005604215.localdomain sudo[296165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:47:26 np0005604215.localdomain sudo[296165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:26 np0005604215.localdomain sudo[296165]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:26 np0005604215.localdomain sudo[296183]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:47:26 np0005604215.localdomain sudo[296183]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:47:26 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:27 np0005604215.localdomain sudo[296183]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:27 np0005604215.localdomain sudo[296233]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:27 np0005604215.localdomain sudo[296233]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:27 np0005604215.localdomain sudo[296233]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: from='client.34614 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: Saving service mon spec with placement label:mon
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: Cluster is now healthy
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:47:27] ENGINE Bus STARTING
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:47:27] ENGINE Serving on http://172.18.0.200:8765
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:47:27] ENGINE Serving on https://172.18.0.200:7150
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:47:27] ENGINE Bus STARTED
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: [01/Feb/2026:09:47:27] ENGINE Client ('172.18.0.200', 32790) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:47:27 np0005604215.localdomain sudo[296251]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:47:27 np0005604215.localdomain sudo[296251]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:27 np0005604215.localdomain sudo[296251]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: log_channel(cluster) log [DBG] : mgrmap e47: np0005604209.isqrps(active, since 3s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:27 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain sudo[296287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:47:28 np0005604215.localdomain sudo[296287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:28 np0005604215.localdomain sudo[296287]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:28 np0005604215.localdomain sudo[296305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:47:28 np0005604215.localdomain sudo[296305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:28 np0005604215.localdomain sudo[296305]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:28 np0005604215.localdomain sudo[296323]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:28 np0005604215.localdomain sudo[296323]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:28 np0005604215.localdomain sudo[296323]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:28 np0005604215.localdomain sudo[296341]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:28 np0005604215.localdomain sudo[296341]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:28 np0005604215.localdomain sudo[296341]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:28 np0005604215.localdomain sudo[296359]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:28 np0005604215.localdomain sudo[296359]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:28 np0005604215.localdomain sudo[296359]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:28 np0005604215.localdomain sudo[296393]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:28 np0005604215.localdomain sudo[296393]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:28 np0005604215.localdomain sudo[296393]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: mgrmap e47: np0005604209.isqrps(active, since 3s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:28 np0005604215.localdomain ceph-mon[278949]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:28 np0005604215.localdomain sudo[296411]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:28 np0005604215.localdomain sudo[296411]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:28 np0005604215.localdomain sudo[296411]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:28 np0005604215.localdomain sudo[296429]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:47:28 np0005604215.localdomain sudo[296429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:28 np0005604215.localdomain sudo[296429]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:28 np0005604215.localdomain sudo[296447]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:47:28 np0005604215.localdomain sudo[296447]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:28 np0005604215.localdomain sudo[296447]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296465]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:47:29 np0005604215.localdomain sudo[296465]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296465]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296483]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:29 np0005604215.localdomain sudo[296483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296483]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:29 np0005604215.localdomain sudo[296501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296501]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader).osd e89 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:47:29 np0005604215.localdomain sudo[296519]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:29 np0005604215.localdomain sudo[296519]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296519]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:29 np0005604215.localdomain sudo[296553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296553]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296571]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:29 np0005604215.localdomain sudo[296571]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296571]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296589]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:29 np0005604215.localdomain sudo[296589]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296589]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296607]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:47:29 np0005604215.localdomain sudo[296607]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296607]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296625]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:47:29 np0005604215.localdomain sudo[296625]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296625]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296643]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:47:29 np0005604215.localdomain sudo[296643]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296643]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296661]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:29 np0005604215.localdomain sudo[296661]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296661]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:47:29 np0005604215.localdomain sudo[296679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296679]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:29 np0005604215.localdomain sudo[296713]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:47:29 np0005604215.localdomain sudo[296713]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:29 np0005604215.localdomain sudo[296713]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:47:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:47:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:47:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:47:30 np0005604215.localdomain sudo[296731]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:47:30 np0005604215.localdomain sudo[296731]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:30 np0005604215.localdomain sudo[296731]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:47:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17792 "" "Go-http-client/1.1"
Feb 01 09:47:30 np0005604215.localdomain sudo[296749]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 01 09:47:30 np0005604215.localdomain sudo[296749]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:30 np0005604215.localdomain sudo[296749]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:30 np0005604215.localdomain sudo[296767]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:47:30 np0005604215.localdomain sudo[296767]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:30 np0005604215.localdomain sudo[296767]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:30 np0005604215.localdomain sudo[296785]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:47:30 np0005604215.localdomain sudo[296785]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:30 np0005604215.localdomain sudo[296785]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:30 np0005604215.localdomain sudo[296803]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:47:30 np0005604215.localdomain sudo[296803]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:30 np0005604215.localdomain sudo[296803]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:30 np0005604215.localdomain sudo[296821]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:30 np0005604215.localdomain sudo[296821]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:30 np0005604215.localdomain sudo[296821]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:30 np0005604215.localdomain sudo[296839]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:47:30 np0005604215.localdomain sudo[296839]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:30 np0005604215.localdomain sudo[296839]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:30 np0005604215.localdomain sudo[296873]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:47:30 np0005604215.localdomain sudo[296873]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:30 np0005604215.localdomain sudo[296873]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:30 np0005604215.localdomain sudo[296891]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:47:30 np0005604215.localdomain sudo[296891]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:30 np0005604215.localdomain sudo[296891]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:30 np0005604215.localdomain sudo[296909]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:47:30 np0005604215.localdomain sudo[296909]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:30 np0005604215.localdomain sudo[296909]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:47:30 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "quorum_status"} v 0)
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [DBG] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "quorum_status"} : dispatch
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e13 handle_command mon_command({"prefix": "mon rm", "name": "np0005604215"} v 0)
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: log_channel(audit) log [INF] : from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon rm", "name": "np0005604215"} : dispatch
Feb 01 09:47:31 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55d1775411e0 mon_map magic: 0 from mon.0 v2:172.18.0.108:3300/0
Feb 01 09:47:31 np0005604215.localdomain ceph-mon[278949]: mon.np0005604215@0(leader) e14  removed from monmap, suicide.
Feb 01 09:47:31 np0005604215.localdomain ceph-mgr[278126]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Feb 01 09:47:31 np0005604215.localdomain ceph-mgr[278126]: client.0 ms_handle_reset on v2:172.18.0.104:3300/0
Feb 01 09:47:31 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55d177541080 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Feb 01 09:47:31 np0005604215.localdomain sudo[296927]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:31 np0005604215.localdomain sudo[296927]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:31 np0005604215.localdomain sudo[296927]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:31 np0005604215.localdomain podman[296941]: 2026-02-01 09:47:31.232272628 +0000 UTC m=+0.052445123 container died e5584900e40475bfb0e0992a38ca26dd007e21b74ac9ad70262abebad82b75d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604215, distribution-scope=public, build-date=2025-12-08T17:28:53Z, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, version=7, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=)
Feb 01 09:47:31 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e83968a87c9b2ae83e102a25fe5279ef93ea6c64bdb6aa577d60da58f4409de1-merged.mount: Deactivated successfully.
Feb 01 09:47:31 np0005604215.localdomain podman[296941]: 2026-02-01 09:47:31.275965675 +0000 UTC m=+0.096138130 container remove e5584900e40475bfb0e0992a38ca26dd007e21b74ac9ad70262abebad82b75d8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604215, io.openshift.expose-services=, GIT_BRANCH=main, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, vcs-type=git, io.openshift.tags=rhceph ceph, name=rhceph, RELEASE=main, distribution-scope=public, GIT_CLEAN=True, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, release=1764794109, com.redhat.component=rhceph-container)
Feb 01 09:47:31 np0005604215.localdomain sudo[296956]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 rm-daemon --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e --name mon.np0005604215 --force
Feb 01 09:47:31 np0005604215.localdomain sudo[296956]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:31 np0005604215.localdomain sudo[296971]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:47:31 np0005604215.localdomain sudo[296971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:31 np0005604215.localdomain sudo[296971]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:47:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:47:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:47:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:47:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:47:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e@mon.np0005604215.service: Deactivated successfully.
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: Stopped Ceph mon.np0005604215 for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e@mon.np0005604215.service: Consumed 14.373s CPU time.
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:47:32 np0005604215.localdomain systemd-rc-local-generator[297124]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:47:32 np0005604215.localdomain systemd-sysv-generator[297128]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:32 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:32 np0005604215.localdomain sudo[296956]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:47:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:47:33 np0005604215.localdomain podman[297134]: 2026-02-01 09:47:33.867221797 +0000 UTC m=+0.079662184 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 01 09:47:33 np0005604215.localdomain podman[297135]: 2026-02-01 09:47:33.917207822 +0000 UTC m=+0.127464951 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:47:33 np0005604215.localdomain podman[297134]: 2026-02-01 09:47:33.929762206 +0000 UTC m=+0.142202603 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:47:33 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:47:33 np0005604215.localdomain podman[297135]: 2026-02-01 09:47:33.949991069 +0000 UTC m=+0.160248218 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:47:33 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:47:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:47:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:47:39 np0005604215.localdomain podman[297180]: 2026-02-01 09:47:39.869838847 +0000 UTC m=+0.081867434 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, architecture=x86_64, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vcs-type=git, version=9.7, config_id=openstack_network_exporter)
Feb 01 09:47:39 np0005604215.localdomain podman[297180]: 2026-02-01 09:47:39.885789256 +0000 UTC m=+0.097817823 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, config_id=openstack_network_exporter, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container)
Feb 01 09:47:39 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:47:39 np0005604215.localdomain systemd[1]: tmp-crun.zoXmtY.mount: Deactivated successfully.
Feb 01 09:47:39 np0005604215.localdomain podman[297181]: 2026-02-01 09:47:39.989695459 +0000 UTC m=+0.194219782 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Feb 01 09:47:40 np0005604215.localdomain podman[297181]: 2026-02-01 09:47:40.023840928 +0000 UTC m=+0.228365231 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent)
Feb 01 09:47:40 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:47:41 np0005604215.localdomain sudo[297217]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:41 np0005604215.localdomain sudo[297217]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:41 np0005604215.localdomain sudo[297217]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:41 np0005604215.localdomain sudo[297235]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:41 np0005604215.localdomain sudo[297235]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:41 np0005604215.localdomain podman[297269]: 
Feb 01 09:47:41 np0005604215.localdomain podman[297269]: 2026-02-01 09:47:41.717505511 +0000 UTC m=+0.078581211 container create a090c2217890b10125338593243171cd4c2405f495df00f4fa9e1b57d4c9d9e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_swanson, CEPH_POINT_RELEASE=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, name=rhceph, GIT_CLEAN=True, com.redhat.component=rhceph-container, ceph=True, version=7, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, GIT_BRANCH=main, distribution-scope=public)
Feb 01 09:47:41 np0005604215.localdomain systemd[1]: Started libpod-conmon-a090c2217890b10125338593243171cd4c2405f495df00f4fa9e1b57d4c9d9e5.scope.
Feb 01 09:47:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:47:41.764 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:47:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:47:41.765 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:47:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:47:41.765 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:47:41 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:47:41 np0005604215.localdomain podman[297269]: 2026-02-01 09:47:41.684718754 +0000 UTC m=+0.045794484 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:47:41 np0005604215.localdomain podman[297269]: 2026-02-01 09:47:41.790407753 +0000 UTC m=+0.151483443 container init a090c2217890b10125338593243171cd4c2405f495df00f4fa9e1b57d4c9d9e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_swanson, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, name=rhceph, CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_CLEAN=True, com.redhat.component=rhceph-container, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, RELEASE=main, architecture=x86_64)
Feb 01 09:47:41 np0005604215.localdomain systemd[1]: tmp-crun.WBsMUL.mount: Deactivated successfully.
Feb 01 09:47:41 np0005604215.localdomain podman[297269]: 2026-02-01 09:47:41.809708157 +0000 UTC m=+0.170783857 container start a090c2217890b10125338593243171cd4c2405f495df00f4fa9e1b57d4c9d9e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_swanson, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, architecture=x86_64, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, io.openshift.expose-services=)
Feb 01 09:47:41 np0005604215.localdomain podman[297269]: 2026-02-01 09:47:41.810683237 +0000 UTC m=+0.171758957 container attach a090c2217890b10125338593243171cd4c2405f495df00f4fa9e1b57d4c9d9e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_swanson, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, RELEASE=main, architecture=x86_64, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.expose-services=, vcs-type=git, distribution-scope=public, CEPH_POINT_RELEASE=, release=1764794109, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:47:41 np0005604215.localdomain gallant_swanson[297284]: 167 167
Feb 01 09:47:41 np0005604215.localdomain systemd[1]: libpod-a090c2217890b10125338593243171cd4c2405f495df00f4fa9e1b57d4c9d9e5.scope: Deactivated successfully.
Feb 01 09:47:41 np0005604215.localdomain podman[297269]: 2026-02-01 09:47:41.815275671 +0000 UTC m=+0.176351361 container died a090c2217890b10125338593243171cd4c2405f495df00f4fa9e1b57d4c9d9e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_swanson, name=rhceph, CEPH_POINT_RELEASE=, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, ceph=True, version=7, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=rhceph ceph, GIT_BRANCH=main)
Feb 01 09:47:41 np0005604215.localdomain podman[297291]: 2026-02-01 09:47:41.911897506 +0000 UTC m=+0.087779279 container remove a090c2217890b10125338593243171cd4c2405f495df00f4fa9e1b57d4c9d9e5 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=gallant_swanson, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, vcs-type=git, version=7, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, release=1764794109, RELEASE=main, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph)
Feb 01 09:47:41 np0005604215.localdomain systemd[1]: libpod-conmon-a090c2217890b10125338593243171cd4c2405f495df00f4fa9e1b57d4c9d9e5.scope: Deactivated successfully.
Feb 01 09:47:41 np0005604215.localdomain sudo[297235]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:42 np0005604215.localdomain sudo[297308]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:42.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:47:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:42.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:47:42 np0005604215.localdomain sudo[297308]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:42.102 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:47:42 np0005604215.localdomain sudo[297308]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:42.118 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:47:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:42.118 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:47:42 np0005604215.localdomain sudo[297326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:42 np0005604215.localdomain sudo[297326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:42 np0005604215.localdomain podman[297361]: 
Feb 01 09:47:42 np0005604215.localdomain podman[297361]: 2026-02-01 09:47:42.641227198 +0000 UTC m=+0.065538492 container create 627079e2047fcf3f15a8576168eb5db554a5e44dc47191e90a02f447ec2e1074 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_khorana, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_BRANCH=main, GIT_CLEAN=True, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, version=7, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:47:42 np0005604215.localdomain systemd[1]: Started libpod-conmon-627079e2047fcf3f15a8576168eb5db554a5e44dc47191e90a02f447ec2e1074.scope.
Feb 01 09:47:42 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:47:42 np0005604215.localdomain podman[297361]: 2026-02-01 09:47:42.610538657 +0000 UTC m=+0.034850051 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:47:42 np0005604215.localdomain podman[297361]: 2026-02-01 09:47:42.714398318 +0000 UTC m=+0.138709612 container init 627079e2047fcf3f15a8576168eb5db554a5e44dc47191e90a02f447ec2e1074 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_khorana, name=rhceph, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, vcs-type=git, RELEASE=main, architecture=x86_64, build-date=2025-12-08T17:28:53Z, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0)
Feb 01 09:47:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-9fdf3252c86cd77dd30ba16b0d5bab8522a6e21dec0e9b1ce9d9a425a94eb75a-merged.mount: Deactivated successfully.
Feb 01 09:47:42 np0005604215.localdomain podman[297361]: 2026-02-01 09:47:42.730696189 +0000 UTC m=+0.155007483 container start 627079e2047fcf3f15a8576168eb5db554a5e44dc47191e90a02f447ec2e1074 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_khorana, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_CLEAN=True, name=rhceph, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, io.openshift.tags=rhceph ceph, vcs-type=git)
Feb 01 09:47:42 np0005604215.localdomain podman[297361]: 2026-02-01 09:47:42.731078291 +0000 UTC m=+0.155389585 container attach 627079e2047fcf3f15a8576168eb5db554a5e44dc47191e90a02f447ec2e1074 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_khorana, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, com.redhat.component=rhceph-container, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, architecture=x86_64, io.openshift.expose-services=, name=rhceph, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=)
Feb 01 09:47:42 np0005604215.localdomain brave_khorana[297376]: 167 167
Feb 01 09:47:42 np0005604215.localdomain systemd[1]: libpod-627079e2047fcf3f15a8576168eb5db554a5e44dc47191e90a02f447ec2e1074.scope: Deactivated successfully.
Feb 01 09:47:42 np0005604215.localdomain podman[297361]: 2026-02-01 09:47:42.734130886 +0000 UTC m=+0.158442230 container died 627079e2047fcf3f15a8576168eb5db554a5e44dc47191e90a02f447ec2e1074 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_khorana, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, version=7, com.redhat.component=rhceph-container, ceph=True, GIT_BRANCH=main, distribution-scope=public, name=rhceph, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, build-date=2025-12-08T17:28:53Z)
Feb 01 09:47:42 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-70cfb5f07481b3f3fad16c67cdb1c1c7fd1e6402160e72f58db10ad112738e8a-merged.mount: Deactivated successfully.
Feb 01 09:47:42 np0005604215.localdomain podman[297381]: 2026-02-01 09:47:42.836102159 +0000 UTC m=+0.088534483 container remove 627079e2047fcf3f15a8576168eb5db554a5e44dc47191e90a02f447ec2e1074 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=brave_khorana, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, vcs-type=git, RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.openshift.expose-services=, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True)
Feb 01 09:47:42 np0005604215.localdomain systemd[1]: libpod-conmon-627079e2047fcf3f15a8576168eb5db554a5e44dc47191e90a02f447ec2e1074.scope: Deactivated successfully.
Feb 01 09:47:43 np0005604215.localdomain sudo[297326]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:43.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:47:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:43.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:47:43 np0005604215.localdomain sudo[297405]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:43 np0005604215.localdomain sudo[297405]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:43 np0005604215.localdomain sudo[297405]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:43 np0005604215.localdomain sudo[297423]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:43 np0005604215.localdomain sudo[297423]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:43 np0005604215.localdomain podman[297458]: 
Feb 01 09:47:43 np0005604215.localdomain podman[297458]: 2026-02-01 09:47:43.645578881 +0000 UTC m=+0.076466424 container create 4b770e78160ff2dde941722b3949925eed50d2fdd604b07d076f765327072379 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pare, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, ceph=True, io.openshift.expose-services=, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, distribution-scope=public, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:47:43 np0005604215.localdomain systemd[1]: Started libpod-conmon-4b770e78160ff2dde941722b3949925eed50d2fdd604b07d076f765327072379.scope.
Feb 01 09:47:43 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:47:43 np0005604215.localdomain podman[297458]: 2026-02-01 09:47:43.614084155 +0000 UTC m=+0.044971748 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:47:43 np0005604215.localdomain podman[297458]: 2026-02-01 09:47:43.713860028 +0000 UTC m=+0.144747571 container init 4b770e78160ff2dde941722b3949925eed50d2fdd604b07d076f765327072379 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pare, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, com.redhat.component=rhceph-container, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, io.buildah.version=1.41.4, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, name=rhceph, CEPH_POINT_RELEASE=)
Feb 01 09:47:43 np0005604215.localdomain podman[297458]: 2026-02-01 09:47:43.725041198 +0000 UTC m=+0.155928741 container start 4b770e78160ff2dde941722b3949925eed50d2fdd604b07d076f765327072379 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pare, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, version=7, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, architecture=x86_64, GIT_CLEAN=True, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, release=1764794109)
Feb 01 09:47:43 np0005604215.localdomain podman[297458]: 2026-02-01 09:47:43.725367598 +0000 UTC m=+0.156255181 container attach 4b770e78160ff2dde941722b3949925eed50d2fdd604b07d076f765327072379 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pare, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, vcs-type=git, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, architecture=x86_64, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, io.buildah.version=1.41.4, RELEASE=main, vendor=Red Hat, Inc., ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True)
Feb 01 09:47:43 np0005604215.localdomain naughty_pare[297473]: 167 167
Feb 01 09:47:43 np0005604215.localdomain systemd[1]: libpod-4b770e78160ff2dde941722b3949925eed50d2fdd604b07d076f765327072379.scope: Deactivated successfully.
Feb 01 09:47:43 np0005604215.localdomain podman[297458]: 2026-02-01 09:47:43.727559457 +0000 UTC m=+0.158447010 container died 4b770e78160ff2dde941722b3949925eed50d2fdd604b07d076f765327072379 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pare, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, architecture=x86_64, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, vcs-type=git)
Feb 01 09:47:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-d865387eec8712add5ecac852d57eea35d162b81658ca3d6623164bee555a68e-merged.mount: Deactivated successfully.
Feb 01 09:47:43 np0005604215.localdomain podman[297479]: 2026-02-01 09:47:43.827536857 +0000 UTC m=+0.083787874 container remove 4b770e78160ff2dde941722b3949925eed50d2fdd604b07d076f765327072379 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_pare, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, distribution-scope=public, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, CEPH_POINT_RELEASE=, GIT_BRANCH=main, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, release=1764794109, architecture=x86_64)
Feb 01 09:47:43 np0005604215.localdomain systemd[1]: libpod-conmon-4b770e78160ff2dde941722b3949925eed50d2fdd604b07d076f765327072379.scope: Deactivated successfully.
Feb 01 09:47:44 np0005604215.localdomain sudo[297423]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:44.097 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:47:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:44.098 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:47:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:44.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:47:44 np0005604215.localdomain sudo[297502]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:44 np0005604215.localdomain sudo[297502]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:44 np0005604215.localdomain sudo[297502]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:44 np0005604215.localdomain sudo[297520]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:44 np0005604215.localdomain sudo[297520]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:44 np0005604215.localdomain podman[297555]: 
Feb 01 09:47:44 np0005604215.localdomain podman[297555]: 2026-02-01 09:47:44.690118331 +0000 UTC m=+0.076248478 container create bc1a00cc4bfacedea660fa9cf40dd7d4c83c5e887dbdea713da30dcf655c9f6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kapitsa, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, RELEASE=main, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, vendor=Red Hat, Inc., vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, io.openshift.expose-services=, distribution-scope=public, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z)
Feb 01 09:47:44 np0005604215.localdomain systemd[1]: Started libpod-conmon-bc1a00cc4bfacedea660fa9cf40dd7d4c83c5e887dbdea713da30dcf655c9f6d.scope.
Feb 01 09:47:44 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:47:44 np0005604215.localdomain podman[297555]: 2026-02-01 09:47:44.755316792 +0000 UTC m=+0.141446959 container init bc1a00cc4bfacedea660fa9cf40dd7d4c83c5e887dbdea713da30dcf655c9f6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kapitsa, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-type=git, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, io.openshift.tags=rhceph ceph, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, distribution-scope=public, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image.)
Feb 01 09:47:44 np0005604215.localdomain podman[297555]: 2026-02-01 09:47:44.660919797 +0000 UTC m=+0.047049994 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:47:44 np0005604215.localdomain systemd[1]: tmp-crun.7d4eaN.mount: Deactivated successfully.
Feb 01 09:47:44 np0005604215.localdomain kind_kapitsa[297570]: 167 167
Feb 01 09:47:44 np0005604215.localdomain systemd[1]: libpod-bc1a00cc4bfacedea660fa9cf40dd7d4c83c5e887dbdea713da30dcf655c9f6d.scope: Deactivated successfully.
Feb 01 09:47:44 np0005604215.localdomain podman[297555]: 2026-02-01 09:47:44.773247283 +0000 UTC m=+0.159377440 container start bc1a00cc4bfacedea660fa9cf40dd7d4c83c5e887dbdea713da30dcf655c9f6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kapitsa, ceph=True, name=rhceph, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, release=1764794109, version=7, vcs-type=git, distribution-scope=public, build-date=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc.)
Feb 01 09:47:44 np0005604215.localdomain podman[297555]: 2026-02-01 09:47:44.773582924 +0000 UTC m=+0.159713121 container attach bc1a00cc4bfacedea660fa9cf40dd7d4c83c5e887dbdea713da30dcf655c9f6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kapitsa, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, version=7, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, build-date=2025-12-08T17:28:53Z, architecture=x86_64, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, name=rhceph)
Feb 01 09:47:44 np0005604215.localdomain podman[297555]: 2026-02-01 09:47:44.777100314 +0000 UTC m=+0.163230501 container died bc1a00cc4bfacedea660fa9cf40dd7d4c83c5e887dbdea713da30dcf655c9f6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kapitsa, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, GIT_CLEAN=True, io.buildah.version=1.41.4, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, RELEASE=main, description=Red Hat Ceph Storage 7, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7)
Feb 01 09:47:44 np0005604215.localdomain podman[297575]: 2026-02-01 09:47:44.872532902 +0000 UTC m=+0.084375402 container remove bc1a00cc4bfacedea660fa9cf40dd7d4c83c5e887dbdea713da30dcf655c9f6d (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=kind_kapitsa, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, RELEASE=main, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.expose-services=, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, architecture=x86_64, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7)
Feb 01 09:47:44 np0005604215.localdomain systemd[1]: libpod-conmon-bc1a00cc4bfacedea660fa9cf40dd7d4c83c5e887dbdea713da30dcf655c9f6d.scope: Deactivated successfully.
Feb 01 09:47:44 np0005604215.localdomain sudo[297520]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:45 np0005604215.localdomain sudo[297592]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:45 np0005604215.localdomain sudo[297592]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:45 np0005604215.localdomain sudo[297592]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:45 np0005604215.localdomain sudo[297610]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:45 np0005604215.localdomain sudo[297610]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:45 np0005604215.localdomain podman[297644]: 
Feb 01 09:47:45 np0005604215.localdomain podman[297644]: 2026-02-01 09:47:45.5717244 +0000 UTC m=+0.080896732 container create 5f2e5d2550a2cea1bbcc92e862efc358d725bf143043da6b08a687f7de89430f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wozniak, CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=rhceph-container, distribution-scope=public, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, io.openshift.expose-services=)
Feb 01 09:47:45 np0005604215.localdomain systemd[1]: Started libpod-conmon-5f2e5d2550a2cea1bbcc92e862efc358d725bf143043da6b08a687f7de89430f.scope.
Feb 01 09:47:45 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:47:45 np0005604215.localdomain podman[297644]: 2026-02-01 09:47:45.632245055 +0000 UTC m=+0.141417397 container init 5f2e5d2550a2cea1bbcc92e862efc358d725bf143043da6b08a687f7de89430f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wozniak, GIT_REPO=https://github.com/ceph/ceph-container.git, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, com.redhat.component=rhceph-container, distribution-scope=public, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, build-date=2025-12-08T17:28:53Z, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, io.buildah.version=1.41.4, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, io.openshift.tags=rhceph ceph, version=7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:47:45 np0005604215.localdomain podman[297644]: 2026-02-01 09:47:45.535805307 +0000 UTC m=+0.044977649 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:47:45 np0005604215.localdomain podman[297644]: 2026-02-01 09:47:45.641386071 +0000 UTC m=+0.150558403 container start 5f2e5d2550a2cea1bbcc92e862efc358d725bf143043da6b08a687f7de89430f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wozniak, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7, ceph=True, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, vcs-type=git, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, release=1764794109, io.buildah.version=1.41.4)
Feb 01 09:47:45 np0005604215.localdomain podman[297644]: 2026-02-01 09:47:45.641713531 +0000 UTC m=+0.150885903 container attach 5f2e5d2550a2cea1bbcc92e862efc358d725bf143043da6b08a687f7de89430f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wozniak, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, vcs-type=git, vendor=Red Hat, Inc., GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, build-date=2025-12-08T17:28:53Z, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, ceph=True, release=1764794109, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, version=7, RELEASE=main, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:47:45 np0005604215.localdomain hungry_wozniak[297659]: 167 167
Feb 01 09:47:45 np0005604215.localdomain systemd[1]: libpod-5f2e5d2550a2cea1bbcc92e862efc358d725bf143043da6b08a687f7de89430f.scope: Deactivated successfully.
Feb 01 09:47:45 np0005604215.localdomain podman[297644]: 2026-02-01 09:47:45.644170668 +0000 UTC m=+0.153343000 container died 5f2e5d2550a2cea1bbcc92e862efc358d725bf143043da6b08a687f7de89430f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wozniak, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, com.redhat.component=rhceph-container, distribution-scope=public, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, vcs-type=git, version=7, GIT_CLEAN=True, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:47:45 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-da9b2cbf6412129e240e02aa28ac27361f675fa9db3a0d86033788be561790c5-merged.mount: Deactivated successfully.
Feb 01 09:47:45 np0005604215.localdomain podman[297664]: 2026-02-01 09:47:45.740891507 +0000 UTC m=+0.082851565 container remove 5f2e5d2550a2cea1bbcc92e862efc358d725bf143043da6b08a687f7de89430f (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=hungry_wozniak, com.redhat.component=rhceph-container, release=1764794109, build-date=2025-12-08T17:28:53Z, architecture=x86_64, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=rhceph ceph, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main)
Feb 01 09:47:45 np0005604215.localdomain systemd[1]: libpod-conmon-5f2e5d2550a2cea1bbcc92e862efc358d725bf143043da6b08a687f7de89430f.scope: Deactivated successfully.
Feb 01 09:47:45 np0005604215.localdomain sudo[297610]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:45 np0005604215.localdomain sudo[297679]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:45 np0005604215.localdomain sudo[297679]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:45 np0005604215.localdomain sudo[297679]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:45 np0005604215.localdomain sudo[297697]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:47:45 np0005604215.localdomain sudo[297697]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.123 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.124 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.124 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.124 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.125 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.599 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.788 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.790 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12379MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.791 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.791 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:47:46 np0005604215.localdomain systemd[1]: tmp-crun.lutCVS.mount: Deactivated successfully.
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.858 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.859 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:47:46 np0005604215.localdomain podman[297807]: 2026-02-01 09:47:46.868561729 +0000 UTC m=+0.103867803 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, vendor=Red Hat, Inc., GIT_CLEAN=True, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, GIT_BRANCH=main, ceph=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, version=7, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Feb 01 09:47:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:46.882 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:47:46 np0005604215.localdomain podman[297807]: 2026-02-01 09:47:46.993737848 +0000 UTC m=+0.229043872 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, io.openshift.tags=rhceph ceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, com.redhat.component=rhceph-container, GIT_CLEAN=True, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, release=1764794109, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, vendor=Red Hat, Inc., distribution-scope=public, ceph=True)
Feb 01 09:47:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:47.314 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.431s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:47:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:47.322 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:47:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:47.340 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:47:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:47.342 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:47:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:47.343 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.552s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:47:47 np0005604215.localdomain sudo[297903]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:47 np0005604215.localdomain sudo[297903]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:47 np0005604215.localdomain sudo[297903]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:47 np0005604215.localdomain sudo[297935]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:47 np0005604215.localdomain sudo[297935]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:47 np0005604215.localdomain sudo[297697]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:47 np0005604215.localdomain sudo[297967]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:47:47 np0005604215.localdomain sudo[297967]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:47 np0005604215.localdomain sudo[297967]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:47 np0005604215.localdomain sudo[297985]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:47:47 np0005604215.localdomain sudo[297985]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:47 np0005604215.localdomain sudo[297985]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:47 np0005604215.localdomain sudo[298003]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:47 np0005604215.localdomain sudo[298003]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:47 np0005604215.localdomain sudo[298003]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:47 np0005604215.localdomain sudo[298035]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:47 np0005604215.localdomain sudo[298035]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:47 np0005604215.localdomain sudo[298035]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:47 np0005604215.localdomain sudo[298065]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:47 np0005604215.localdomain sudo[298065]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:47:47 np0005604215.localdomain sudo[298065]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:48 np0005604215.localdomain podman[298099]: 
Feb 01 09:47:48 np0005604215.localdomain podman[298099]: 2026-02-01 09:47:48.066595765 +0000 UTC m=+0.081533153 container create a7aae9f3543217d68a7cd7f3879eab8df30212d3933852bd5bc5e48c4dde950b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_faraday, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, vcs-type=git, architecture=x86_64, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.created=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, ceph=True, distribution-scope=public, vendor=Red Hat, Inc., GIT_CLEAN=True, name=rhceph, GIT_BRANCH=main)
Feb 01 09:47:48 np0005604215.localdomain podman[298098]: 2026-02-01 09:47:48.069398603 +0000 UTC m=+0.091607559 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 01 09:47:48 np0005604215.localdomain podman[298098]: 2026-02-01 09:47:48.083658149 +0000 UTC m=+0.105867105 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: Started libpod-conmon-a7aae9f3543217d68a7cd7f3879eab8df30212d3933852bd5bc5e48c4dde950b.scope.
Feb 01 09:47:48 np0005604215.localdomain sudo[298138]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:48 np0005604215.localdomain sudo[298138]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:48 np0005604215.localdomain podman[298099]: 2026-02-01 09:47:48.032152547 +0000 UTC m=+0.047089955 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:47:48 np0005604215.localdomain sudo[298138]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:47:48 np0005604215.localdomain podman[298099]: 2026-02-01 09:47:48.147315222 +0000 UTC m=+0.162252610 container init a7aae9f3543217d68a7cd7f3879eab8df30212d3933852bd5bc5e48c4dde950b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_faraday, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, release=1764794109, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, com.redhat.component=rhceph-container, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, GIT_CLEAN=True, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, name=rhceph, CEPH_POINT_RELEASE=, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z)
Feb 01 09:47:48 np0005604215.localdomain podman[298099]: 2026-02-01 09:47:48.15841769 +0000 UTC m=+0.173355068 container start a7aae9f3543217d68a7cd7f3879eab8df30212d3933852bd5bc5e48c4dde950b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_faraday, CEPH_POINT_RELEASE=, release=1764794109, GIT_CLEAN=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, RELEASE=main, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git)
Feb 01 09:47:48 np0005604215.localdomain podman[298099]: 2026-02-01 09:47:48.159865645 +0000 UTC m=+0.174803063 container attach a7aae9f3543217d68a7cd7f3879eab8df30212d3933852bd5bc5e48c4dde950b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_faraday, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, RELEASE=main, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, distribution-scope=public, ceph=True, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, vcs-type=git, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:47:48 np0005604215.localdomain xenodochial_faraday[298163]: 167 167
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: libpod-a7aae9f3543217d68a7cd7f3879eab8df30212d3933852bd5bc5e48c4dde950b.scope: Deactivated successfully.
Feb 01 09:47:48 np0005604215.localdomain podman[298099]: 2026-02-01 09:47:48.162960302 +0000 UTC m=+0.177897700 container died a7aae9f3543217d68a7cd7f3879eab8df30212d3933852bd5bc5e48c4dde950b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_faraday, GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, description=Red Hat Ceph Storage 7, distribution-scope=public, RELEASE=main, CEPH_POINT_RELEASE=, version=7, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main)
Feb 01 09:47:48 np0005604215.localdomain sudo[298167]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:48 np0005604215.localdomain sudo[298167]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:48 np0005604215.localdomain sudo[298167]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:48 np0005604215.localdomain podman[298183]: 2026-02-01 09:47:48.260933319 +0000 UTC m=+0.085327812 container remove a7aae9f3543217d68a7cd7f3879eab8df30212d3933852bd5bc5e48c4dde950b (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=xenodochial_faraday, description=Red Hat Ceph Storage 7, GIT_CLEAN=True, io.openshift.expose-services=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, GIT_BRANCH=main, release=1764794109, vcs-type=git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, version=7, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, ceph=True, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: libpod-conmon-a7aae9f3543217d68a7cd7f3879eab8df30212d3933852bd5bc5e48c4dde950b.scope: Deactivated successfully.
Feb 01 09:47:48 np0005604215.localdomain sudo[298197]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:47:48 np0005604215.localdomain sudo[298197]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:48 np0005604215.localdomain sudo[298197]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:48 np0005604215.localdomain podman[298219]: 
Feb 01 09:47:48 np0005604215.localdomain podman[298219]: 2026-02-01 09:47:48.374751122 +0000 UTC m=+0.076669741 container create c91d1b8551f793249a5fabc41d1193573791bf883dc0873fc31d3e8bf12b1a31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_almeida, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, distribution-scope=public, io.openshift.tags=rhceph ceph, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, architecture=x86_64, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, release=1764794109, com.redhat.component=rhceph-container, version=7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_CLEAN=True)
Feb 01 09:47:48 np0005604215.localdomain sudo[298227]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:47:48 np0005604215.localdomain sudo[298227]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:48 np0005604215.localdomain sudo[298227]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: Started libpod-conmon-c91d1b8551f793249a5fabc41d1193573791bf883dc0873fc31d3e8bf12b1a31.scope.
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:47:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83f95aabb4684321a51913ff627785e4f9ef9e2e575c9fcc7c90d932b9524c90/merged/tmp/config supports timestamps until 2038 (0x7fffffff)
Feb 01 09:47:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83f95aabb4684321a51913ff627785e4f9ef9e2e575c9fcc7c90d932b9524c90/merged/tmp/keyring supports timestamps until 2038 (0x7fffffff)
Feb 01 09:47:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83f95aabb4684321a51913ff627785e4f9ef9e2e575c9fcc7c90d932b9524c90/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 09:47:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/83f95aabb4684321a51913ff627785e4f9ef9e2e575c9fcc7c90d932b9524c90/merged/var/lib/ceph/mon/ceph-np0005604215 supports timestamps until 2038 (0x7fffffff)
Feb 01 09:47:48 np0005604215.localdomain podman[298219]: 2026-02-01 09:47:48.34433958 +0000 UTC m=+0.046258199 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:47:48 np0005604215.localdomain podman[298219]: 2026-02-01 09:47:48.443842196 +0000 UTC m=+0.145760825 container init c91d1b8551f793249a5fabc41d1193573791bf883dc0873fc31d3e8bf12b1a31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_almeida, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, version=7, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, RELEASE=main, architecture=x86_64, GIT_BRANCH=main, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, ceph=True, vendor=Red Hat, Inc., release=1764794109)
Feb 01 09:47:48 np0005604215.localdomain podman[298219]: 2026-02-01 09:47:48.456597734 +0000 UTC m=+0.158516353 container start c91d1b8551f793249a5fabc41d1193573791bf883dc0873fc31d3e8bf12b1a31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_almeida, GIT_BRANCH=main, CEPH_POINT_RELEASE=, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, version=7, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, release=1764794109, io.openshift.expose-services=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, name=rhceph, vcs-type=git, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public)
Feb 01 09:47:48 np0005604215.localdomain podman[298219]: 2026-02-01 09:47:48.456868893 +0000 UTC m=+0.158787522 container attach c91d1b8551f793249a5fabc41d1193573791bf883dc0873fc31d3e8bf12b1a31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_almeida, vcs-type=git, distribution-scope=public, release=1764794109, GIT_BRANCH=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:47:48 np0005604215.localdomain sudo[298252]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:47:48 np0005604215.localdomain sudo[298252]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:48 np0005604215.localdomain sudo[298252]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:48 np0005604215.localdomain sudo[298276]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:48 np0005604215.localdomain sudo[298276]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:48 np0005604215.localdomain sudo[298276]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: libpod-c91d1b8551f793249a5fabc41d1193573791bf883dc0873fc31d3e8bf12b1a31.scope: Deactivated successfully.
Feb 01 09:47:48 np0005604215.localdomain podman[298219]: 2026-02-01 09:47:48.54972777 +0000 UTC m=+0.251646419 container died c91d1b8551f793249a5fabc41d1193573791bf883dc0873fc31d3e8bf12b1a31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_almeida, vcs-type=git, ceph=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, architecture=x86_64, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=7)
Feb 01 09:47:48 np0005604215.localdomain sudo[298316]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:48 np0005604215.localdomain sudo[298316]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:48 np0005604215.localdomain sudo[298316]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:48 np0005604215.localdomain podman[298315]: 2026-02-01 09:47:48.647900923 +0000 UTC m=+0.085800666 container remove c91d1b8551f793249a5fabc41d1193573791bf883dc0873fc31d3e8bf12b1a31 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=reverent_almeida, io.openshift.expose-services=, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, GIT_CLEAN=True, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, com.redhat.component=rhceph-container, name=rhceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True)
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: libpod-conmon-c91d1b8551f793249a5fabc41d1193573791bf883dc0873fc31d3e8bf12b1a31.scope: Deactivated successfully.
Feb 01 09:47:48 np0005604215.localdomain sudo[298346]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:47:48 np0005604215.localdomain systemd-rc-local-generator[298390]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:47:48 np0005604215.localdomain systemd-sysv-generator[298395]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-6c185dde61299f3aeca0de7063e02a902e7d92abbfe51bc200c13c432d55b3e8-merged.mount: Deactivated successfully.
Feb 01 09:47:49 np0005604215.localdomain sudo[298346]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:49 np0005604215.localdomain sudo[298346]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: Reloading.
Feb 01 09:47:49 np0005604215.localdomain sudo[298425]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:49 np0005604215.localdomain systemd-rc-local-generator[298465]: /etc/rc.d/rc.local is not marked executable, skipping.
Feb 01 09:47:49 np0005604215.localdomain systemd-sysv-generator[298468]: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtsecretd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtqemud.service:25: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtproxyd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnodedevd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/insights-client-boot.service:24: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon.
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtstoraged.service:20: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnwfilterd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtnetworkd.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: /usr/lib/systemd/system/virtinterfaced.service:18: Failed to parse service type, ignoring: notify-reload
Feb 01 09:47:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:49.340 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:47:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:49.359 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:47:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:47:49.359 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:47:49 np0005604215.localdomain sudo[298425]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:49 np0005604215.localdomain sudo[298425]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: Starting Ceph mon.np0005604215 for 33fac0b9-80c7-560f-918a-c92d3021ca1e...
Feb 01 09:47:49 np0005604215.localdomain sudo[298480]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:49 np0005604215.localdomain sudo[298480]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:49 np0005604215.localdomain sudo[298480]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:49 np0005604215.localdomain sudo[298512]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:49 np0005604215.localdomain sudo[298512]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:49 np0005604215.localdomain sudo[298512]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:49 np0005604215.localdomain podman[298568]: 
Feb 01 09:47:49 np0005604215.localdomain podman[298568]: 2026-02-01 09:47:49.869743685 +0000 UTC m=+0.079870172 container create 06016fd8ed9ea17ea04edc1114fee86a5099952604beba493d4c6ae63ee431e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604215, architecture=x86_64, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, GIT_BRANCH=main, vcs-type=git, io.openshift.expose-services=, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, name=rhceph, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc., build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, release=1764794109)
Feb 01 09:47:49 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0296bb360dd52703e2f6e172cd39b61af6d54393dd30e9b797ae2b34c6c8938/merged/etc/ceph/ceph.conf supports timestamps until 2038 (0x7fffffff)
Feb 01 09:47:49 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0296bb360dd52703e2f6e172cd39b61af6d54393dd30e9b797ae2b34c6c8938/merged/var/log/ceph supports timestamps until 2038 (0x7fffffff)
Feb 01 09:47:49 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0296bb360dd52703e2f6e172cd39b61af6d54393dd30e9b797ae2b34c6c8938/merged/var/lib/ceph/crash supports timestamps until 2038 (0x7fffffff)
Feb 01 09:47:49 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f0296bb360dd52703e2f6e172cd39b61af6d54393dd30e9b797ae2b34c6c8938/merged/var/lib/ceph/mon/ceph-np0005604215 supports timestamps until 2038 (0x7fffffff)
Feb 01 09:47:49 np0005604215.localdomain podman[298568]: 2026-02-01 09:47:49.837237777 +0000 UTC m=+0.047364284 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:47:49 np0005604215.localdomain podman[298568]: 2026-02-01 09:47:49.937009271 +0000 UTC m=+0.147135768 container init 06016fd8ed9ea17ea04edc1114fee86a5099952604beba493d4c6ae63ee431e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604215, com.redhat.component=rhceph-container, description=Red Hat Ceph Storage 7, RELEASE=main, io.openshift.tags=rhceph ceph, name=rhceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.k8s.description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.expose-services=, release=1764794109, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main)
Feb 01 09:47:49 np0005604215.localdomain podman[298568]: 2026-02-01 09:47:49.945506447 +0000 UTC m=+0.155632944 container start 06016fd8ed9ea17ea04edc1114fee86a5099952604beba493d4c6ae63ee431e1 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mon-np0005604215, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, ceph=True, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, version=7, GIT_BRANCH=main)
Feb 01 09:47:49 np0005604215.localdomain bash[298568]: 06016fd8ed9ea17ea04edc1114fee86a5099952604beba493d4c6ae63ee431e1
Feb 01 09:47:49 np0005604215.localdomain systemd[1]: Started Ceph mon.np0005604215 for 33fac0b9-80c7-560f-918a-c92d3021ca1e.
Feb 01 09:47:49 np0005604215.localdomain sudo[298581]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:47:49 np0005604215.localdomain sudo[298581]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:49 np0005604215.localdomain sudo[298581]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:49 np0005604215.localdomain ceph-mon[298604]: set uid:gid to 167:167 (ceph:ceph)
Feb 01 09:47:49 np0005604215.localdomain ceph-mon[298604]: ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable), process ceph-mon, pid 2
Feb 01 09:47:49 np0005604215.localdomain ceph-mon[298604]: pidfile_write: ignore empty --pid-file
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: load: jerasure load: lrc 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: RocksDB version: 7.9.2
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Git sha 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Compile date 2025-09-23 00:00:00
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: DB SUMMARY
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: DB Session ID:  HRI08R8OB38WGRLS0V9F
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: CURRENT file:  CURRENT
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: IDENTITY file:  IDENTITY
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: MANIFEST file:  MANIFEST-000005 size: 59 Bytes
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: SST files in /var/lib/ceph/mon/ceph-np0005604215/store.db dir, Total Num: 0, files: 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Write Ahead Log file in /var/lib/ceph/mon/ceph-np0005604215/store.db: 000004.log size: 636 ; 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                         Options.error_if_exists: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                       Options.create_if_missing: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                         Options.paranoid_checks: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.flush_verify_memtable_count: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                               Options.track_and_verify_wals_in_manifest: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:        Options.verify_sst_unique_id_in_manifest: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                                     Options.env: 0x562ae5f5a9e0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                                      Options.fs: PosixFileSystem
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                                Options.info_log: 0x562ae8602d20
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                Options.max_file_opening_threads: 16
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                              Options.statistics: (nil)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                               Options.use_fsync: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                       Options.max_log_file_size: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                  Options.max_manifest_file_size: 1073741824
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                   Options.log_file_time_to_roll: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                       Options.keep_log_file_num: 1000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                    Options.recycle_log_file_num: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                         Options.allow_fallocate: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                        Options.allow_mmap_reads: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                       Options.allow_mmap_writes: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                        Options.use_direct_reads: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                        Options.use_direct_io_for_flush_and_compaction: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:          Options.create_missing_column_families: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                              Options.db_log_dir: 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                                 Options.wal_dir: 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                Options.table_cache_numshardbits: 6
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                         Options.WAL_ttl_seconds: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                       Options.WAL_size_limit_MB: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                        Options.max_write_batch_group_size_bytes: 1048576
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.manifest_preallocation_size: 4194304
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                     Options.is_fd_close_on_exec: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                   Options.advise_random_on_open: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                    Options.db_write_buffer_size: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                    Options.write_buffer_manager: 0x562ae8613540
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.access_hint_on_compaction_start: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:           Options.random_access_max_buffer_size: 1048576
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                      Options.use_adaptive_mutex: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                            Options.rate_limiter: (nil)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:     Options.sst_file_manager.rate_bytes_per_sec: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                       Options.wal_recovery_mode: 2
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                  Options.enable_thread_tracking: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                  Options.enable_pipelined_write: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                  Options.unordered_write: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.allow_concurrent_memtable_write: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:      Options.enable_write_thread_adaptive_yield: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.write_thread_max_yield_usec: 100
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:            Options.write_thread_slow_yield_usec: 3
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                               Options.row_cache: None
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                              Options.wal_filter: None
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.avoid_flush_during_recovery: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.allow_ingest_behind: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.two_write_queues: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.manual_wal_flush: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.wal_compression: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.atomic_flush: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.avoid_unnecessary_blocking_io: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                 Options.persist_stats_to_disk: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                 Options.write_dbid_to_manifest: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                 Options.log_readahead_size: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                 Options.file_checksum_gen_factory: Unknown
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                 Options.best_efforts_recovery: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                Options.max_bgerror_resume_count: 2147483647
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:            Options.bgerror_resume_retry_interval: 1000000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.allow_data_in_errors: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.db_host_id: __hostname__
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.enforce_single_del_contracts: true
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.max_background_jobs: 2
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.max_background_compactions: -1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.max_subcompactions: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.avoid_flush_during_shutdown: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:           Options.writable_file_max_buffer_size: 1048576
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.delayed_write_rate : 16777216
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.max_total_wal_size: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.delete_obsolete_files_period_micros: 21600000000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                   Options.stats_dump_period_sec: 600
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                 Options.stats_persist_period_sec: 600
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                 Options.stats_history_buffer_size: 1048576
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                          Options.max_open_files: -1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                          Options.bytes_per_sync: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                      Options.wal_bytes_per_sync: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                   Options.strict_bytes_per_sync: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:       Options.compaction_readahead_size: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                  Options.max_background_flushes: -1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Compression algorithms supported:
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         kZSTD supported: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         kXpressCompression supported: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         kBZip2Compression supported: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         kZSTDNotFinalCompression supported: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         kLZ4Compression supported: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         kZlibCompression supported: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         kLZ4HCCompression supported: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         kSnappyCompression supported: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Fast CRC32 supported: Supported on x86
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: DMutex implementation: pthread_mutex_t
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:5527] Recovering from manifest file: /var/lib/ceph/mon/ceph-np0005604215/store.db/MANIFEST-000005
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/column_family.cc:630] --------------- Options for column family [default]:
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:               Options.comparator: leveldb.BytewiseComparator
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:           Options.merge_operator: 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:        Options.compaction_filter: None
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:        Options.compaction_filter_factory: None
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:  Options.sst_partitioner_factory: None
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.memtable_factory: SkipListFactory
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:            Options.table_factory: BlockBasedTable
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:            table_factory options:   flush_block_policy_factory: FlushBlockBySizePolicyFactory (0x562ae8602980)
                                                             cache_index_and_filter_blocks: 1
                                                             cache_index_and_filter_blocks_with_high_priority: 0
                                                             pin_l0_filter_and_index_blocks_in_cache: 0
                                                             pin_top_level_index_and_filter: 1
                                                             index_type: 0
                                                             data_block_index_type: 0
                                                             index_shortening: 1
                                                             data_block_hash_table_util_ratio: 0.750000
                                                             checksum: 4
                                                             no_block_cache: 0
                                                             block_cache: 0x562ae85ff1f0
                                                             block_cache_name: BinnedLRUCache
                                                             block_cache_options:
                                                               capacity : 536870912
                                                               num_shard_bits : 4
                                                               strict_capacity_limit : 0
                                                               high_pri_pool_ratio: 0.000
                                                             block_cache_compressed: (nil)
                                                             persistent_cache: (nil)
                                                             block_size: 4096
                                                             block_size_deviation: 10
                                                             block_restart_interval: 16
                                                             index_block_restart_interval: 1
                                                             metadata_block_size: 4096
                                                             partition_filters: 0
                                                             use_delta_encoding: 1
                                                             filter_policy: bloomfilter
                                                             whole_key_filtering: 1
                                                             verify_compression: 0
                                                             read_amp_bytes_per_bit: 0
                                                             format_version: 5
                                                             enable_index_compression: 1
                                                             block_align: 0
                                                             max_auto_readahead_size: 262144
                                                             prepopulate_block_cache: 0
                                                             initial_auto_readahead_size: 8192
                                                             num_file_reads_for_auto_readahead: 2
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:        Options.write_buffer_size: 33554432
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:  Options.max_write_buffer_number: 2
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:          Options.compression: NoCompression
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                  Options.bottommost_compression: Disabled
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:       Options.prefix_extractor: nullptr
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:   Options.memtable_insert_with_hint_prefix_extractor: nullptr
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.num_levels: 7
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:        Options.min_write_buffer_number_to_merge: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:     Options.max_write_buffer_number_to_maintain: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:     Options.max_write_buffer_size_to_maintain: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:            Options.bottommost_compression_opts.window_bits: -14
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                  Options.bottommost_compression_opts.level: 32767
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:               Options.bottommost_compression_opts.strategy: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.bottommost_compression_opts.max_dict_bytes: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.bottommost_compression_opts.zstd_max_train_bytes: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.bottommost_compression_opts.parallel_threads: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                  Options.bottommost_compression_opts.enabled: false
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.bottommost_compression_opts.max_dict_buffer_bytes: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.bottommost_compression_opts.use_zstd_dict_trainer: true
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:            Options.compression_opts.window_bits: -14
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                  Options.compression_opts.level: 32767
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:               Options.compression_opts.strategy: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.compression_opts.max_dict_bytes: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.compression_opts.zstd_max_train_bytes: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.compression_opts.use_zstd_dict_trainer: true
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.compression_opts.parallel_threads: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                  Options.compression_opts.enabled: false
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:         Options.compression_opts.max_dict_buffer_bytes: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:      Options.level0_file_num_compaction_trigger: 4
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:          Options.level0_slowdown_writes_trigger: 20
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:              Options.level0_stop_writes_trigger: 36
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                   Options.target_file_size_base: 67108864
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:             Options.target_file_size_multiplier: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                Options.max_bytes_for_level_base: 268435456
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.level_compaction_dynamic_level_bytes: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:          Options.max_bytes_for_level_multiplier: 10.000000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[0]: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[1]: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[2]: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[3]: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[4]: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[5]: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.max_bytes_for_level_multiplier_addtl[6]: 1
Feb 01 09:47:50 np0005604215.localdomain sudo[297935]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:       Options.max_sequential_skip_in_iterations: 8
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                    Options.max_compaction_bytes: 1677721600
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:   Options.ignore_max_compaction_bytes_for_input: true
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                        Options.arena_block_size: 1048576
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:   Options.soft_pending_compaction_bytes_limit: 68719476736
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:   Options.hard_pending_compaction_bytes_limit: 274877906944
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                Options.disable_auto_compactions: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                        Options.compaction_style: kCompactionStyleLevel
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                          Options.compaction_pri: kMinOverlappingRatio
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.compaction_options_universal.size_ratio: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.compaction_options_universal.min_merge_width: 2
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.compaction_options_universal.max_merge_width: 4294967295
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.compaction_options_universal.max_size_amplification_percent: 200
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.compaction_options_universal.compression_size_percent: -1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.compaction_options_universal.stop_style: kCompactionStopStyleTotalSize
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.compaction_options_fifo.max_table_files_size: 1073741824
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.compaction_options_fifo.allow_compaction: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                   Options.table_properties_collectors: 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                   Options.inplace_update_support: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                 Options.inplace_update_num_locks: 10000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:               Options.memtable_prefix_bloom_size_ratio: 0.000000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:               Options.memtable_whole_key_filtering: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:   Options.memtable_huge_page_size: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                           Options.bloom_locality: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                    Options.max_successive_merges: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                Options.optimize_filters_for_hits: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                Options.paranoid_file_checks: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                Options.force_consistency_checks: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                Options.report_bg_io_stats: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                               Options.ttl: 2592000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:          Options.periodic_compaction_seconds: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:  Options.preclude_last_level_data_seconds: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:    Options.preserve_internal_time_seconds: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                       Options.enable_blob_files: false
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                           Options.min_blob_size: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                          Options.blob_file_size: 268435456
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                   Options.blob_compression_type: NoCompression
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:          Options.enable_blob_garbage_collection: false
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:      Options.blob_garbage_collection_age_cutoff: 0.250000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.blob_garbage_collection_force_threshold: 1.000000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:          Options.blob_compaction_readahead_size: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb:                Options.blob_file_starting_level: 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: Options.experimental_mempurge_threshold: 0.000000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:5566] Recovered from manifest file:/var/lib/ceph/mon/ceph-np0005604215/store.db/MANIFEST-000005 succeeded,manifest_file_number is 5, next_file_number is 7, last_sequence is 0, log_number is 0,prev_log_number is 0,max_column_family is 0,min_log_number_to_keep is 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:5581] Column family [default] (ID 0), log number is 0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_open.cc:539] DB ID: c098c70d-588d-409e-9f3c-16c3b4da1135
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939270005681, "job": 1, "event": "recovery_started", "wal_files": [4]}
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_open.cc:1043] Recovering log #4 mode 2
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939270008160, "cf_name": "default", "job": 1, "event": "table_file_creation", "file_number": 8, "file_size": 1762, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 1, "largest_seqno": 5, "table_properties": {"data_size": 648, "index_size": 31, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 69, "raw_key_size": 115, "raw_average_key_size": 23, "raw_value_size": 526, "raw_average_value_size": 105, "num_data_blocks": 1, "num_entries": 5, "num_filter_entries": 5, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 0, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 8, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939270008382, "job": 1, "event": "recovery_finished"}
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:5047] Creating manifest 10
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000004.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_open.cc:1987] SstFileManager instance 0x562ae8626e00
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: DB pointer 0x562ae871c000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Cumulative writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 GB, 0.00 MB/s
                                                           Cumulative WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 0 writes, 0 keys, 0 commit groups, 0.0 writes per commit group, ingest: 0.00 MB, 0.00 MB/s
                                                           Interval WAL: 0 writes, 0 syncs, 0.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      1/0    1.72 KB   0.2      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Sum      1/0    1.72 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 0.0 total, 0.0 interval
                                                           Flush(GB): cumulative 0.000, interval 0.000
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Interval compaction: 0.00 GB write, 0.10 MB/s write, 0.00 GB read, 0.00 MB/s read, 0.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x562ae85ff1f0#2 capacity: 512.00 MB usage: 0.22 KB table_size: 0 occupancy: 18446744073709551615 collections: 1 last_copies: 0 last_secs: 3.2e-05 secs_since: 0
                                                           Block cache entry stats(count,size,portion): FilterBlock(1,0.11 KB,2.08616e-05%) IndexBlock(1,0.11 KB,2.08616e-05%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215 does not exist in monmap, will attempt to join an existing cluster
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: using public_addr v2:172.18.0.105:0/0 -> [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0]
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: starting mon.np0005604215 rank -1 at public addrs [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] at bind addrs [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon_data /var/lib/ceph/mon/ceph-np0005604215 fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@-1(???) e0 preinit fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@-1(synchronizing) e14 sync_obtain_latest_monmap
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@-1(synchronizing) e14 sync_obtain_latest_monmap obtained monmap e14
Feb 01 09:47:50 np0005604215.localdomain sudo[298644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:47:50 np0005604215.localdomain sudo[298644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:50 np0005604215.localdomain sudo[298644]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:50 np0005604215.localdomain sudo[298662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:47:50 np0005604215.localdomain sudo[298662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@-1(synchronizing).mds e16 new map
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@-1(synchronizing).mds e16 print_map
                                                           e16
                                                           enable_multiple, ever_enabled_multiple: 1,1
                                                           default compat: compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           legacy client fscid: 1
                                                            
                                                           Filesystem 'cephfs' (1)
                                                           fs_name        cephfs
                                                           epoch        14
                                                           flags        12 joinable allow_snaps allow_multimds_snaps
                                                           created        2026-02-01T07:59:04.480309+0000
                                                           modified        2026-02-01T09:39:55.510678+0000
                                                           tableserver        0
                                                           root        0
                                                           session_timeout        60
                                                           session_autoclose        300
                                                           max_file_size        1099511627776
                                                           required_client_features        {}
                                                           last_failure        0
                                                           last_failure_osd_epoch        79
                                                           compat        compat={},rocompat={},incompat={1=base v0.20,2=client writeable ranges,3=default file layouts on dirs,4=dir inode in separate object,5=mds uses versioned encoding,6=dirfrag is stored in omap,7=mds uses inline data,8=no anchor table,9=file layout v2,10=snaprealm v2,12=quiesce subvolumes}
                                                           max_mds        1
                                                           in        0
                                                           up        {0=26329}
                                                           failed        
                                                           damaged        
                                                           stopped        
                                                           data_pools        [6]
                                                           metadata_pool        7
                                                           inline_data        disabled
                                                           balancer        
                                                           bal_rank_mask        -1
                                                           standby_count_wanted        1
                                                           qdb_cluster        leader: 26329 members: 26329
                                                           [mds.mds.np0005604212.tkdkxt{0:26329} state up:active seq 12 addr [v2:172.18.0.106:6808/1133321306,v1:172.18.0.106:6809/1133321306] compat {c=[1],r=[1],i=[17ff]}]
                                                            
                                                            
                                                           Standby daemons:
                                                            
                                                           [mds.mds.np0005604215.rwvxvg{-1:16872} state up:standby seq 1 addr [v2:172.18.0.108:6808/2262553558,v1:172.18.0.108:6809/2262553558] compat {c=[1],r=[1],i=[17ff]}]
                                                           [mds.mds.np0005604213.jdbvyh{-1:16878} state up:standby seq 1 addr [v2:172.18.0.107:6808/3323601884,v1:172.18.0.107:6809/3323601884] compat {c=[1],r=[1],i=[17ff]}]
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@-1(synchronizing).osd e89 crush map has features 3314933000852226048, adjusting msgr requires
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@-1(synchronizing).osd e89 crush map has features 288514051259236352, adjusting msgr requires
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@-1(synchronizing).osd e89 crush map has features 288514051259236352, adjusting msgr requires
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@-1(synchronizing).osd e89 crush map has features 288514051259236352, adjusting msgr requires
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.54143 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mon.np0005604215 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mon.np0005604215 on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604212 calling monitor election
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604211.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005604215,np0005604213
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]:     stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]:     stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps']
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005604215,np0005604213
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]:     mon.np0005604212 (rank 1) addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] is down (out of quorum)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215 calling monitor election
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215 is new leader, mons np0005604215,np0005604212,np0005604213 in quorum (ranks 0,1,2)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: monmap epoch 13
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: last_changed 2026-02-01T09:46:27.712705+0000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: min_mon_release 18 (reef)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: election_strategy: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: 0: [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] mon.np0005604215
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: 1: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: 2: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: osdmap e88: 6 total, 6 up, 6 in
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mgrmap e44: np0005604211.cuflqz(active, since 20s), standbys: np0005604209.isqrps, np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005604215,np0005604213)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]:     stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]:     stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps']
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.54158 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604211.localdomain", "label": "mon", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Removed label mon from host np0005604211.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604211.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604211.cuflqz", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/4277238030' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/4277238030' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.34578 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604211.localdomain", "label": "mgr", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Removed label mgr from host np0005604211.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mgr.np0005604211.cuflqz (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mgr.np0005604211.cuflqz on np0005604211.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604211.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring crash.np0005604211 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon crash.np0005604211 on np0005604211.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.34587 -' entity='client.admin' cmd=[{"prefix": "orch host label rm", "hostname": "np0005604211.localdomain", "label": "_admin", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Removed label _admin from host np0005604211.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.1 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.4 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mon.np0005604212 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.0 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.3 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3190989409' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/992751241' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/766827414' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/329388596' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mon.np0005604213 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.44647 -' entity='client.admin' cmd=[{"prefix": "orch host drain", "hostname": "np0005604211.localdomain", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Added label _no_schedule to host np0005604211.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Added label SpecialHostLabels.DRAIN_CONF_KEYRING to host np0005604211.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/728333592' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2418149454' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.2 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.34605 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "host_pattern": "np0005604211.localdomain", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.5 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.44659 -' entity='client.admin' cmd=[{"prefix": "orch host rm", "hostname": "np0005604211.localdomain", "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain"}]': finished
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Removed host np0005604211.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 583 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 ' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.26720 172.18.0.105:0/4119751104' entity='mgr.np0005604211.cuflqz' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Activating manager daemon np0005604209.isqrps
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Manager daemon np0005604211.cuflqz is unresponsive, replacing it with standby daemon np0005604209.isqrps
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: osdmap e89: 6 total, 6 up, 6 in
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mgrmap e45: np0005604209.isqrps(active, starting, since 0.0467249s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Manager daemon np0005604209.isqrps is now available
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"}]': finished
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd='[{"prefix":"config-key del","key":"mgr/cephadm/host.np0005604211.localdomain.devices.0"}]': finished
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/mirror_snapshot_schedule"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604209.isqrps/trash_purge_schedule"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: removing stray HostCache host record np0005604211.localdomain.devices.0
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mgrmap e46: np0005604209.isqrps(active, since 1.16558s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.34614 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Saving service mon spec with placement label:mon
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Cluster is now healthy
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:47:27] ENGINE Bus STARTING
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:47:27] ENGINE Serving on http://172.18.0.200:8765
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:47:27] ENGINE Serving on https://172.18.0.200:7150
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:47:27] ENGINE Bus STARTED
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:47:27] ENGINE Client ('172.18.0.200', 32790) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mgrmap e47: np0005604209.isqrps(active, since 3s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.34656 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604215", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 0 B/s wr, 20 op/s
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.54263 -' entity='client.admin' cmd=[{"prefix": "orch daemon rm", "names": ["mon.np0005604215"], "force": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Remove daemons mon.np0005604215
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Safe to remove mon.np0005604215: new quorum should be ['np0005604212', 'np0005604213'] (from ['np0005604212', 'np0005604213'])
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Removing monitor np0005604215 from monmap...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Removing daemon mon.np0005604215 from np0005604215.localdomain -- ports []
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604212 calling monitor election
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604213 calling monitor election
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604212 is new leader, mons np0005604212,np0005604213 in quorum (ranks 0,1)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: monmap epoch 14
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: last_changed 2026-02-01T09:47:31.128772+0000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: min_mon_release 18 (reef)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: election_strategy: 1
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: osdmap e89: 6 total, 6 up, 6 in
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mgrmap e47: np0005604209.isqrps(active, since 6s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: overall HEALTH_OK
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604212.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring crash.np0005604212 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon crash.np0005604212 on np0005604212.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.1 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 15 op/s
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.4 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2207591404' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2207591404' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 12 op/s
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.0 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.3 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.2 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.5 (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)...
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/4251366698' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3912738810' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1033900095' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1600835519' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1049106932' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.54293 -' entity='client.admin' cmd=[{"prefix": "orch daemon add", "daemon_type": "mon", "placement": "np0005604215.localdomain:172.18.0.105", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2784023077' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Deploying daemon mon.np0005604215 on np0005604215.localdomain
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:47:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@-1(synchronizing).paxosservice(auth 1..38) refresh upgraded, format 0 -> 3
Feb 01 09:47:50 np0005604215.localdomain ceph-mgr[278126]: ms_deliver_dispatch: unhandled message 0x55d17e09e000 mon_map magic: 0 from mon.1 v2:172.18.0.104:3300/0
Feb 01 09:47:51 np0005604215.localdomain podman[298750]: 2026-02-01 09:47:51.034313602 +0000 UTC m=+0.094261041 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, architecture=x86_64, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_BRANCH=main, release=1764794109, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, ceph=True, vcs-type=git)
Feb 01 09:47:51 np0005604215.localdomain podman[298750]: 2026-02-01 09:47:51.134781078 +0000 UTC m=+0.194728517 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, architecture=x86_64, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_CLEAN=True, RELEASE=main, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_BRANCH=main, io.buildah.version=1.41.4, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., name=rhceph, release=1764794109, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, GIT_REPO=https://github.com/ceph/ceph-container.git)
Feb 01 09:47:51 np0005604215.localdomain sudo[298662]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@-1(probing) e15  my rank is now 2 (was -1)
Feb 01 09:47:52 np0005604215.localdomain ceph-mon[298604]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:47:52 np0005604215.localdomain ceph-mon[298604]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Feb 01 09:47:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:47:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:47:53 np0005604215.localdomain podman[298873]: 2026-02-01 09:47:53.879483484 +0000 UTC m=+0.088412009 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:47:53 np0005604215.localdomain podman[298873]: 2026-02-01 09:47:53.916121181 +0000 UTC m=+0.125049716 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:47:53 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:47:54 np0005604215.localdomain ceph-mds[276952]: mds.beacon.mds.np0005604215.rwvxvg missed beacon ack from the monitors
Feb 01 09:47:55 np0005604215.localdomain sudo[298896]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:47:55 np0005604215.localdomain sudo[298896]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:55 np0005604215.localdomain sudo[298896]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:55 np0005604215.localdomain sudo[298914]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:47:55 np0005604215.localdomain sudo[298914]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:55 np0005604215.localdomain sudo[298914]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:55 np0005604215.localdomain sudo[298932]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:55 np0005604215.localdomain sudo[298932]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:55 np0005604215.localdomain sudo[298932]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:55 np0005604215.localdomain sudo[298950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:55 np0005604215.localdomain sudo[298950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:55 np0005604215.localdomain sudo[298950]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:55 np0005604215.localdomain sudo[298968]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:55 np0005604215.localdomain sudo[298968]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:55 np0005604215.localdomain sudo[298968]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:56 np0005604215.localdomain sudo[299002]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:56 np0005604215.localdomain sudo[299002]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:56 np0005604215.localdomain sudo[299002]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:56 np0005604215.localdomain sudo[299020]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:47:56 np0005604215.localdomain sudo[299020]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:56 np0005604215.localdomain sudo[299020]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:56 np0005604215.localdomain sudo[299038]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:47:56 np0005604215.localdomain sudo[299038]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:56 np0005604215.localdomain sudo[299038]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:56 np0005604215.localdomain sudo[299056]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:47:56 np0005604215.localdomain sudo[299056]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:56 np0005604215.localdomain sudo[299056]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:56 np0005604215.localdomain sudo[299074]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:47:56 np0005604215.localdomain sudo[299074]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:56 np0005604215.localdomain sudo[299074]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:56 np0005604215.localdomain sudo[299092]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:56 np0005604215.localdomain sudo[299092]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:56 np0005604215.localdomain sudo[299092]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:56 np0005604215.localdomain sudo[299110]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:56 np0005604215.localdomain sudo[299110]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:56 np0005604215.localdomain sudo[299110]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:56 np0005604215.localdomain sudo[299128]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:56 np0005604215.localdomain sudo[299128]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:56 np0005604215.localdomain sudo[299128]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:56 np0005604215.localdomain sshd[295371]: Received disconnect from 192.168.122.11 port 33144:11: disconnected by user
Feb 01 09:47:56 np0005604215.localdomain sshd[295371]: Disconnected from user tripleo-admin 192.168.122.11 port 33144
Feb 01 09:47:56 np0005604215.localdomain sshd[295338]: pam_unix(sshd:session): session closed for user tripleo-admin
Feb 01 09:47:56 np0005604215.localdomain systemd[1]: session-71.scope: Deactivated successfully.
Feb 01 09:47:56 np0005604215.localdomain systemd[1]: session-71.scope: Consumed 1.694s CPU time.
Feb 01 09:47:56 np0005604215.localdomain systemd-logind[761]: Session 71 logged out. Waiting for processes to exit.
Feb 01 09:47:56 np0005604215.localdomain systemd-logind[761]: Removed session 71.
Feb 01 09:47:56 np0005604215.localdomain sudo[299162]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:56 np0005604215.localdomain sudo[299162]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:56 np0005604215.localdomain sudo[299162]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:56 np0005604215.localdomain sudo[299180]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:47:56 np0005604215.localdomain sudo[299180]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:56 np0005604215.localdomain sudo[299180]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:56 np0005604215.localdomain sudo[299198]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:56 np0005604215.localdomain sudo[299198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:56 np0005604215.localdomain sudo[299198]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:57 np0005604215.localdomain sudo[299216]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:47:57 np0005604215.localdomain sudo[299216]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:47:57 np0005604215.localdomain sudo[299216]: pam_unix(sudo:session): session closed for user root
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604212 calling monitor election
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604213 calling monitor election
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604212 is new leader, mons np0005604212,np0005604213 in quorum (ranks 0,1)
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: monmap epoch 15
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: last_changed 2026-02-01T09:47:50.388496+0000
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: min_mon_release 18 (reef)
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: election_strategy: 1
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604215
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: osdmap e89: 6 total, 6 up, 6 in
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: mgrmap e47: np0005604209.isqrps(active, since 31s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: Health check failed: 1/3 mons down, quorum np0005604212,np0005604213 (MON_DOWN)
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm; 1/3 mons down, quorum np0005604212,np0005604213
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]:     stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]:     stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps']
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: [WRN] MON_DOWN: 1/3 mons down, quorum np0005604212,np0005604213
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]:     mon.np0005604215 (rank 2) addr [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] is down (out of quorum)
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: log_channel(cluster) log [INF] : mon.np0005604215 calling monitor election
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: paxos.2).electionLogic(0) init, first boot, initializing epoch at 1 
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(electing) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={4=support erasure code pools,5=new-style osdmap encoding,6=support isa/lrc erasure code,7=support shec erasure code}
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 _apply_compatset_features enabling new quorum features: compat={},rocompat={},incompat={8=support monmap features,9=luminous ondisk layout,10=mimic ondisk layout,11=nautilus ondisk layout,12=octopus ondisk layout,13=pacific ondisk layout,14=quincy ondisk layout,15=reef ondisk layout}
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 collect_metadata vda:  no unique device id for vda: fallback method has no model nor serial
Feb 01 09:47:57 np0005604215.localdomain ceph-mon[298604]: mgrc update_daemon_metadata mon.np0005604215 metadata {addrs=[v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0],arch=x86_64,ceph_release=reef,ceph_version=ceph version 18.2.1-361.el9cp (439dcd6094d413840eb2ec590fe2194ec616687f) reef (stable),ceph_version_short=18.2.1-361.el9cp,compression_algorithms=none, snappy, zlib, zstd, lz4,container_hostname=np0005604215.localdomain,container_image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest,cpu=AMD EPYC-Rome Processor,device_ids=,device_paths=vda=/dev/disk/by-path/pci-0000:00:04.0,devices=vda,distro=rhel,distro_description=Red Hat Enterprise Linux 9.7 (Plow),distro_version=9.7,hostname=np0005604215.localdomain,kernel_description=#1 SMP PREEMPT_DYNAMIC Wed Apr 12 10:45:03 EDT 2023,kernel_version=5.14.0-284.11.1.el9_2.x86_64,mem_swap_kb=1048572,mem_total_kb=16116604,os=Linux}
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215 calling monitor election
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604212 calling monitor election
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215 calling monitor election
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604213 calling monitor election
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604212 is new leader, mons np0005604212,np0005604213,np0005604215 in quorum (ranks 0,1,2)
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: monmap epoch 15
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: last_changed 2026-02-01T09:47:50.388496+0000
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: created 2026-02-01T07:37:52.883666+0000
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: min_mon_release 18 (reef)
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: election_strategy: 1
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: 0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: 1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: 2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604215
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: fsmap cephfs:1 {0=mds.np0005604212.tkdkxt=up:active} 2 up:standby
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: osdmap e89: 6 total, 6 up, 6 in
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: mgrmap e47: np0005604209.isqrps(active, since 33s), standbys: np0005604213.caiaeh, np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: Health check cleared: MON_DOWN (was: 1/3 mons down, quorum np0005604212,np0005604213)
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: Health detail: HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: [WRN] CEPHADM_STRAY_DAEMON: 1 stray daemon(s) not managed by cephadm
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]:     stray daemon mgr.np0005604209.isqrps on host np0005604209.localdomain not managed by cephadm
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]: [WRN] CEPHADM_STRAY_HOST: 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:47:58 np0005604215.localdomain ceph-mon[298604]:     stray host np0005604209.localdomain has 1 stray daemons: ['mgr.np0005604209.isqrps']
Feb 01 09:47:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:47:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:47:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:47:59 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.1 (monmap changed)...
Feb 01 09:47:59 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:47:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:48:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:48:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:48:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:48:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:48:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:48:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17780 "" "Go-http-client/1.1"
Feb 01 09:48:00 np0005604215.localdomain ceph-mon[298604]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:00 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:00 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:00 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:48:00 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:00 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.4 (monmap changed)...
Feb 01 09:48:00 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604212.tkdkxt", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mds.mds.np0005604212.tkdkxt (monmap changed)...
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mds.mds.np0005604212.tkdkxt on np0005604212.localdomain
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604212.oynhpm", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:48:01 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:48:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:48:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:48:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:48:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:48:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:48:02 np0005604215.localdomain ceph-mon[298604]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:02 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mgr.np0005604212.oynhpm (monmap changed)...
Feb 01 09:48:02 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mgr.np0005604212.oynhpm on np0005604212.localdomain
Feb 01 09:48:02 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.200:0/2790969842' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 01 09:48:02 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:02 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:02 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604213.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:48:02 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: Reconfiguring crash.np0005604213 (monmap changed)...
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon crash.np0005604213 on np0005604213.localdomain
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.0"} : dispatch
Feb 01 09:48:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: from='client.54301 -' entity='client.admin' cmd=[{"prefix": "orch", "action": "reconfig", "service_name": "osd.default_drive_group", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: Reconfig service osd.default_drive_group
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.0 (monmap changed)...
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.0 on np0005604213.localdomain
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' 
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: from='mgr.27041 172.18.0.200:0/1452582069' entity='mgr.np0005604209.isqrps' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr fail"} v 0)
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/1659607' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e89 _set_cache_ratios kv ratio 0.25 inc ratio 0.375 full ratio 0.375
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e89 register_cache_with_pcm pcm target: 2147483648 pcm max: 1020054732 pcm min: 134217728 inc_osd_cache size: 1
Feb 01 09:48:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 e90: 6 total, 6 up, 6 in
Feb 01 09:48:04 np0005604215.localdomain sshd[295936]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:48:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:48:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:48:04 np0005604215.localdomain systemd[1]: session-73.scope: Deactivated successfully.
Feb 01 09:48:04 np0005604215.localdomain systemd[1]: session-73.scope: Consumed 18.359s CPU time.
Feb 01 09:48:04 np0005604215.localdomain systemd-logind[761]: Session 73 logged out. Waiting for processes to exit.
Feb 01 09:48:04 np0005604215.localdomain systemd-logind[761]: Removed session 73.
Feb 01 09:48:04 np0005604215.localdomain podman[299234]: 2026-02-01 09:48:04.742181143 +0000 UTC m=+0.086723276 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Feb 01 09:48:04 np0005604215.localdomain podman[299234]: 2026-02-01 09:48:04.783649901 +0000 UTC m=+0.128192044 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 01 09:48:04 np0005604215.localdomain systemd[1]: tmp-crun.qlTGh4.mount: Deactivated successfully.
Feb 01 09:48:04 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:48:04 np0005604215.localdomain podman[299235]: 2026-02-01 09:48:04.806640951 +0000 UTC m=+0.146884930 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:48:04 np0005604215.localdomain podman[299235]: 2026-02-01 09:48:04.84302458 +0000 UTC m=+0.183268559 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:48:04 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:48:04 np0005604215.localdomain sshd[299283]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:48:04 np0005604215.localdomain sshd[299283]: Accepted publickey for ceph-admin from 192.168.122.107 port 56606 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 09:48:05 np0005604215.localdomain systemd-logind[761]: New session 74 of user ceph-admin.
Feb 01 09:48:05 np0005604215.localdomain systemd[1]: Started Session 74 of User ceph-admin.
Feb 01 09:48:05 np0005604215.localdomain sshd[299283]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1019510607 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:48:05 np0005604215.localdomain sudo[299287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:48:05 np0005604215.localdomain sudo[299287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:05 np0005604215.localdomain sudo[299287]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:05 np0005604215.localdomain sudo[299305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:48:05 np0005604215.localdomain sudo[299305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.3 (monmap changed)...
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: Activating manager daemon np0005604213.caiaeh
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.200:0/1659607' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: osdmap e90: 6 total, 6 up, 6 in
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: mgrmap e48: np0005604213.caiaeh(active, starting, since 0.0397891s), standbys: np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: Manager daemon np0005604213.caiaeh is now available
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/mirror_snapshot_schedule"} : dispatch
Feb 01 09:48:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604213.caiaeh/trash_purge_schedule"} : dispatch
Feb 01 09:48:06 np0005604215.localdomain podman[299392]: 2026-02-01 09:48:06.062618991 +0000 UTC m=+0.102611324 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, RELEASE=main, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, distribution-scope=public, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, vendor=Red Hat, Inc., release=1764794109, GIT_BRANCH=main)
Feb 01 09:48:06 np0005604215.localdomain podman[299392]: 2026-02-01 09:48:06.258858224 +0000 UTC m=+0.298850567 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, release=1764794109, GIT_BRANCH=main, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, io.openshift.tags=rhceph ceph, distribution-scope=public, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:48:06 np0005604215.localdomain ceph-mon[298604]: mgrmap e49: np0005604213.caiaeh(active, since 1.06946s), standbys: np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:48:06 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:48:05] ENGINE Bus STARTING
Feb 01 09:48:06 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:48:06] ENGINE Serving on https://172.18.0.107:7150
Feb 01 09:48:06 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:48:06] ENGINE Client ('172.18.0.107', 42754) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:48:06 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:48:06] ENGINE Serving on http://172.18.0.107:8765
Feb 01 09:48:06 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:48:06] ENGINE Bus STARTED
Feb 01 09:48:06 np0005604215.localdomain systemd[1]: Stopping User Manager for UID 1003...
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Activating special unit Exit the Session...
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Stopped target Main User Target.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Stopped target Basic System.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Stopped target Paths.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Stopped target Sockets.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Stopped target Timers.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Closed D-Bus User Message Bus Socket.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Stopped Create User's Volatile Files and Directories.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Removed slice User Application Slice.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Reached target Shutdown.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Finished Exit the Session.
Feb 01 09:48:06 np0005604215.localdomain systemd[295356]: Reached target Exit the Session.
Feb 01 09:48:06 np0005604215.localdomain systemd[1]: user@1003.service: Deactivated successfully.
Feb 01 09:48:06 np0005604215.localdomain systemd[1]: Stopped User Manager for UID 1003.
Feb 01 09:48:06 np0005604215.localdomain systemd[1]: Stopping User Runtime Directory /run/user/1003...
Feb 01 09:48:06 np0005604215.localdomain systemd[1]: run-user-1003.mount: Deactivated successfully.
Feb 01 09:48:06 np0005604215.localdomain systemd[1]: user-runtime-dir@1003.service: Deactivated successfully.
Feb 01 09:48:06 np0005604215.localdomain systemd[1]: Stopped User Runtime Directory /run/user/1003.
Feb 01 09:48:06 np0005604215.localdomain systemd[1]: Removed slice User Slice of UID 1003.
Feb 01 09:48:06 np0005604215.localdomain systemd[1]: user-1003.slice: Consumed 2.151s CPU time.
Feb 01 09:48:06 np0005604215.localdomain sudo[299305]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:06 np0005604215.localdomain sudo[299513]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:48:06 np0005604215.localdomain sudo[299513]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:06 np0005604215.localdomain sudo[299513]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:07 np0005604215.localdomain sudo[299531]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:48:07 np0005604215.localdomain sudo[299531]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:07 np0005604215.localdomain ceph-mon[298604]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 01 09:48:07 np0005604215.localdomain ceph-mon[298604]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 01 09:48:07 np0005604215.localdomain ceph-mon[298604]: Cluster is now healthy
Feb 01 09:48:07 np0005604215.localdomain ceph-mon[298604]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:07 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:07 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:07 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:07 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:07 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:07 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:07 np0005604215.localdomain ceph-mon[298604]: mgrmap e50: np0005604213.caiaeh(active, since 2s), standbys: np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:48:07 np0005604215.localdomain sudo[299531]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:07 np0005604215.localdomain sudo[299580]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:48:07 np0005604215.localdomain sudo[299580]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:07 np0005604215.localdomain sudo[299580]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:07 np0005604215.localdomain sudo[299598]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:48:07 np0005604215.localdomain sudo[299598]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:08 np0005604215.localdomain sudo[299598]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:08 np0005604215.localdomain sudo[299634]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:48:08 np0005604215.localdomain sudo[299634]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:08 np0005604215.localdomain sudo[299634]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:08 np0005604215.localdomain sudo[299652]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:48:08 np0005604215.localdomain sudo[299652]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:08 np0005604215.localdomain sudo[299652]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:08 np0005604215.localdomain sudo[299670]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:48:08 np0005604215.localdomain sudo[299670]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:08 np0005604215.localdomain sudo[299670]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:08 np0005604215.localdomain sudo[299688]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:48:08 np0005604215.localdomain sudo[299688]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:08 np0005604215.localdomain sudo[299688]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:08 np0005604215.localdomain sudo[299706]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:48:08 np0005604215.localdomain sudo[299706]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:08 np0005604215.localdomain sudo[299706]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:08 np0005604215.localdomain sudo[299740]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:48:08 np0005604215.localdomain sudo[299740]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:08 np0005604215.localdomain sudo[299740]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain sudo[299758]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:48:09 np0005604215.localdomain sudo[299758]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain sudo[299758]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain sudo[299776]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:48:09 np0005604215.localdomain sudo[299776]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain sudo[299776]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain sudo[299794]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:48:09 np0005604215.localdomain sudo[299794]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain sudo[299794]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:48:09 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:48:09 np0005604215.localdomain sudo[299812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:48:09 np0005604215.localdomain sudo[299812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain sudo[299812]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain sudo[299830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:48:09 np0005604215.localdomain sudo[299830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain sudo[299830]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain sudo[299848]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:48:09 np0005604215.localdomain sudo[299848]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain sudo[299848]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain sudo[299866]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:48:09 np0005604215.localdomain sudo[299866]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain sudo[299866]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain sudo[299900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:48:09 np0005604215.localdomain sudo[299900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain sudo[299900]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain sudo[299918]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:48:09 np0005604215.localdomain sudo[299918]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain sudo[299918]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain sudo[299936]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:48:09 np0005604215.localdomain sudo[299936]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain sudo[299936]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain sudo[299954]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:48:09 np0005604215.localdomain sudo[299954]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain sudo[299954]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:09 np0005604215.localdomain sudo[299972]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:48:09 np0005604215.localdomain sudo[299972]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:48:09 np0005604215.localdomain sudo[299972]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020040841 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:48:10 np0005604215.localdomain sudo[299991]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:48:10 np0005604215.localdomain sudo[299991]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:10 np0005604215.localdomain sudo[299991]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:48:10 np0005604215.localdomain podman[299990]: 2026-02-01 09:48:10.062428249 +0000 UTC m=+0.088621346 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, version=9.7, architecture=x86_64, build-date=2026-01-22T05:09:47Z, vcs-type=git, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., distribution-scope=public, container_name=openstack_network_exporter, release=1769056855, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9)
Feb 01 09:48:10 np0005604215.localdomain podman[299990]: 2026-02-01 09:48:10.107779909 +0000 UTC m=+0.133973046 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.openshift.expose-services=, config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Feb 01 09:48:10 np0005604215.localdomain sudo[300025]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:48:10 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:48:10 np0005604215.localdomain sudo[300025]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:10 np0005604215.localdomain sudo[300025]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain sudo[300058]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:48:10 np0005604215.localdomain sudo[300058]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:10 np0005604215.localdomain sudo[300058]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain podman[300031]: 2026-02-01 09:48:10.21197109 +0000 UTC m=+0.139732085 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:48:10 np0005604215.localdomain podman[300031]: 2026-02-01 09:48:10.216773411 +0000 UTC m=+0.144534376 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 01 09:48:10 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:48:10 np0005604215.localdomain ceph-mon[298604]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:10 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:48:10 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:48:10 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:48:10 np0005604215.localdomain ceph-mon[298604]: mgrmap e51: np0005604213.caiaeh(active, since 4s), standbys: np0005604215.uhhqtv, np0005604212.oynhpm
Feb 01 09:48:10 np0005604215.localdomain ceph-mon[298604]: Standby manager daemon np0005604209.isqrps started
Feb 01 09:48:10 np0005604215.localdomain sudo[300099]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:48:10 np0005604215.localdomain sudo[300099]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:10 np0005604215.localdomain sudo[300099]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain sudo[300117]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:48:10 np0005604215.localdomain sudo[300117]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:10 np0005604215.localdomain sudo[300117]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain sudo[300135]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 01 09:48:10 np0005604215.localdomain sudo[300135]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:10 np0005604215.localdomain sudo[300135]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain sudo[300153]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:48:10 np0005604215.localdomain sudo[300153]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:10 np0005604215.localdomain sudo[300153]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain sudo[300171]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:48:10 np0005604215.localdomain sudo[300171]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:10 np0005604215.localdomain sudo[300171]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain sudo[300189]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:48:10 np0005604215.localdomain sudo[300189]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:10 np0005604215.localdomain sudo[300189]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain sudo[300207]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:48:10 np0005604215.localdomain sudo[300207]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:10 np0005604215.localdomain sudo[300207]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain sudo[300225]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:48:10 np0005604215.localdomain sudo[300225]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:10 np0005604215.localdomain sudo[300225]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:10 np0005604215.localdomain sudo[300259]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:48:10 np0005604215.localdomain sudo[300259]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:11 np0005604215.localdomain sudo[300259]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:11 np0005604215.localdomain sudo[300277]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:48:11 np0005604215.localdomain sudo[300277]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:11 np0005604215.localdomain sudo[300277]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:11 np0005604215.localdomain sudo[300295]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:48:11 np0005604215.localdomain sudo[300295]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:11 np0005604215.localdomain sudo[300295]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: mgrmap e52: np0005604213.caiaeh(active, since 5s), standbys: np0005604215.uhhqtv, np0005604212.oynhpm, np0005604209.isqrps
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:48:11 np0005604215.localdomain sudo[300313]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:48:11 np0005604215.localdomain sudo[300313]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:11 np0005604215.localdomain sudo[300313]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:12 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:48:12 np0005604215.localdomain ceph-mon[298604]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:12 np0005604215.localdomain ceph-mon[298604]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 0 B/s wr, 20 op/s
Feb 01 09:48:12 np0005604215.localdomain ceph-mon[298604]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 01 09:48:12 np0005604215.localdomain ceph-mon[298604]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 01 09:48:12 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.1"} : dispatch
Feb 01 09:48:12 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:13 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.1 on np0005604212.localdomain
Feb 01 09:48:13 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:13 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:13 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:13 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:13 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.4"} : dispatch
Feb 01 09:48:13 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:14 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.4 on np0005604212.localdomain
Feb 01 09:48:14 np0005604215.localdomain ceph-mon[298604]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 15 op/s
Feb 01 09:48:14 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:14 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:14 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:14 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:14 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.3"} : dispatch
Feb 01 09:48:14 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054374 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:48:15 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.3 (monmap changed)...
Feb 01 09:48:15 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.3 on np0005604213.localdomain
Feb 01 09:48:15 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:15 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:15 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:15 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:15 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:15 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604213.jdbvyh", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:48:15 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:16 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mds.mds.np0005604213.jdbvyh (monmap changed)...
Feb 01 09:48:16 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mds.mds.np0005604213.jdbvyh on np0005604213.localdomain
Feb 01 09:48:16 np0005604215.localdomain ceph-mon[298604]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 0 B/s wr, 11 op/s
Feb 01 09:48:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604213.caiaeh", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:48:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:48:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:16 np0005604215.localdomain sudo[300331]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:48:16 np0005604215.localdomain sudo[300331]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:16 np0005604215.localdomain sudo[300331]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:16 np0005604215.localdomain sudo[300349]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:48:16 np0005604215.localdomain sudo[300349]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:17 np0005604215.localdomain podman[300384]: 
Feb 01 09:48:17 np0005604215.localdomain podman[300384]: 2026-02-01 09:48:17.203683264 +0000 UTC m=+0.074620377 container create 306a98355de5cfa494f60fdb9710420ff4bd256b373a9614be6ccaf418c295f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_lamarr, io.openshift.expose-services=, name=rhceph, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.openshift.tags=rhceph ceph, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, architecture=x86_64, distribution-scope=public, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, ceph=True, CEPH_POINT_RELEASE=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, vcs-type=git)
Feb 01 09:48:17 np0005604215.localdomain systemd[1]: Started libpod-conmon-306a98355de5cfa494f60fdb9710420ff4bd256b373a9614be6ccaf418c295f8.scope.
Feb 01 09:48:17 np0005604215.localdomain podman[300384]: 2026-02-01 09:48:17.171745495 +0000 UTC m=+0.042682628 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:48:17 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:48:17 np0005604215.localdomain podman[300384]: 2026-02-01 09:48:17.296801369 +0000 UTC m=+0.167738482 container init 306a98355de5cfa494f60fdb9710420ff4bd256b373a9614be6ccaf418c295f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_lamarr, architecture=x86_64, distribution-scope=public, ceph=True, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, name=rhceph, description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., version=7, build-date=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True)
Feb 01 09:48:17 np0005604215.localdomain podman[300384]: 2026-02-01 09:48:17.306771722 +0000 UTC m=+0.177708855 container start 306a98355de5cfa494f60fdb9710420ff4bd256b373a9614be6ccaf418c295f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_lamarr, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, version=7, GIT_BRANCH=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, name=rhceph, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, io.buildah.version=1.41.4, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, RELEASE=main, distribution-scope=public, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:48:17 np0005604215.localdomain podman[300384]: 2026-02-01 09:48:17.307099412 +0000 UTC m=+0.178036565 container attach 306a98355de5cfa494f60fdb9710420ff4bd256b373a9614be6ccaf418c295f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_lamarr, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, name=rhceph, version=7, description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, GIT_BRANCH=main, distribution-scope=public, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, RELEASE=main, release=1764794109, CEPH_POINT_RELEASE=)
Feb 01 09:48:17 np0005604215.localdomain naughty_lamarr[300399]: 167 167
Feb 01 09:48:17 np0005604215.localdomain systemd[1]: libpod-306a98355de5cfa494f60fdb9710420ff4bd256b373a9614be6ccaf418c295f8.scope: Deactivated successfully.
Feb 01 09:48:17 np0005604215.localdomain podman[300384]: 2026-02-01 09:48:17.310363014 +0000 UTC m=+0.181300137 container died 306a98355de5cfa494f60fdb9710420ff4bd256b373a9614be6ccaf418c295f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_lamarr, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, version=7, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, distribution-scope=public, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, vendor=Red Hat, Inc., ceph=True, io.openshift.expose-services=, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7)
Feb 01 09:48:17 np0005604215.localdomain podman[300404]: 2026-02-01 09:48:17.404704978 +0000 UTC m=+0.081438311 container remove 306a98355de5cfa494f60fdb9710420ff4bd256b373a9614be6ccaf418c295f8 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=naughty_lamarr, RELEASE=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, ceph=True, build-date=2025-12-08T17:28:53Z, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.tags=rhceph ceph, vcs-type=git, io.openshift.expose-services=, name=rhceph, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, version=7, GIT_BRANCH=main, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64)
Feb 01 09:48:17 np0005604215.localdomain systemd[1]: libpod-conmon-306a98355de5cfa494f60fdb9710420ff4bd256b373a9614be6ccaf418c295f8.scope: Deactivated successfully.
Feb 01 09:48:17 np0005604215.localdomain sudo[300349]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:17 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mgr.np0005604213.caiaeh (monmap changed)...
Feb 01 09:48:17 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mgr.np0005604213.caiaeh on np0005604213.localdomain
Feb 01 09:48:17 np0005604215.localdomain ceph-mon[298604]: from='client.64121 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:48:17 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:17 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:17 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "client.crash.np0005604215.localdomain", "caps": ["mon", "profile crash", "mgr", "profile crash"]} : dispatch
Feb 01 09:48:17 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:17 np0005604215.localdomain sudo[300421]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:48:17 np0005604215.localdomain sudo[300421]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:17 np0005604215.localdomain sudo[300421]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:17 np0005604215.localdomain sudo[300439]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:48:17 np0005604215.localdomain sudo[300439]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:18 np0005604215.localdomain podman[300474]: 
Feb 01 09:48:18 np0005604215.localdomain podman[300474]: 2026-02-01 09:48:18.110813303 +0000 UTC m=+0.075536636 container create 9792d942921930fcd1de432405f1a21fa2b2a02635540e3ad19746a59e94a022 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_bose, io.openshift.expose-services=, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, CEPH_POINT_RELEASE=, build-date=2025-12-08T17:28:53Z, version=7, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 09:48:18 np0005604215.localdomain systemd[1]: Started libpod-conmon-9792d942921930fcd1de432405f1a21fa2b2a02635540e3ad19746a59e94a022.scope.
Feb 01 09:48:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:48:18 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:48:18 np0005604215.localdomain podman[300474]: 2026-02-01 09:48:18.174825417 +0000 UTC m=+0.139548760 container init 9792d942921930fcd1de432405f1a21fa2b2a02635540e3ad19746a59e94a022 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_bose, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, ceph=True, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, version=7, RELEASE=main, com.redhat.component=rhceph-container, GIT_CLEAN=True, vcs-type=git, release=1764794109, vendor=Red Hat, Inc., GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:48:18 np0005604215.localdomain podman[300474]: 2026-02-01 09:48:18.08103324 +0000 UTC m=+0.045756613 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:48:18 np0005604215.localdomain podman[300474]: 2026-02-01 09:48:18.184961614 +0000 UTC m=+0.149684947 container start 9792d942921930fcd1de432405f1a21fa2b2a02635540e3ad19746a59e94a022 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_bose, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-type=git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.expose-services=, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_REPO=https://github.com/ceph/ceph-container.git, name=rhceph, ceph=True, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7)
Feb 01 09:48:18 np0005604215.localdomain beautiful_bose[300490]: 167 167
Feb 01 09:48:18 np0005604215.localdomain podman[300474]: 2026-02-01 09:48:18.185603714 +0000 UTC m=+0.150327087 container attach 9792d942921930fcd1de432405f1a21fa2b2a02635540e3ad19746a59e94a022 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_bose, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, version=7, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, RELEASE=main, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., CEPH_POINT_RELEASE=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, release=1764794109, io.openshift.tags=rhceph ceph, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:48:18 np0005604215.localdomain systemd[1]: libpod-9792d942921930fcd1de432405f1a21fa2b2a02635540e3ad19746a59e94a022.scope: Deactivated successfully.
Feb 01 09:48:18 np0005604215.localdomain podman[300474]: 2026-02-01 09:48:18.19345358 +0000 UTC m=+0.158176933 container died 9792d942921930fcd1de432405f1a21fa2b2a02635540e3ad19746a59e94a022 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_bose, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_CLEAN=True, vcs-type=git, description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, io.openshift.expose-services=, architecture=x86_64, RELEASE=main, name=rhceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, version=7)
Feb 01 09:48:18 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-a606d324a3b8ca9b6901dafddc6753064925ed9a055ed0d5964d48ea316e641b-merged.mount: Deactivated successfully.
Feb 01 09:48:18 np0005604215.localdomain systemd[1]: tmp-crun.7KKSOe.mount: Deactivated successfully.
Feb 01 09:48:18 np0005604215.localdomain podman[300491]: 2026-02-01 09:48:18.274490347 +0000 UTC m=+0.109614743 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:48:18 np0005604215.localdomain podman[300503]: 2026-02-01 09:48:18.305672013 +0000 UTC m=+0.099909588 container remove 9792d942921930fcd1de432405f1a21fa2b2a02635540e3ad19746a59e94a022 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=beautiful_bose, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.buildah.version=1.41.4, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.description=Red Hat Ceph Storage 7, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, release=1764794109, architecture=x86_64, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vendor=Red Hat, Inc., version=7, GIT_BRANCH=main, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, name=rhceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=Red Hat Ceph Storage 7, vcs-type=git)
Feb 01 09:48:18 np0005604215.localdomain systemd[1]: libpod-conmon-9792d942921930fcd1de432405f1a21fa2b2a02635540e3ad19746a59e94a022.scope: Deactivated successfully.
Feb 01 09:48:18 np0005604215.localdomain podman[300491]: 2026-02-01 09:48:18.338766339 +0000 UTC m=+0.173890745 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:48:18 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:48:18 np0005604215.localdomain sudo[300439]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:18 np0005604215.localdomain ceph-mon[298604]: Reconfiguring crash.np0005604215 (monmap changed)...
Feb 01 09:48:18 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon crash.np0005604215 on np0005604215.localdomain
Feb 01 09:48:18 np0005604215.localdomain ceph-mon[298604]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:48:18 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:18 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:18 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.2"} : dispatch
Feb 01 09:48:18 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:18 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.2 (monmap changed)...
Feb 01 09:48:18 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.2 on np0005604215.localdomain
Feb 01 09:48:18 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:18 np0005604215.localdomain sudo[300535]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:48:18 np0005604215.localdomain sudo[300535]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:18 np0005604215.localdomain sudo[300535]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:18 np0005604215.localdomain sudo[300553]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:48:18 np0005604215.localdomain sudo[300553]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:19 np0005604215.localdomain podman[300587]: 
Feb 01 09:48:19 np0005604215.localdomain podman[300587]: 2026-02-01 09:48:19.196612315 +0000 UTC m=+0.078560471 container create 0cd53eb72945342f0bfe76643c49e5bdd455d8abff6f681e5a76c8cd171aaab7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_gagarin, name=rhceph, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, GIT_CLEAN=True, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.k8s.description=Red Hat Ceph Storage 7, version=7, io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>)
Feb 01 09:48:19 np0005604215.localdomain systemd[1]: tmp-crun.7MnEao.mount: Deactivated successfully.
Feb 01 09:48:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-0be70610547c819dc755855de0deb5cf79d8e6abe7149521be6fb591b2e205a3-merged.mount: Deactivated successfully.
Feb 01 09:48:19 np0005604215.localdomain systemd[1]: Started libpod-conmon-0cd53eb72945342f0bfe76643c49e5bdd455d8abff6f681e5a76c8cd171aaab7.scope.
Feb 01 09:48:19 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:48:19 np0005604215.localdomain podman[300587]: 2026-02-01 09:48:19.262410415 +0000 UTC m=+0.144358581 container init 0cd53eb72945342f0bfe76643c49e5bdd455d8abff6f681e5a76c8cd171aaab7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_gagarin, CEPH_POINT_RELEASE=, vcs-type=git, RELEASE=main, vendor=Red Hat, Inc., version=7, name=rhceph, distribution-scope=public, GIT_BRANCH=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.41.4, com.redhat.component=rhceph-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, build-date=2025-12-08T17:28:53Z)
Feb 01 09:48:19 np0005604215.localdomain podman[300587]: 2026-02-01 09:48:19.167155593 +0000 UTC m=+0.049103769 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:48:19 np0005604215.localdomain podman[300587]: 2026-02-01 09:48:19.274552155 +0000 UTC m=+0.156500311 container start 0cd53eb72945342f0bfe76643c49e5bdd455d8abff6f681e5a76c8cd171aaab7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_gagarin, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_CLEAN=True, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., ceph=True, vcs-type=git, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, version=7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, distribution-scope=public, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, architecture=x86_64, io.openshift.tags=rhceph ceph, vendor=Red Hat, Inc.)
Feb 01 09:48:19 np0005604215.localdomain podman[300587]: 2026-02-01 09:48:19.275013599 +0000 UTC m=+0.156961795 container attach 0cd53eb72945342f0bfe76643c49e5bdd455d8abff6f681e5a76c8cd171aaab7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_gagarin, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-type=git, build-date=2025-12-08T17:28:53Z, release=1764794109, ceph=True, com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_BRANCH=main, io.k8s.description=Red Hat Ceph Storage 7, description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, GIT_CLEAN=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, io.openshift.expose-services=, io.openshift.tags=rhceph ceph)
Feb 01 09:48:19 np0005604215.localdomain adoring_gagarin[300602]: 167 167
Feb 01 09:48:19 np0005604215.localdomain systemd[1]: libpod-0cd53eb72945342f0bfe76643c49e5bdd455d8abff6f681e5a76c8cd171aaab7.scope: Deactivated successfully.
Feb 01 09:48:19 np0005604215.localdomain podman[300587]: 2026-02-01 09:48:19.277782276 +0000 UTC m=+0.159730432 container died 0cd53eb72945342f0bfe76643c49e5bdd455d8abff6f681e5a76c8cd171aaab7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_gagarin, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, version=7, build-date=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, ceph=True, GIT_BRANCH=main, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, vendor=Red Hat, Inc., RELEASE=main, description=Red Hat Ceph Storage 7, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, io.openshift.tags=rhceph ceph, distribution-scope=public, architecture=x86_64, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:48:19 np0005604215.localdomain podman[300607]: 2026-02-01 09:48:19.375206776 +0000 UTC m=+0.087081497 container remove 0cd53eb72945342f0bfe76643c49e5bdd455d8abff6f681e5a76c8cd171aaab7 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=adoring_gagarin, name=rhceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., distribution-scope=public, io.openshift.expose-services=, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, io.buildah.version=1.41.4, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_CLEAN=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True)
Feb 01 09:48:19 np0005604215.localdomain systemd[1]: libpod-conmon-0cd53eb72945342f0bfe76643c49e5bdd455d8abff6f681e5a76c8cd171aaab7.scope: Deactivated successfully.
Feb 01 09:48:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "osd.5"} : dispatch
Feb 01 09:48:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:19 np0005604215.localdomain sudo[300553]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:19 np0005604215.localdomain sudo[300630]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:48:19 np0005604215.localdomain sudo[300630]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:19 np0005604215.localdomain sudo[300630]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:19 np0005604215.localdomain sudo[300648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:48:19 np0005604215.localdomain sudo[300648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054722 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:48:20 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e6d37cf46cf6759a7f6915b3c7b149c563d58c54b621d325e6edad63496636b9-merged.mount: Deactivated successfully.
Feb 01 09:48:20 np0005604215.localdomain podman[300683]: 
Feb 01 09:48:20 np0005604215.localdomain podman[300683]: 2026-02-01 09:48:20.236122717 +0000 UTC m=+0.067660609 container create cec86a538359b94622bf82feb9cec310c51d13a1624cbd9cb049bce09a092f0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_dubinsky, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, description=Red Hat Ceph Storage 7, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, CEPH_POINT_RELEASE=, io.buildah.version=1.41.4, release=1764794109, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, RELEASE=main, name=rhceph, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, architecture=x86_64, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, com.redhat.component=rhceph-container, version=7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph, GIT_BRANCH=main, vcs-type=git, build-date=2025-12-08T17:28:53Z)
Feb 01 09:48:20 np0005604215.localdomain systemd[1]: Started libpod-conmon-cec86a538359b94622bf82feb9cec310c51d13a1624cbd9cb049bce09a092f0e.scope.
Feb 01 09:48:20 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:48:20 np0005604215.localdomain podman[300683]: 2026-02-01 09:48:20.298327006 +0000 UTC m=+0.129864918 container init cec86a538359b94622bf82feb9cec310c51d13a1624cbd9cb049bce09a092f0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_dubinsky, io.openshift.tags=rhceph ceph, ceph=True, description=Red Hat Ceph Storage 7, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., com.redhat.component=rhceph-container, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_CLEAN=True, GIT_BRANCH=main, build-date=2025-12-08T17:28:53Z, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, CEPH_POINT_RELEASE=, release=1764794109, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.description=Red Hat Ceph Storage 7, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 09:48:20 np0005604215.localdomain podman[300683]: 2026-02-01 09:48:20.202325809 +0000 UTC m=+0.033863771 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:48:20 np0005604215.localdomain podman[300683]: 2026-02-01 09:48:20.314281494 +0000 UTC m=+0.145819416 container start cec86a538359b94622bf82feb9cec310c51d13a1624cbd9cb049bce09a092f0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_dubinsky, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_CLEAN=True, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, version=7, io.buildah.version=1.41.4, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_BRANCH=main, vendor=Red Hat, Inc., vcs-type=git, distribution-scope=public, build-date=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, ceph=True)
Feb 01 09:48:20 np0005604215.localdomain podman[300683]: 2026-02-01 09:48:20.314566333 +0000 UTC m=+0.146104255 container attach cec86a538359b94622bf82feb9cec310c51d13a1624cbd9cb049bce09a092f0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_dubinsky, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, io.buildah.version=1.41.4, architecture=x86_64, name=rhceph, GIT_BRANCH=main, vcs-type=git, distribution-scope=public, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, ceph=True, CEPH_POINT_RELEASE=, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.expose-services=)
Feb 01 09:48:20 np0005604215.localdomain festive_dubinsky[300698]: 167 167
Feb 01 09:48:20 np0005604215.localdomain systemd[1]: libpod-cec86a538359b94622bf82feb9cec310c51d13a1624cbd9cb049bce09a092f0e.scope: Deactivated successfully.
Feb 01 09:48:20 np0005604215.localdomain podman[300683]: 2026-02-01 09:48:20.318530358 +0000 UTC m=+0.150068300 container died cec86a538359b94622bf82feb9cec310c51d13a1624cbd9cb049bce09a092f0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_dubinsky, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, release=1764794109, io.k8s.description=Red Hat Ceph Storage 7, name=rhceph, version=7, GIT_CLEAN=True, RELEASE=main, vendor=Red Hat, Inc., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.tags=rhceph ceph, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.buildah.version=1.41.4, distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, ceph=True)
Feb 01 09:48:20 np0005604215.localdomain podman[300703]: 2026-02-01 09:48:20.419609382 +0000 UTC m=+0.085705324 container remove cec86a538359b94622bf82feb9cec310c51d13a1624cbd9cb049bce09a092f0e (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=festive_dubinsky, build-date=2025-12-08T17:28:53Z, org.opencontainers.image.created=2025-12-08T17:28:53Z, vcs-type=git, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, RELEASE=main, CEPH_POINT_RELEASE=, architecture=x86_64, name=rhceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, url=https://catalog.redhat.com/en/search?searchType=containers, version=7, distribution-scope=public, io.k8s.description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., GIT_BRANCH=main, io.openshift.expose-services=)
Feb 01 09:48:20 np0005604215.localdomain systemd[1]: libpod-conmon-cec86a538359b94622bf82feb9cec310c51d13a1624cbd9cb049bce09a092f0e.scope: Deactivated successfully.
Feb 01 09:48:20 np0005604215.localdomain sudo[300648]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: Reconfiguring osd.5 (monmap changed)...
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon osd.5 on np0005604215.localdomain
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: from='client.54349 -' entity='client.admin' cmd=[{"prefix": "orch apply", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: Saving service mon spec with placement label:mon
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mds.mds.np0005604215.rwvxvg", "caps": ["mon", "profile mds", "osd", "allow rw tag cephfs *=*", "mds", "allow"]} : dispatch
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get-or-create", "entity": "mgr.np0005604215.uhhqtv", "caps": ["mon", "profile mgr", "osd", "allow *", "mds", "allow *"]} : dispatch
Feb 01 09:48:20 np0005604215.localdomain sudo[300720]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:48:20 np0005604215.localdomain sudo[300720]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:20 np0005604215.localdomain sudo[300720]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:20 np0005604215.localdomain sudo[300738]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:48:20 np0005604215.localdomain sudo[300738]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:21 np0005604215.localdomain podman[300775]: 
Feb 01 09:48:21 np0005604215.localdomain podman[300775]: 2026-02-01 09:48:21.115874909 +0000 UTC m=+0.074317747 container create a73e8520acd295b7a509ea7c68475ec50ebf1429835ef79eb2b1c607dcb108c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_torvalds, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, GIT_CLEAN=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, ceph=True, io.buildah.version=1.41.4, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, GIT_BRANCH=main, architecture=x86_64, vcs-type=git, GIT_REPO=https://github.com/ceph/ceph-container.git, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., name=rhceph, RELEASE=main, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1764794109, version=7, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.openshift.tags=rhceph ceph, io.k8s.description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z)
Feb 01 09:48:21 np0005604215.localdomain systemd[1]: Started libpod-conmon-a73e8520acd295b7a509ea7c68475ec50ebf1429835ef79eb2b1c607dcb108c3.scope.
Feb 01 09:48:21 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:48:21 np0005604215.localdomain podman[300775]: 2026-02-01 09:48:21.176671623 +0000 UTC m=+0.135114471 container init a73e8520acd295b7a509ea7c68475ec50ebf1429835ef79eb2b1c607dcb108c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_torvalds, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, GIT_CLEAN=True, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=Red Hat Ceph Storage 7, version=7, distribution-scope=public, CEPH_POINT_RELEASE=, ceph=True, name=rhceph, GIT_REPO=https://github.com/ceph/ceph-container.git, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.openshift.tags=rhceph ceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, architecture=x86_64, vcs-type=git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., com.redhat.component=rhceph-container, GIT_BRANCH=main)
Feb 01 09:48:21 np0005604215.localdomain podman[300775]: 2026-02-01 09:48:21.08585833 +0000 UTC m=+0.044301218 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:48:21 np0005604215.localdomain podman[300775]: 2026-02-01 09:48:21.187039367 +0000 UTC m=+0.145482195 container start a73e8520acd295b7a509ea7c68475ec50ebf1429835ef79eb2b1c607dcb108c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_torvalds, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, distribution-scope=public, description=Red Hat Ceph Storage 7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, name=rhceph, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, GIT_BRANCH=main, architecture=x86_64, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_REPO=https://github.com/ceph/ceph-container.git, RELEASE=main, vendor=Red Hat, Inc., io.k8s.description=Red Hat Ceph Storage 7, ceph=True, build-date=2025-12-08T17:28:53Z, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, io.buildah.version=1.41.4, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, version=7, CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-type=git)
Feb 01 09:48:21 np0005604215.localdomain podman[300775]: 2026-02-01 09:48:21.187276775 +0000 UTC m=+0.145719603 container attach a73e8520acd295b7a509ea7c68475ec50ebf1429835ef79eb2b1c607dcb108c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_torvalds, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=Red Hat Ceph Storage 7, vendor=Red Hat, Inc., CEPH_POINT_RELEASE=, com.redhat.component=rhceph-container, ceph=True, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, GIT_CLEAN=True, architecture=x86_64, io.buildah.version=1.41.4, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, build-date=2025-12-08T17:28:53Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, name=rhceph, distribution-scope=public, io.openshift.tags=rhceph ceph, release=1764794109, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.openshift.expose-services=, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., RELEASE=main, url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main)
Feb 01 09:48:21 np0005604215.localdomain flamboyant_torvalds[300790]: 167 167
Feb 01 09:48:21 np0005604215.localdomain systemd[1]: libpod-a73e8520acd295b7a509ea7c68475ec50ebf1429835ef79eb2b1c607dcb108c3.scope: Deactivated successfully.
Feb 01 09:48:21 np0005604215.localdomain podman[300775]: 2026-02-01 09:48:21.190847106 +0000 UTC m=+0.149289964 container died a73e8520acd295b7a509ea7c68475ec50ebf1429835ef79eb2b1c607dcb108c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_torvalds, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, io.openshift.tags=rhceph ceph, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=rhceph-container, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_BRANCH=main, name=rhceph, distribution-scope=public, io.buildah.version=1.41.4, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, RELEASE=main, org.opencontainers.image.created=2025-12-08T17:28:53Z, build-date=2025-12-08T17:28:53Z, version=7, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.k8s.description=Red Hat Ceph Storage 7, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., maintainer=Guillaume Abrioux <gabrioux@redhat.com>, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #13. Immutable memtables: 0.
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:48:21.207741) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 3] Flushing memtable with next log file: 13
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301207858, "job": 3, "event": "flush_started", "num_memtables": 1, "num_entries": 13135, "num_deletes": 262, "total_data_size": 27193912, "memory_usage": 28596848, "flush_reason": "Manual Compaction"}
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 3] Level-0 flush table #14: started
Feb 01 09:48:21 np0005604215.localdomain systemd[1]: tmp-crun.QxktcG.mount: Deactivated successfully.
Feb 01 09:48:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e455adc3f6d13a9d9194ee1092d8de849e55318b22d3465b1f2b600ad4c0b1f8-merged.mount: Deactivated successfully.
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301290864, "cf_name": "default", "job": 3, "event": "table_file_creation", "file_number": 14, "file_size": 23674382, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 6, "largest_seqno": 13140, "table_properties": {"data_size": 23605400, "index_size": 37708, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30213, "raw_key_size": 322843, "raw_average_key_size": 26, "raw_value_size": 23399734, "raw_average_value_size": 1938, "num_data_blocks": 1428, "num_entries": 12072, "num_filter_entries": 12072, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 1769939270, "file_creation_time": 1769939301, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 14, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 3] Flush lasted 83168 microseconds, and 39631 cpu microseconds.
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:48:21.290923) [db/flush_job.cc:967] [default] [JOB 3] Level-0 flush table #14: 23674382 bytes OK
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:48:21.290947) [db/memtable_list.cc:519] [default] Level-0 commit table #14 started
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:48:21.293228) [db/memtable_list.cc:722] [default] Level-0 commit table #14: memtable #1 done
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:48:21.293252) EVENT_LOG_v1 {"time_micros": 1769939301293241, "job": 3, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [2, 0, 0, 0, 0, 0, 0], "immutable_memtables": 0}
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:48:21.293271) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: files[2 0 0 0 0 0 0] max score 0.50
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 3] Try to delete WAL files size 27105185, prev total WAL file size 27105185, number of live WAL files 2.
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000009.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:48:21.296419) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6B760031353131' seq:72057594037927935, type:22 .. '6B760031373637' seq:0, type:0; will stop at (end)
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 4] Compacting 2@0 files to L6, score -1.00
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 3 Base level 0, inputs: [14(22MB) 8(1762B)]
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301296517, "job": 4, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [14, 8], "score": -1, "input_data_size": 23676144, "oldest_snapshot_seqno": -1}
Feb 01 09:48:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-59874579fceaaf9680656a20fe5dd803ff018810c1c1a3e307e224656e8951e7-merged.mount: Deactivated successfully.
Feb 01 09:48:21 np0005604215.localdomain podman[300795]: 2026-02-01 09:48:21.312198956 +0000 UTC m=+0.108054654 container remove a73e8520acd295b7a509ea7c68475ec50ebf1429835ef79eb2b1c607dcb108c3 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=flamboyant_torvalds, release=1764794109, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, name=rhceph, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, distribution-scope=public, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, version=7, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., architecture=x86_64, RELEASE=main, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, io.buildah.version=1.41.4, vcs-type=git)
Feb 01 09:48:21 np0005604215.localdomain systemd[1]: libpod-conmon-a73e8520acd295b7a509ea7c68475ec50ebf1429835ef79eb2b1c607dcb108c3.scope: Deactivated successfully.
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 4] Generated table #15: 11818 keys, 23670868 bytes, temperature: kUnknown
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301417926, "cf_name": "default", "job": 4, "event": "table_file_creation", "file_number": 15, "file_size": 23670868, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 23602560, "index_size": 37679, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29573, "raw_key_size": 319018, "raw_average_key_size": 26, "raw_value_size": 23400139, "raw_average_value_size": 1980, "num_data_blocks": 1427, "num_entries": 11818, "num_filter_entries": 11818, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769939301, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 15, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:48:21.420619) [db/compaction/compaction_job.cc:1663] [default] [JOB 4] Compacted 2@0 files to L6 => 23670868 bytes
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:48:21.422205) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.0 rd, 192.9 wr, level 6, files in(2, 0) out(1 +0 blob) MB in(22.6, 0.0 +0.0 blob) out(22.6 +0.0 blob), read-write-amplify(2.0) write-amplify(1.0) OK, records in: 12077, records dropped: 259 output_compression: NoCompression
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:48:21.422262) EVENT_LOG_v1 {"time_micros": 1769939301422247, "job": 4, "event": "compaction_finished", "compaction_time_micros": 122703, "compaction_time_cpu_micros": 42227, "output_level": 6, "num_output_files": 1, "total_output_size": 23670868, "num_input_records": 12077, "num_output_records": 11818, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000014.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301425721, "job": 4, "event": "table_file_deletion", "file_number": 14}
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000008.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939301425780, "job": 4, "event": "table_file_deletion", "file_number": 8}
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:48:21.296340) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:48:21 np0005604215.localdomain sudo[300738]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mds.mds.np0005604215.rwvxvg (monmap changed)...
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mds.mds.np0005604215.rwvxvg on np0005604215.localdomain
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: from='client.54355 -' entity='client.admin' cmd=[{"prefix": "orch ps", "daemon_type": "mon", "daemon_id": "np0005604215", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "mgr services"} : dispatch
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mgr.np0005604215.uhhqtv (monmap changed)...
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mgr.np0005604215.uhhqtv on np0005604215.localdomain
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:48:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:21 np0005604215.localdomain sudo[300812]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:48:21 np0005604215.localdomain sudo[300812]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:21 np0005604215.localdomain sudo[300812]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:21 np0005604215.localdomain sudo[300830]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 _orch deploy --fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:48:21 np0005604215.localdomain sudo[300830]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:22 np0005604215.localdomain podman[300864]: 
Feb 01 09:48:22 np0005604215.localdomain podman[300864]: 2026-02-01 09:48:22.083007217 +0000 UTC m=+0.080220183 container create 3e4c3f6b5aefe3660de5630a94b9c24ecdc867dfed1e86dfe3c6f5a68a884d27 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_vaughan, architecture=x86_64, io.openshift.expose-services=, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, io.buildah.version=1.41.4, vcs-type=git, version=7, RELEASE=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=Red Hat Ceph Storage 7, ceph=True, name=rhceph, build-date=2025-12-08T17:28:53Z, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, vendor=Red Hat, Inc., GIT_BRANCH=main, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:48:22 np0005604215.localdomain systemd[1]: Started libpod-conmon-3e4c3f6b5aefe3660de5630a94b9c24ecdc867dfed1e86dfe3c6f5a68a884d27.scope.
Feb 01 09:48:22 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:48:22 np0005604215.localdomain podman[300864]: 2026-02-01 09:48:22.143590753 +0000 UTC m=+0.140803709 container init 3e4c3f6b5aefe3660de5630a94b9c24ecdc867dfed1e86dfe3c6f5a68a884d27 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_vaughan, io.buildah.version=1.41.4, distribution-scope=public, org.opencontainers.image.created=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., release=1764794109, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.openshift.tags=rhceph ceph, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_BRANCH=main, name=rhceph, version=7, description=Red Hat Ceph Storage 7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, RELEASE=main, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, GIT_CLEAN=True, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, CEPH_POINT_RELEASE=, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:48:22 np0005604215.localdomain podman[300864]: 2026-02-01 09:48:22.052507382 +0000 UTC m=+0.049720378 image pull  registry.redhat.io/rhceph/rhceph-7-rhel9:latest
Feb 01 09:48:22 np0005604215.localdomain podman[300864]: 2026-02-01 09:48:22.152473981 +0000 UTC m=+0.149686937 container start 3e4c3f6b5aefe3660de5630a94b9c24ecdc867dfed1e86dfe3c6f5a68a884d27 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_vaughan, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, distribution-scope=public, build-date=2025-12-08T17:28:53Z, version=7, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, GIT_BRANCH=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=rhceph ceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, io.buildah.version=1.41.4, RELEASE=main, GIT_CLEAN=True, release=1764794109, description=Red Hat Ceph Storage 7, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, name=rhceph, vcs-type=git, architecture=x86_64, io.openshift.expose-services=)
Feb 01 09:48:22 np0005604215.localdomain sleepy_vaughan[300879]: 167 167
Feb 01 09:48:22 np0005604215.localdomain podman[300864]: 2026-02-01 09:48:22.152673748 +0000 UTC m=+0.149886704 container attach 3e4c3f6b5aefe3660de5630a94b9c24ecdc867dfed1e86dfe3c6f5a68a884d27 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_vaughan, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, ceph=True, RELEASE=main, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, version=7, GIT_CLEAN=True, vcs-type=git, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, architecture=x86_64, io.k8s.description=Red Hat Ceph Storage 7, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., description=Red Hat Ceph Storage 7, GIT_BRANCH=main, GIT_REPO=https://github.com/ceph/ceph-container.git, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, com.redhat.component=rhceph-container, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, name=rhceph, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, CEPH_POINT_RELEASE=, io.openshift.expose-services=, distribution-scope=public)
Feb 01 09:48:22 np0005604215.localdomain systemd[1]: libpod-3e4c3f6b5aefe3660de5630a94b9c24ecdc867dfed1e86dfe3c6f5a68a884d27.scope: Deactivated successfully.
Feb 01 09:48:22 np0005604215.localdomain podman[300864]: 2026-02-01 09:48:22.155181085 +0000 UTC m=+0.152394051 container died 3e4c3f6b5aefe3660de5630a94b9c24ecdc867dfed1e86dfe3c6f5a68a884d27 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_vaughan, version=7, io.k8s.description=Red Hat Ceph Storage 7, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, CEPH_POINT_RELEASE=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, description=Red Hat Ceph Storage 7, name=rhceph, ceph=True, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, com.redhat.component=rhceph-container, GIT_BRANCH=main, RELEASE=main, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-12-08T17:28:53Z, vendor=Red Hat, Inc., GIT_CLEAN=True, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.41.4, org.opencontainers.image.created=2025-12-08T17:28:53Z, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.openshift.expose-services=, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_REPO=https://github.com/ceph/ceph-container.git, vcs-type=git, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9)
Feb 01 09:48:22 np0005604215.localdomain systemd[1]: tmp-crun.yTJzvC.mount: Deactivated successfully.
Feb 01 09:48:22 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b37d8b5f095d75d8634c221dfda891575ed2b7e00de3f80dadf8a6761730af67-merged.mount: Deactivated successfully.
Feb 01 09:48:22 np0005604215.localdomain podman[300885]: 2026-02-01 09:48:22.251510522 +0000 UTC m=+0.086044255 container remove 3e4c3f6b5aefe3660de5630a94b9c24ecdc867dfed1e86dfe3c6f5a68a884d27 (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=sleepy_vaughan, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, vcs-type=git, CEPH_POINT_RELEASE=, RELEASE=main, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, GIT_BRANCH=main, description=Red Hat Ceph Storage 7, io.k8s.description=Red Hat Ceph Storage 7, org.opencontainers.image.created=2025-12-08T17:28:53Z, ceph=True, com.redhat.component=rhceph-container, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_REPO=https://github.com/ceph/ceph-container.git, io.buildah.version=1.41.4, io.openshift.tags=rhceph ceph, distribution-scope=public, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, release=1764794109, architecture=x86_64, GIT_CLEAN=True, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=7, build-date=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=rhceph)
Feb 01 09:48:22 np0005604215.localdomain systemd[1]: libpod-conmon-3e4c3f6b5aefe3660de5630a94b9c24ecdc867dfed1e86dfe3c6f5a68a884d27.scope: Deactivated successfully.
Feb 01 09:48:22 np0005604215.localdomain sudo[300830]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:22 np0005604215.localdomain ceph-mon[298604]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:48:22 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mon.np0005604215 (monmap changed)...
Feb 01 09:48:22 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mon.np0005604215 on np0005604215.localdomain
Feb 01 09:48:22 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:22 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:22 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:22 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:48:22 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:22 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:22 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:48:22 np0005604215.localdomain sudo[300902]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:48:22 np0005604215.localdomain sudo[300902]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:48:22 np0005604215.localdomain sudo[300902]: pam_unix(sudo:session): session closed for user root
Feb 01 09:48:23 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:48:23 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:48:23 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mon.np0005604212 (monmap changed)...
Feb 01 09:48:23 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:23 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mon.np0005604212 on np0005604212.localdomain
Feb 01 09:48:23 np0005604215.localdomain ceph-mon[298604]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "auth get", "entity": "mon."} : dispatch
Feb 01 09:48:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config get", "who": "mon", "key": "public_network"} : dispatch
Feb 01 09:48:24 np0005604215.localdomain ceph-mon[298604]: Reconfiguring mon.np0005604213 (monmap changed)...
Feb 01 09:48:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:48:24 np0005604215.localdomain ceph-mon[298604]: Reconfiguring daemon mon.np0005604213 on np0005604213.localdomain
Feb 01 09:48:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:48:24 np0005604215.localdomain podman[300920]: 2026-02-01 09:48:24.90915203 +0000 UTC m=+0.125170879 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:48:24 np0005604215.localdomain podman[300920]: 2026-02-01 09:48:24.94812201 +0000 UTC m=+0.164140869 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:48:24 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:48:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:48:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34373 172.18.0.107:0/1411989763' entity='mgr.np0005604213.caiaeh' 
Feb 01 09:48:25 np0005604215.localdomain ceph-mon[298604]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:28 np0005604215.localdomain ceph-mon[298604]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:48:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:48:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:48:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:48:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:48:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:48:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17788 "" "Go-http-client/1.1"
Feb 01 09:48:30 np0005604215.localdomain ceph-mon[298604]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:48:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:48:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:48:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:48:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:48:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:48:32 np0005604215.localdomain ceph-mon[298604]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:34 np0005604215.localdomain ceph-mon[298604]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:48:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/940957761' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:48:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:48:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/940957761' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:48:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:48:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/940957761' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:48:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/940957761' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:48:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:48:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:48:35 np0005604215.localdomain systemd[1]: tmp-crun.3XTgs7.mount: Deactivated successfully.
Feb 01 09:48:35 np0005604215.localdomain podman[300945]: 2026-02-01 09:48:35.8773622 +0000 UTC m=+0.092483176 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller)
Feb 01 09:48:35 np0005604215.localdomain podman[300946]: 2026-02-01 09:48:35.949501759 +0000 UTC m=+0.161768926 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:48:35 np0005604215.localdomain podman[300946]: 2026-02-01 09:48:35.956983622 +0000 UTC m=+0.169250789 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:48:35 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:48:35 np0005604215.localdomain podman[300945]: 2026-02-01 09:48:35.978911049 +0000 UTC m=+0.194032055 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:48:35 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:48:36 np0005604215.localdomain ceph-mon[298604]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:38 np0005604215.localdomain ceph-mon[298604]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:48:40 np0005604215.localdomain ceph-mon[298604]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:48:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:48:40 np0005604215.localdomain podman[300992]: 2026-02-01 09:48:40.863558439 +0000 UTC m=+0.077409474 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, vcs-type=git, version=9.7, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, release=1769056855, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 01 09:48:40 np0005604215.localdomain podman[300992]: 2026-02-01 09:48:40.875350148 +0000 UTC m=+0.089201183 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, managed_by=edpm_ansible, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_id=openstack_network_exporter, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, distribution-scope=public, release=1769056855, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Feb 01 09:48:40 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:48:40 np0005604215.localdomain systemd[1]: tmp-crun.3KUxVz.mount: Deactivated successfully.
Feb 01 09:48:40 np0005604215.localdomain podman[300993]: 2026-02-01 09:48:40.975146822 +0000 UTC m=+0.185031293 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:48:41 np0005604215.localdomain podman[300993]: 2026-02-01 09:48:41.008890169 +0000 UTC m=+0.218774650 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Feb 01 09:48:41 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:48:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:48:41.765 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:48:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:48:41.765 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:48:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:48:41.766 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:48:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:42.104 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:48:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:42.105 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:48:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:42.105 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:48:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:42.189 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:48:42 np0005604215.localdomain ceph-mon[298604]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:44.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:48:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:44.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:48:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:44.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:48:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:44.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:48:44 np0005604215.localdomain ceph-mon[298604]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:48:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:45.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:48:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:46.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:48:46 np0005604215.localdomain ceph-mon[298604]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:46 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1197085737' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.122 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.122 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.122 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.123 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.123 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:48:47 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1009944486' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:48:47 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2785224664' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:48:47 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2497530275' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:48:47 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:48:47 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1928219287' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.564 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.800 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.802 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12422MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.802 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.803 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.877 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.877 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:48:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:47.897 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:48:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:48:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4292011110' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:48:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:48.345 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.448s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:48:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:48.351 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:48:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:48.378 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:48:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:48.381 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:48:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:48.381 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.579s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:48:48 np0005604215.localdomain ceph-mon[298604]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:48 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1928219287' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:48:48 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/4292011110' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:48:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:48:48 np0005604215.localdomain podman[301074]: 2026-02-01 09:48:48.868397278 +0000 UTC m=+0.083208236 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:48:48 np0005604215.localdomain podman[301074]: 2026-02-01 09:48:48.879598289 +0000 UTC m=+0.094409237 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Feb 01 09:48:48 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:48:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:48:50 np0005604215.localdomain ceph-mon[298604]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:51.384 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:48:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:48:51.385 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:48:52 np0005604215.localdomain ceph-mon[298604]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:54 np0005604215.localdomain ceph-mon[298604]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:48:55 np0005604215.localdomain ceph-mon[298604]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:48:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:48:55 np0005604215.localdomain podman[301094]: 2026-02-01 09:48:55.859325977 +0000 UTC m=+0.075165194 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:48:55 np0005604215.localdomain podman[301094]: 2026-02-01 09:48:55.867472062 +0000 UTC m=+0.083311309 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:48:55 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:48:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.200:0/2545861821' entity='client.admin' cmd={"prefix": "status", "format": "json"} : dispatch
Feb 01 09:48:58 np0005604215.localdomain ceph-mon[298604]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:49:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:49:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:49:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:49:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:49:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17791 "" "Go-http-client/1.1"
Feb 01 09:49:00 np0005604215.localdomain ceph-mon[298604]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:49:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:49:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:49:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:49:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:49:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:49:02 np0005604215.localdomain ceph-mon[298604]: from='client.64166 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:49:02 np0005604215.localdomain ceph-mon[298604]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:49:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:49:04 np0005604215.localdomain ceph-mon[298604]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:06 np0005604215.localdomain ceph-mon[298604]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:06 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.200:0/244522388' entity='client.admin' cmd={"prefix": "config dump", "format": "json"} : dispatch
Feb 01 09:49:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:49:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:49:06 np0005604215.localdomain podman[301117]: 2026-02-01 09:49:06.819039212 +0000 UTC m=+0.081463311 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:49:06 np0005604215.localdomain podman[301117]: 2026-02-01 09:49:06.858705184 +0000 UTC m=+0.121129283 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 01 09:49:06 np0005604215.localdomain systemd[1]: tmp-crun.5OIIA6.mount: Deactivated successfully.
Feb 01 09:49:06 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:49:06 np0005604215.localdomain podman[301118]: 2026-02-01 09:49:06.880859307 +0000 UTC m=+0.141860352 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:49:06 np0005604215.localdomain podman[301118]: 2026-02-01 09:49:06.890634743 +0000 UTC m=+0.151635758 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:49:06 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:49:08 np0005604215.localdomain ceph-mon[298604]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:10 np0005604215.localdomain ceph-mon[298604]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:49:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:49:11 np0005604215.localdomain systemd[1]: tmp-crun.PX1KOe.mount: Deactivated successfully.
Feb 01 09:49:11 np0005604215.localdomain podman[301167]: 2026-02-01 09:49:11.863872057 +0000 UTC m=+0.076979012 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:49:11 np0005604215.localdomain podman[301167]: 2026-02-01 09:49:11.896793707 +0000 UTC m=+0.109900672 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:49:11 np0005604215.localdomain systemd[1]: tmp-crun.4pqjI0.mount: Deactivated successfully.
Feb 01 09:49:11 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:49:11 np0005604215.localdomain podman[301166]: 2026-02-01 09:49:11.919462687 +0000 UTC m=+0.132921453 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2026-01-22T05:09:47Z, version=9.7, managed_by=edpm_ansible, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git)
Feb 01 09:49:11 np0005604215.localdomain podman[301166]: 2026-02-01 09:49:11.930706289 +0000 UTC m=+0.144165045 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, config_id=openstack_network_exporter, architecture=x86_64, vcs-type=git)
Feb 01 09:49:11 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:49:12 np0005604215.localdomain ceph-mon[298604]: from='client.54403 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json"}]: dispatch
Feb 01 09:49:12 np0005604215.localdomain ceph-mon[298604]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:14 np0005604215.localdomain ceph-mon[298604]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:16 np0005604215.localdomain ceph-mon[298604]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:18 np0005604215.localdomain ceph-mon[298604]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.200:0/2521108953' entity='client.admin' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #16. Immutable memtables: 0.
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.507255) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 5] Flushing memtable with next log file: 16
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359507409, "job": 5, "event": "flush_started", "num_memtables": 1, "num_entries": 1027, "num_deletes": 251, "total_data_size": 1358557, "memory_usage": 1388096, "flush_reason": "Manual Compaction"}
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 5] Level-0 flush table #17: started
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359518960, "cf_name": "default", "job": 5, "event": "table_file_creation", "file_number": 17, "file_size": 870764, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 13146, "largest_seqno": 14167, "table_properties": {"data_size": 866270, "index_size": 2093, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1349, "raw_key_size": 10920, "raw_average_key_size": 21, "raw_value_size": 856975, "raw_average_value_size": 1657, "num_data_blocks": 88, "num_entries": 517, "num_filter_entries": 517, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939301, "oldest_key_time": 1769939301, "file_creation_time": 1769939359, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 17, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 5] Flush lasted 11743 microseconds, and 3788 cpu microseconds.
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.519009) [db/flush_job.cc:967] [default] [JOB 5] Level-0 flush table #17: 870764 bytes OK
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.519030) [db/memtable_list.cc:519] [default] Level-0 commit table #17 started
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.521062) [db/memtable_list.cc:722] [default] Level-0 commit table #17: memtable #1 done
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.521088) EVENT_LOG_v1 {"time_micros": 1769939359521081, "job": 5, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.521114) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 5] Try to delete WAL files size 1353349, prev total WAL file size 1353349, number of live WAL files 2.
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000013.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.521966) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131303434' seq:72057594037927935, type:22 .. '7061786F73003131323936' seq:0, type:0; will stop at (end)
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 6] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 5 Base level 0, inputs: [17(850KB)], [15(22MB)]
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359522024, "job": 6, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [17], "files_L6": [15], "score": -1, "input_data_size": 24541632, "oldest_snapshot_seqno": -1}
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 6] Generated table #18: 11809 keys, 20524115 bytes, temperature: kUnknown
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359649138, "cf_name": "default", "job": 6, "event": "table_file_creation", "file_number": 18, "file_size": 20524115, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 20457444, "index_size": 36042, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 29573, "raw_key_size": 319468, "raw_average_key_size": 27, "raw_value_size": 20256736, "raw_average_value_size": 1715, "num_data_blocks": 1356, "num_entries": 11809, "num_filter_entries": 11809, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769939359, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 18, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.649485) [db/compaction/compaction_job.cc:1663] [default] [JOB 6] Compacted 1@0 + 1@6 files to L6 => 20524115 bytes
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.651317) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 193.0 rd, 161.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.8, 22.6 +0.0 blob) out(19.6 +0.0 blob), read-write-amplify(51.8) write-amplify(23.6) OK, records in: 12335, records dropped: 526 output_compression: NoCompression
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.651347) EVENT_LOG_v1 {"time_micros": 1769939359651334, "job": 6, "event": "compaction_finished", "compaction_time_micros": 127184, "compaction_time_cpu_micros": 52026, "output_level": 6, "num_output_files": 1, "total_output_size": 20524115, "num_input_records": 12335, "num_output_records": 11809, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000017.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359651590, "job": 6, "event": "table_file_deletion", "file_number": 17}
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000015.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939359655577, "job": 6, "event": "table_file_deletion", "file_number": 15}
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.521852) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.655608) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.655613) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.655615) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.655618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:49:19 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:49:19.655621) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:49:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:49:19 np0005604215.localdomain podman[301204]: 2026-02-01 09:49:19.861246534 +0000 UTC m=+0.078069654 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true)
Feb 01 09:49:19 np0005604215.localdomain podman[301204]: 2026-02-01 09:49:19.895885269 +0000 UTC m=+0.112708349 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 09:49:19 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:49:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e90 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:20 np0005604215.localdomain ceph-mon[298604]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr fail"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='client.? 172.18.0.200:0/2103452742' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 e91: 6 total, 6 up, 6 in
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr handle_mgr_map Activating!
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr handle_mgr_map I am now activating
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604212"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604213"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon metadata", "id": "np0005604215"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).mds e16 all = 0
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).mds e16 all = 0
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).mds e16 all = 0
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 0} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 1} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 2} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 3} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 4} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd metadata", "id": 5} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mds metadata"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).mds e16 all = 1
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon metadata"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [balancer DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: balancer
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Starting
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [cephadm DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:49:21
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Some PGs (1.000000) are unknown; try again later
Feb 01 09:49:21 np0005604215.localdomain sshd[299283]: pam_unix(sshd:session): session closed for user ceph-admin
Feb 01 09:49:21 np0005604215.localdomain systemd[1]: session-74.scope: Deactivated successfully.
Feb 01 09:49:21 np0005604215.localdomain systemd[1]: session-74.scope: Consumed 10.568s CPU time.
Feb 01 09:49:21 np0005604215.localdomain systemd-logind[761]: Session 74 logged out. Waiting for processes to exit.
Feb 01 09:49:21 np0005604215.localdomain systemd-logind[761]: Removed session 74.
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: cephadm
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [crash DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: crash
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [devicehealth DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: devicehealth
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [iostat DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: iostat
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [devicehealth INFO root] Starting
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [nfs DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: nfs
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [orchestrator DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: orchestrator
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: pg_autoscaler
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [progress DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: progress
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Loading...
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Loaded [<progress.module.GhostEvent object at 0x7f9407318640>, <progress.module.GhostEvent object at 0x7f94073185b0>, <progress.module.GhostEvent object at 0x7f9407318610>, <progress.module.GhostEvent object at 0x7f9407318ca0>, <progress.module.GhostEvent object at 0x7f9407318cd0>, <progress.module.GhostEvent object at 0x7f9407318d00>, <progress.module.GhostEvent object at 0x7f9407318af0>, <progress.module.GhostEvent object at 0x7f9407318b20>, <progress.module.GhostEvent object at 0x7f9407318b50>, <progress.module.GhostEvent object at 0x7f9407318940>, <progress.module.GhostEvent object at 0x7f9407318e80>, <progress.module.GhostEvent object at 0x7f9407318eb0>, <progress.module.GhostEvent object at 0x7f9407318ee0>, <progress.module.GhostEvent object at 0x7f9407318f10>, <progress.module.GhostEvent object at 0x7f9407318f40>, <progress.module.GhostEvent object at 0x7f9407318f70>, <progress.module.GhostEvent object at 0x7f9407318fa0>, <progress.module.GhostEvent object at 0x7f9407318fd0>, <progress.module.GhostEvent object at 0x7f9401adf040>, <progress.module.GhostEvent object at 0x7f9401adf070>, <progress.module.GhostEvent object at 0x7f9401adf0a0>, <progress.module.GhostEvent object at 0x7f9401adf0d0>, <progress.module.GhostEvent object at 0x7f9401adf100>, <progress.module.GhostEvent object at 0x7f9401adf130>, <progress.module.GhostEvent object at 0x7f9401adf160>, <progress.module.GhostEvent object at 0x7f9401adf190>, <progress.module.GhostEvent object at 0x7f9401adf1c0>, <progress.module.GhostEvent object at 0x7f9401adf1f0>, <progress.module.GhostEvent object at 0x7f9401adf220>, <progress.module.GhostEvent object at 0x7f9401adf250>, <progress.module.GhostEvent object at 0x7f9401adf280>, <progress.module.GhostEvent object at 0x7f9401adf2b0>, <progress.module.GhostEvent object at 0x7f9401adf2e0>, <progress.module.GhostEvent object at 0x7f9401adf310>, <progress.module.GhostEvent object at 0x7f9401adf340>, <progress.module.GhostEvent object at 0x7f9401adf370>, <progress.module.GhostEvent object at 0x7f9401adf3a0>, <progress.module.GhostEvent object at 0x7f9401adf3d0>, <progress.module.GhostEvent object at 0x7f9401adf400>, <progress.module.GhostEvent object at 0x7f9401adf430>, <progress.module.GhostEvent object at 0x7f9401adf460>, <progress.module.GhostEvent object at 0x7f9401adf490>, <progress.module.GhostEvent object at 0x7f9401adf4c0>, <progress.module.GhostEvent object at 0x7f9401adf4f0>, <progress.module.GhostEvent object at 0x7f9401adf520>, <progress.module.GhostEvent object at 0x7f9401adf550>, <progress.module.GhostEvent object at 0x7f9401adf580>, <progress.module.GhostEvent object at 0x7f9401adf5b0>, <progress.module.GhostEvent object at 0x7f9401adf5e0>, <progress.module.GhostEvent object at 0x7f9401adf610>] historic events
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Loaded OSDMap, ready.
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] recovery thread starting
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] starting setup
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: rbd_support
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [restful DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: restful
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [status DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: status
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [telemetry DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [restful INFO root] server_addr: :: server_port: 8003
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: telemetry
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [restful WARNING root] server not running: no certificate configured
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [volumes DEBUG root] setting log level based on debug_mgr: INFO (2/5)
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: mgr load Constructed class from module: volumes
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: starting
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] PerfHandler: starting
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_task_task: vms, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:49:21.434+0000 7f93f224a640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:49:21.435+0000 7f93f224a640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:49:21.435+0000 7f93f224a640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:49:21.435+0000 7f93f224a640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:49:21.435+0000 7f93f224a640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:49:21.439+0000 7f93ef244640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:49:21.439+0000 7f93ef244640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:49:21.439+0000 7f93ef244640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:49:21.439+0000 7f93ef244640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:49:21.439+0000 7f93ef244640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_task_task: volumes, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_task_task: images, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_task_task: backups, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TaskHandler: starting
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} v 0)
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: starting
Feb 01 09:49:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] setup complete
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='client.? ' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: Activating manager daemon np0005604215.uhhqtv
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.200:0/2103452742' entity='client.admin' cmd={"prefix": "mgr fail"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: osdmap e91: 6 total, 6 up, 6 in
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='client.? ' entity='client.admin' cmd='[{"prefix": "mgr fail"}]': finished
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: mgrmap e53: np0005604215.uhhqtv(active, starting, since 0.0372602s), standbys: np0005604212.oynhpm, np0005604209.isqrps
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604212"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604213"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata", "id": "np0005604215"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604215.rwvxvg"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604213.jdbvyh"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata", "who": "mds.np0005604212.tkdkxt"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604215.uhhqtv", "id": "np0005604215.uhhqtv"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604212.oynhpm", "id": "np0005604212.oynhpm"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604209.isqrps", "id": "np0005604209.isqrps"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 0} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 1} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 2} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 3} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 4} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata", "id": 5} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mds metadata"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mon metadata"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: Manager daemon np0005604215.uhhqtv is now available
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/mirror_snapshot_schedule"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix":"config rm","who":"mgr","name":"mgr/rbd_support/np0005604215.uhhqtv/trash_purge_schedule"} : dispatch
Feb 01 09:49:21 np0005604215.localdomain sshd[301362]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:49:21 np0005604215.localdomain sshd[301362]: Accepted publickey for ceph-admin from 192.168.122.108 port 60216 ssh2: RSA SHA256:7SxEMMnElUSt0NS9ETz/MqwplC4qUXsjkacm12wdfE0
Feb 01 09:49:21 np0005604215.localdomain systemd-logind[761]: New session 75 of user ceph-admin.
Feb 01 09:49:21 np0005604215.localdomain systemd[1]: Started Session 75 of User ceph-admin.
Feb 01 09:49:21 np0005604215.localdomain sshd[301362]: pam_unix(sshd:session): session opened for user ceph-admin(uid=1002) by (uid=0)
Feb 01 09:49:21 np0005604215.localdomain sudo[301366]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:49:21 np0005604215.localdomain sudo[301366]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:21 np0005604215.localdomain sudo[301366]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:21 np0005604215.localdomain sudo[301384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:49:21 np0005604215.localdomain sudo[301384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v3: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:22 np0005604215.localdomain systemd[1]: tmp-crun.BBFa6K.mount: Deactivated successfully.
Feb 01 09:49:22 np0005604215.localdomain podman[301474]: 2026-02-01 09:49:22.678740978 +0000 UTC m=+0.102799329 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, vcs-type=git, description=Red Hat Ceph Storage 7, ceph=True, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, release=1764794109, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_BRANCH=main, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., org.opencontainers.image.created=2025-12-08T17:28:53Z, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., GIT_CLEAN=True, name=rhceph, io.buildah.version=1.41.4, architecture=x86_64, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, GIT_REPO=https://github.com/ceph/ceph-container.git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, io.openshift.expose-services=, build-date=2025-12-08T17:28:53Z, version=7, url=https://catalog.redhat.com/en/search?searchType=containers, CEPH_POINT_RELEASE=, io.openshift.tags=rhceph ceph)
Feb 01 09:49:22 np0005604215.localdomain podman[301474]: 2026-02-01 09:49:22.783647773 +0000 UTC m=+0.207706104 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, io.openshift.expose-services=, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, io.buildah.version=1.41.4, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, GIT_BRANCH=main, name=rhceph, org.opencontainers.image.created=2025-12-08T17:28:53Z, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., version=7, description=Red Hat Ceph Storage 7, build-date=2025-12-08T17:28:53Z, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, ceph=True, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1764794109, vcs-type=git, io.k8s.description=Red Hat Ceph Storage 7, RELEASE=main, distribution-scope=public, CEPH_POINT_RELEASE=, architecture=x86_64, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, com.redhat.component=rhceph-container, io.openshift.tags=rhceph ceph, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5)
Feb 01 09:49:22 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:49:22] ENGINE Bus STARTING
Feb 01 09:49:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:49:22] ENGINE Bus STARTING
Feb 01 09:49:22 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:49:22] ENGINE Serving on http://172.18.0.108:8765
Feb 01 09:49:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:49:22] ENGINE Serving on http://172.18.0.108:8765
Feb 01 09:49:23 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:49:23] ENGINE Serving on https://172.18.0.108:7150
Feb 01 09:49:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:49:23] ENGINE Serving on https://172.18.0.108:7150
Feb 01 09:49:23 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:49:23] ENGINE Bus STARTED
Feb 01 09:49:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:49:23] ENGINE Bus STARTED
Feb 01 09:49:23 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cherrypy.error] [01/Feb/2026:09:49:23] ENGINE Client ('172.18.0.108', 45716) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:49:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : [01/Feb/2026:09:49:23] ENGINE Client ('172.18.0.108', 45716) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:49:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: mgrmap e54: np0005604215.uhhqtv(active, since 1.10046s), standbys: np0005604212.oynhpm, np0005604209.isqrps
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:49:22] ENGINE Bus STARTING
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:49:22] ENGINE Serving on http://172.18.0.108:8765
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:49:23] ENGINE Serving on https://172.18.0.108:7150
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:49:23] ENGINE Bus STARTED
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: [01/Feb/2026:09:49:23] ENGINE Client ('172.18.0.108', 45716) lost — peer dropped the TLS connection suddenly, during handshake: (6, 'TLS/SSL connection has been closed (EOF) (_ssl.c:1147)')
Feb 01 09:49:23 np0005604215.localdomain sudo[301384]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:49:23 np0005604215.localdomain sudo[301620]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:49:23 np0005604215.localdomain sudo[301620]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:23 np0005604215.localdomain sudo[301620]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:23 np0005604215.localdomain ceph-mgr[278126]: [devicehealth INFO root] Check health
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:49:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:49:23 np0005604215.localdomain sudo[301648]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:49:23 np0005604215.localdomain sudo[301648]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:24 np0005604215.localdomain sudo[301648]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:24 np0005604215.localdomain sudo[301699]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:49:24 np0005604215.localdomain sudo[301699]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:24 np0005604215.localdomain sudo[301699]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: pgmap v4: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: Health check cleared: CEPHADM_STRAY_DAEMON (was: 1 stray daemon(s) not managed by cephadm)
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: Health check cleared: CEPHADM_STRAY_HOST (was: 1 stray host(s) with 1 daemon(s) not managed by cephadm)
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: Cluster is now healthy
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: mgrmap e55: np0005604215.uhhqtv(active, since 2s), standbys: np0005604212.oynhpm, np0005604209.isqrps
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:24 np0005604215.localdomain sudo[301717]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 list-networks
Feb 01 09:49:24 np0005604215.localdomain sudo[301717]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:24 np0005604215.localdomain sudo[301717]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:24 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO root] Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:49:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:49:24 np0005604215.localdomain ceph-mgr[278126]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:49:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:24 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO root] Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:49:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:49:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:49:24 np0005604215.localdomain ceph-mgr[278126]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:49:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO root] Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain sudo[301753]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:49:25 np0005604215.localdomain sudo[301753]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:25 np0005604215.localdomain sudo[301753]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:25 np0005604215.localdomain sudo[301771]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:49:25 np0005604215.localdomain sudo[301771]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:25 np0005604215.localdomain sudo[301771]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:25 np0005604215.localdomain sudo[301789]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:49:25 np0005604215.localdomain sudo[301789]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:25 np0005604215.localdomain sudo[301789]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:25 np0005604215.localdomain sudo[301807]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:49:25 np0005604215.localdomain sudo[301807]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:25 np0005604215.localdomain sudo[301807]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:25 np0005604215.localdomain sudo[301825]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:49:25 np0005604215.localdomain sudo[301825]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:25 np0005604215.localdomain sudo[301825]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:25 np0005604215.localdomain sudo[301859]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:49:25 np0005604215.localdomain sudo[301859]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:25 np0005604215.localdomain sudo[301859]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:25 np0005604215.localdomain sudo[301877]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new
Feb 01 09:49:25 np0005604215.localdomain sudo[301877]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:25 np0005604215.localdomain sudo[301877]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:25 np0005604215.localdomain sudo[301895]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.conf.new /etc/ceph/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain sudo[301895]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:25 np0005604215.localdomain sudo[301895]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain sudo[301913]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:49:25 np0005604215.localdomain sudo[301913]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:25 np0005604215.localdomain sudo[301913]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/etc/ceph/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/etc/ceph/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/etc/ceph/ceph.conf
Feb 01 09:49:25 np0005604215.localdomain sudo[301931]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:49:25 np0005604215.localdomain sudo[301931]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:49:25 np0005604215.localdomain sudo[301931]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:25 np0005604215.localdomain sudo[301950]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:49:25 np0005604215.localdomain sudo[301950]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:25 np0005604215.localdomain sudo[301950]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain podman[301949]: 2026-02-01 09:49:26.00146639 +0000 UTC m=+0.085074824 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:49:26 np0005604215.localdomain sudo[301982]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:49:26 np0005604215.localdomain sudo[301982]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain sudo[301982]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain podman[301949]: 2026-02-01 09:49:26.038639544 +0000 UTC m=+0.122248028 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:49:26 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:49:26 np0005604215.localdomain sudo[302008]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:49:26 np0005604215.localdomain sudo[302008]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain sudo[302008]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain ceph-mgr[278126]: mgr.server handle_open ignoring open from mgr.np0005604213.caiaeh 172.18.0.107:0/309736900; not ready for session (expect reconnect)
Feb 01 09:49:26 np0005604215.localdomain sudo[302042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:49:26 np0005604215.localdomain sudo[302042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain sudo[302042]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain sudo[302060]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new
Feb 01 09:49:26 np0005604215.localdomain sudo[302060]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain sudo[302060]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain sudo[302078]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:49:26 np0005604215.localdomain sudo[302078]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain sudo[302078]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:49:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:49:26 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:49:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:49:26 np0005604215.localdomain sudo[302096]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /etc/ceph
Feb 01 09:49:26 np0005604215.localdomain sudo[302096]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:49:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:49:26 np0005604215.localdomain sudo[302096]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain sudo[302114]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph
Feb 01 09:49:26 np0005604215.localdomain sudo[302114]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain sudo[302114]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain sudo[302132]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:49:26 np0005604215.localdomain sudo[302132]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain sudo[302132]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain sudo[302150]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:49:26 np0005604215.localdomain sudo[302150]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain sudo[302150]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain sudo[302168]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:49:26 np0005604215.localdomain sudo[302168]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain sudo[302168]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain sudo[302202]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:49:26 np0005604215.localdomain sudo[302202]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain sudo[302202]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:26 np0005604215.localdomain sudo[302220]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new
Feb 01 09:49:26 np0005604215.localdomain sudo[302220]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:26 np0005604215.localdomain sudo[302220]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:27 np0005604215.localdomain sudo[302238]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/etc/ceph/ceph.client.admin.keyring.new /etc/ceph/ceph.client.admin.keyring
Feb 01 09:49:27 np0005604215.localdomain sudo[302238]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:27 np0005604215.localdomain sudo[302238]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:27 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:49:27 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:49:27 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:49:27 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:49:27 np0005604215.localdomain sudo[302256]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:49:27 np0005604215.localdomain sudo[302256]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:27 np0005604215.localdomain sudo[302256]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} v 0)
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: pgmap v5: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.conf
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: mgrmap e56: np0005604215.uhhqtv(active, since 4s), standbys: np0005604212.oynhpm, np0005604209.isqrps
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: Standby manager daemon np0005604213.caiaeh started
Feb 01 09:49:27 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO cephadm.serve] Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:49:27 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:49:27 np0005604215.localdomain sudo[302274]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mkdir -p /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config
Feb 01 09:49:27 np0005604215.localdomain sudo[302274]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:27 np0005604215.localdomain sudo[302274]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:27 np0005604215.localdomain sudo[302292]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/touch /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:49:27 np0005604215.localdomain sudo[302292]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:27 np0005604215.localdomain sudo[302292]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:27 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:27 np0005604215.localdomain sudo[302310]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R ceph-admin /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e
Feb 01 09:49:27 np0005604215.localdomain sudo[302310]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:27 np0005604215.localdomain sudo[302310]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:27 np0005604215.localdomain sudo[302328]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 644 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:49:27 np0005604215.localdomain sudo[302328]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:27 np0005604215.localdomain sudo[302328]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:27 np0005604215.localdomain sudo[302362]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chown -R 0:0 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:49:27 np0005604215.localdomain sudo[302362]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:27 np0005604215.localdomain sudo[302362]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:27 np0005604215.localdomain sudo[302380]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/chmod 600 /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new
Feb 01 09:49:27 np0005604215.localdomain sudo[302380]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:27 np0005604215.localdomain sudo[302380]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:27 np0005604215.localdomain sudo[302398]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/mv /tmp/cephadm-33fac0b9-80c7-560f-918a-c92d3021ca1e/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring.new /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:49:27 np0005604215.localdomain sudo[302398]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:27 np0005604215.localdomain sudo[302398]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:49:27 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 0 B/s wr, 20 op/s
Feb 01 09:49:27 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev b203ac22-5257-4a69-8274-fe65fc85b21f (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:49:27 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev b203ac22-5257-4a69-8274-fe65fc85b21f (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:49:27 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event b203ac22-5257-4a69-8274-fe65fc85b21f (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:49:27 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/etc/ceph/ceph.client.admin.keyring
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: Updating np0005604215.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: Updating np0005604212.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: mgrmap e57: np0005604215.uhhqtv(active, since 5s), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "mgr metadata", "who": "np0005604213.caiaeh", "id": "np0005604213.caiaeh"} : dispatch
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: Updating np0005604213.localdomain:/var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/config/ceph.client.admin.keyring
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:49:28 np0005604215.localdomain sudo[302416]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:49:28 np0005604215.localdomain sudo[302416]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:28 np0005604215.localdomain sudo[302416]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:49:28 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 25c0d117-e808-4c72-b186-7a21eb50bff1 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:49:28 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 25c0d117-e808-4c72-b186-7a21eb50bff1 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:49:28 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 25c0d117-e808-4c72-b186-7a21eb50bff1 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:49:28 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:49:28 np0005604215.localdomain sudo[302434]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:49:28 np0005604215.localdomain sudo[302434]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:49:28 np0005604215.localdomain sudo[302434]: pam_unix(sudo:session): session closed for user root
Feb 01 09:49:29 np0005604215.localdomain ceph-mon[298604]: pgmap v6: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:29 np0005604215.localdomain ceph-mon[298604]: pgmap v7: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 0 B/s wr, 20 op/s
Feb 01 09:49:29 np0005604215.localdomain ceph-mon[298604]: Health check failed: 1 stray daemon(s) not managed by cephadm (CEPHADM_STRAY_DAEMON)
Feb 01 09:49:29 np0005604215.localdomain ceph-mon[298604]: Health check failed: 1 stray host(s) with 1 daemon(s) not managed by cephadm (CEPHADM_STRAY_HOST)
Feb 01 09:49:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:49:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:49:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:49:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 15 op/s
Feb 01 09:49:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:49:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:49:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:49:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:49:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:49:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17798 "" "Go-http-client/1.1"
Feb 01 09:49:31 np0005604215.localdomain ceph-mon[298604]: pgmap v8: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 0 B/s wr, 15 op/s
Feb 01 09:49:31 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:49:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:49:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:49:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:49:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:49:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:49:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:49:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:49:31 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 0 B/s wr, 12 op/s
Feb 01 09:49:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:49:32 np0005604215.localdomain ceph-mon[298604]: pgmap v9: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 0 B/s wr, 12 op/s
Feb 01 09:49:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:49:34 np0005604215.localdomain ceph-mon[298604]: pgmap v10: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:49:34 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2611289140' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:49:34 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2611289140' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:49:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:35 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:49:37 np0005604215.localdomain ceph-mon[298604]: pgmap v11: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:49:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:49:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:49:37 np0005604215.localdomain systemd[1]: tmp-crun.waYBBU.mount: Deactivated successfully.
Feb 01 09:49:37 np0005604215.localdomain podman[302452]: 2026-02-01 09:49:37.864665239 +0000 UTC m=+0.077094544 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 01 09:49:37 np0005604215.localdomain podman[302453]: 2026-02-01 09:49:37.884722286 +0000 UTC m=+0.089705229 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:49:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:49:37 np0005604215.localdomain podman[302453]: 2026-02-01 09:49:37.957865586 +0000 UTC m=+0.162848529 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:49:37 np0005604215.localdomain podman[302452]: 2026-02-01 09:49:37.974703103 +0000 UTC m=+0.187132368 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:49:37 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:49:38 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:49:38 np0005604215.localdomain systemd[1]: tmp-crun.KvoChO.mount: Deactivated successfully.
Feb 01 09:49:39 np0005604215.localdomain ceph-mon[298604]: pgmap v12: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 0 B/s wr, 10 op/s
Feb 01 09:49:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:41 np0005604215.localdomain ceph-mon[298604]: pgmap v13: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:49:41.766 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:49:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:49:41.766 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:49:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:49:41.766 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:49:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:42 np0005604215.localdomain ceph-mon[298604]: pgmap v14: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:49:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:49:42 np0005604215.localdomain podman[302500]: 2026-02-01 09:49:42.879130272 +0000 UTC m=+0.090283707 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, vcs-type=git, architecture=x86_64, release=1769056855, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, io.openshift.expose-services=, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:49:42 np0005604215.localdomain podman[302500]: 2026-02-01 09:49:42.919819797 +0000 UTC m=+0.130973282 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, release=1769056855, container_name=openstack_network_exporter)
Feb 01 09:49:42 np0005604215.localdomain systemd[1]: tmp-crun.oEIPpL.mount: Deactivated successfully.
Feb 01 09:49:42 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:49:42 np0005604215.localdomain podman[302501]: 2026-02-01 09:49:42.937326464 +0000 UTC m=+0.144962049 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 01 09:49:42 np0005604215.localdomain podman[302501]: 2026-02-01 09:49:42.943827568 +0000 UTC m=+0.151463193 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 09:49:42 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:49:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:43.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:49:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:43.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:49:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:43.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:49:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:43.115 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:49:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:44.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:49:45 np0005604215.localdomain ceph-mon[298604]: pgmap v15: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:45.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:49:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:46.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:49:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:46.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:49:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:46.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:49:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:46.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:49:47 np0005604215.localdomain ceph-mon[298604]: pgmap v16: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:47 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:48 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2378768146' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.131 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.132 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.132 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.133 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.133 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:49:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:49:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3814061159' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.580 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.788 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.790 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12364MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.790 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.791 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.933 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.934 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:49:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:48.950 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:49:49 np0005604215.localdomain ceph-mon[298604]: pgmap v17: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:49 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/4193886347' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:49:49 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3814061159' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:49:49 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/4268203992' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:49:49 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:49:49 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/865720152' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:49:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:49.424 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.475s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:49:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:49.431 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:49:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:49.468 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:49:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:49.470 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:49:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:49.471 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.680s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:49:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3245776648' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:49:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/865720152' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:49:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:50.467 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:49:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:50.530 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:49:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:49:50.531 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:49:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:49:50 np0005604215.localdomain podman[302584]: 2026-02-01 09:49:50.865863706 +0000 UTC m=+0.081427850 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 09:49:50 np0005604215.localdomain podman[302584]: 2026-02-01 09:49:50.904984931 +0000 UTC m=+0.120549065 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 01 09:49:50 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:49:51 np0005604215.localdomain ceph-mon[298604]: pgmap v18: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:49:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:49:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:49:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:49:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:49:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:49:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:52 np0005604215.localdomain ceph-mon[298604]: pgmap v19: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:54 np0005604215.localdomain ceph-mon[298604]: pgmap v20: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:49:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:56 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:49:56 np0005604215.localdomain podman[302603]: 2026-02-01 09:49:56.863613501 +0000 UTC m=+0.079523270 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:49:56 np0005604215.localdomain podman[302603]: 2026-02-01 09:49:56.872441368 +0000 UTC m=+0.088351127 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:49:56 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:49:57 np0005604215.localdomain ceph-mon[298604]: pgmap v21: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:57 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:59 np0005604215.localdomain ceph-mon[298604]: pgmap v22: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:49:59 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:50:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:50:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:50:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:50:00 np0005604215.localdomain ceph-mon[298604]: overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 09:50:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:50:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17789 "" "Go-http-client/1.1"
Feb 01 09:50:01 np0005604215.localdomain ceph-mon[298604]: pgmap v23: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:50:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:50:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:50:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:50:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:50:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:50:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:02 np0005604215.localdomain ceph-mon[298604]: pgmap v24: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:04 np0005604215.localdomain ceph-mon[298604]: pgmap v25: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:07 np0005604215.localdomain ceph-mon[298604]: pgmap v26: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:07 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:50:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:50:08 np0005604215.localdomain podman[302624]: 2026-02-01 09:50:08.871954962 +0000 UTC m=+0.084484486 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:50:08 np0005604215.localdomain podman[302625]: 2026-02-01 09:50:08.940325972 +0000 UTC m=+0.150313236 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:50:08 np0005604215.localdomain podman[302625]: 2026-02-01 09:50:08.952698898 +0000 UTC m=+0.162686162 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:50:08 np0005604215.localdomain podman[302624]: 2026-02-01 09:50:08.964671622 +0000 UTC m=+0.177201176 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Feb 01 09:50:08 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:50:08 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:50:09 np0005604215.localdomain ceph-mon[298604]: pgmap v27: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:09 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:11 np0005604215.localdomain ceph-mon[298604]: pgmap v28: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:12 np0005604215.localdomain ceph-mon[298604]: pgmap v29: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:50:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:50:13 np0005604215.localdomain podman[302671]: 2026-02-01 09:50:13.865477105 +0000 UTC m=+0.080808880 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, name=ubi9/ubi-minimal, architecture=x86_64, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, vcs-type=git, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 09:50:13 np0005604215.localdomain podman[302671]: 2026-02-01 09:50:13.878069478 +0000 UTC m=+0.093401243 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, distribution-scope=public, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, config_id=openstack_network_exporter, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, release=1769056855, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64)
Feb 01 09:50:13 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:50:13 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:13 np0005604215.localdomain podman[302672]: 2026-02-01 09:50:13.922247364 +0000 UTC m=+0.135601537 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 01 09:50:13 np0005604215.localdomain podman[302672]: 2026-02-01 09:50:13.956521743 +0000 UTC m=+0.169875916 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:50:13 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:50:14 np0005604215.localdomain ceph-mon[298604]: pgmap v30: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:17 np0005604215.localdomain ceph-mon[298604]: pgmap v31: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:19 np0005604215.localdomain ceph-mon[298604]: pgmap v32: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:19 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:21 np0005604215.localdomain ceph-mon[298604]: pgmap v33: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:50:21
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['vms', 'manila_data', 'backups', 'images', 'volumes', '.mgr', 'manila_metadata']
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.0014449417225013959 of space, bias 1.0, pg target 0.2885066972594454 quantized to 32 (current 32)
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.7263051367950866e-06 of space, bias 4.0, pg target 0.0021774090359203426 quantized to 16 (current 16)
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:50:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:50:21 np0005604215.localdomain podman[302707]: 2026-02-01 09:50:21.886994442 +0000 UTC m=+0.096061975 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 01 09:50:21 np0005604215.localdomain podman[302707]: 2026-02-01 09:50:21.89752241 +0000 UTC m=+0.106589913 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Feb 01 09:50:21 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:50:21 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:22 np0005604215.localdomain ceph-mon[298604]: pgmap v34: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:25 np0005604215.localdomain ceph-mon[298604]: pgmap v35: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:27 np0005604215.localdomain ceph-mon[298604]: pgmap v36: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:27 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:50:27 np0005604215.localdomain podman[302726]: 2026-02-01 09:50:27.864503709 +0000 UTC m=+0.080012975 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:50:27 np0005604215.localdomain podman[302726]: 2026-02-01 09:50:27.879900309 +0000 UTC m=+0.095409525 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:50:27 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:50:27 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:28 np0005604215.localdomain sudo[302750]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:50:28 np0005604215.localdomain sudo[302750]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:50:28 np0005604215.localdomain sudo[302750]: pam_unix(sudo:session): session closed for user root
Feb 01 09:50:28 np0005604215.localdomain sudo[302768]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:50:28 np0005604215.localdomain sudo[302768]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:50:29 np0005604215.localdomain ceph-mon[298604]: pgmap v37: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:29 np0005604215.localdomain sudo[302768]: pam_unix(sudo:session): session closed for user root
Feb 01 09:50:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:50:29 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:50:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:50:29 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:50:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:50:29 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 4f8e6ba9-c226-4470-b127-1e620a7c18ec (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:50:29 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 4f8e6ba9-c226-4470-b127-1e620a7c18ec (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:50:29 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 4f8e6ba9-c226-4470-b127-1e620a7c18ec (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:50:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:50:29 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:50:29 np0005604215.localdomain sudo[302819]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:50:29 np0005604215.localdomain sudo[302819]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:50:29 np0005604215.localdomain sudo[302819]: pam_unix(sudo:session): session closed for user root
Feb 01 09:50:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:50:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:50:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:50:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:50:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:50:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:50:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:50:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 153530 "" "Go-http-client/1.1"
Feb 01 09:50:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:50:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 17789 "" "Go-http-client/1.1"
Feb 01 09:50:31 np0005604215.localdomain ceph-mon[298604]: pgmap v38: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:31 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:50:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:50:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:50:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:50:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:50:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:50:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:50:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:50:31 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:50:32 np0005604215.localdomain ceph-mon[298604]: pgmap v39: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:35 np0005604215.localdomain ceph-mon[298604]: pgmap v40: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3877644669' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:50:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3877644669' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:50:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:35 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:37 np0005604215.localdomain ceph-mon[298604]: pgmap v41: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:39 np0005604215.localdomain ceph-mon[298604]: pgmap v42: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:50:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:50:39 np0005604215.localdomain podman[302838]: 2026-02-01 09:50:39.877417195 +0000 UTC m=+0.085065502 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:50:39 np0005604215.localdomain podman[302837]: 2026-02-01 09:50:39.924244575 +0000 UTC m=+0.132310415 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 01 09:50:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:39 np0005604215.localdomain podman[302838]: 2026-02-01 09:50:39.941816122 +0000 UTC m=+0.149464469 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:50:39 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:50:40 np0005604215.localdomain podman[302837]: 2026-02-01 09:50:40.005716164 +0000 UTC m=+0.213781994 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller)
Feb 01 09:50:40 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:50:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:40.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:40.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 01 09:50:41 np0005604215.localdomain ceph-mon[298604]: pgmap v43: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:41.768 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:50:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:41.769 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:50:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:41.769 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:50:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:42 np0005604215.localdomain ceph-mon[298604]: pgmap v44: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:43.131 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:43.132 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:50:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:43.132 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:50:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:43.159 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:50:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:50:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:50:44 np0005604215.localdomain podman[302886]: 2026-02-01 09:50:44.847273752 +0000 UTC m=+0.065793251 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, vcs-type=git, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 01 09:50:44 np0005604215.localdomain podman[302886]: 2026-02-01 09:50:44.859812233 +0000 UTC m=+0.078331782 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, container_name=openstack_network_exporter, release=1769056855, managed_by=edpm_ansible, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 01 09:50:44 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:50:44 np0005604215.localdomain podman[302887]: 2026-02-01 09:50:44.861823656 +0000 UTC m=+0.073508803 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:50:44 np0005604215.localdomain podman[302887]: 2026-02-01 09:50:44.94183621 +0000 UTC m=+0.153521357 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:50:44 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:50:45 np0005604215.localdomain ceph-mon[298604]: pgmap v45: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:45.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:50:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5755 writes, 24K keys, 5755 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5755 writes, 912 syncs, 6.31 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 250 writes, 446 keys, 250 commit groups, 1.0 writes per commit group, ingest: 0.42 MB, 0.00 MB/s
                                                          Interval WAL: 250 writes, 125 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 09:50:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v46: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:46.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:46.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:46.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:47 np0005604215.localdomain ceph-mon[298604]: pgmap v46: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:47.161 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:47 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v47: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.169 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.170 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.170 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.170 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.171 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:50:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:50:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1703289481' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.628 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.845 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.846 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=12355MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.847 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:50:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:48.847 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.058 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.058 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:50:49 np0005604215.localdomain ceph-mon[298604]: pgmap v47: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:49 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3068824305' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:50:49 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1703289481' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.118 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.179 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.179 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.193 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.217 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AVX2,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AMD_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.231 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:50:49 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:50:49 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1467814334' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.689 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.696 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.767 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.771 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:50:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:49.771 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.924s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:50:49 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:49.826 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:50:49 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:49.827 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:50:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v48: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/944887520' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:50:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1467814334' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:50:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:50:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5426 writes, 23K keys, 5426 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5426 writes, 740 syncs, 7.33 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 109 writes, 398 keys, 109 commit groups, 1.0 writes per commit group, ingest: 0.49 MB, 0.00 MB/s
                                                          Interval WAL: 109 writes, 47 syncs, 2.32 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 09:50:51 np0005604215.localdomain ceph-mon[298604]: pgmap v48: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/161490111' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:50:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:50:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:50:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:50:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7f93fd291be0>)]
Feb 01 09:50:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Feb 01 09:50:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:50:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7f93fd291dc0>)]
Feb 01 09:50:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Feb 01 09:50:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:51.772 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v49: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:52.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:52 np0005604215.localdomain ceph-mon[298604]: pgmap v49: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:50:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3403118509' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:50:52 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:52.737 259225 INFO oslo.privsep.daemon [None req-b3c76f87-d3d1-4736-887d-1f3db49b2c96 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpdcrtwudu/privsep.sock']
Feb 01 09:50:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:50:52 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:52.830 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:50:52 np0005604215.localdomain podman[302973]: 2026-02-01 09:50:52.868261973 +0000 UTC m=+0.083648698 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 09:50:52 np0005604215.localdomain podman[302973]: 2026-02-01 09:50:52.907842327 +0000 UTC m=+0.123229032 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0)
Feb 01 09:50:52 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:50:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:53.329 259225 INFO oslo.privsep.daemon [None req-b3c76f87-d3d1-4736-887d-1f3db49b2c96 - - - - - -] Spawned new privsep daemon via rootwrap
Feb 01 09:50:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:53.227 302993 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 01 09:50:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:53.231 302993 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 01 09:50:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:53.235 302993 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 01 09:50:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:53.235 302993 INFO oslo.privsep.daemon [-] privsep daemon running as pid 302993
Feb 01 09:50:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:53.870 259225 INFO oslo.privsep.daemon [None req-b3c76f87-d3d1-4736-887d-1f3db49b2c96 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp8pr7eztf/privsep.sock']
Feb 01 09:50:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v50: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Feb 01 09:50:54 np0005604215.localdomain ceph-mon[298604]: mgrmap e58: np0005604215.uhhqtv(active, since 92s), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:50:54 np0005604215.localdomain ceph-mon[298604]: pgmap v50: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 255 B/s wr, 0 op/s
Feb 01 09:50:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:54.457 259225 INFO oslo.privsep.daemon [None req-b3c76f87-d3d1-4736-887d-1f3db49b2c96 - - - - - -] Spawned new privsep daemon via rootwrap
Feb 01 09:50:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:54.361 303002 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 01 09:50:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:54.366 303002 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 01 09:50:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:54.370 303002 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none
Feb 01 09:50:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:54.371 303002 INFO oslo.privsep.daemon [-] privsep daemon running as pid 303002
Feb 01 09:50:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e91 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:50:55 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:55.310 259225 INFO oslo.privsep.daemon [None req-b3c76f87-d3d1-4736-887d-1f3db49b2c96 - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpoodm2ude/privsep.sock']
Feb 01 09:50:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e92 e92: 6 total, 6 up, 6 in
Feb 01 09:50:55 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:55.916 259225 INFO oslo.privsep.daemon [None req-b3c76f87-d3d1-4736-887d-1f3db49b2c96 - - - - - -] Spawned new privsep daemon via rootwrap
Feb 01 09:50:55 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:55.771 303014 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 01 09:50:55 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:55.777 303014 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 01 09:50:55 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:55.781 303014 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 01 09:50:55 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:55.781 303014 INFO oslo.privsep.daemon [-] privsep daemon running as pid 303014
Feb 01 09:50:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v52: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 307 B/s wr, 0 op/s
Feb 01 09:50:56 np0005604215.localdomain ceph-mon[298604]: osdmap e92: 6 total, 6 up, 6 in
Feb 01 09:50:56 np0005604215.localdomain ceph-mon[298604]: pgmap v52: 177 pgs: 177 active+clean; 105 MiB data, 584 MiB used, 41 GiB / 42 GiB avail; 307 B/s wr, 0 op/s
Feb 01 09:50:57 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:57.239 259225 INFO neutron.agent.linux.ip_lib [None req-b3c76f87-d3d1-4736-887d-1f3db49b2c96 - - - - - -] Device tap76e2ea9b-94 cannot be used as it has no MAC address
Feb 01 09:50:57 np0005604215.localdomain kernel: device tap76e2ea9b-94 entered promiscuous mode
Feb 01 09:50:57 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939457.3422] manager: (tap76e2ea9b-94): new Generic device (/org/freedesktop/NetworkManager/Devices/13)
Feb 01 09:50:57 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:50:57Z|00025|binding|INFO|Claiming lport 76e2ea9b-94ac-478b-8fb0-863a2f63759c for this chassis.
Feb 01 09:50:57 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:50:57Z|00026|binding|INFO|76e2ea9b-94ac-478b-8fb0-863a2f63759c: Claiming unknown
Feb 01 09:50:57 np0005604215.localdomain systemd-udevd[303029]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:50:57 np0005604215.localdomain virtnodedevd[224955]: libvirt version: 11.10.0, package: 2.el9 (builder@centos.org, 2025-12-18-15:09:54, )
Feb 01 09:50:57 np0005604215.localdomain virtnodedevd[224955]: hostname: np0005604215.localdomain
Feb 01 09:50:57 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap76e2ea9b-94: No such device
Feb 01 09:50:57 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap76e2ea9b-94: No such device
Feb 01 09:50:57 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap76e2ea9b-94: No such device
Feb 01 09:50:57 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:50:57Z|00027|binding|INFO|Setting lport 76e2ea9b-94ac-478b-8fb0-863a2f63759c ovn-installed in OVS
Feb 01 09:50:57 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap76e2ea9b-94: No such device
Feb 01 09:50:57 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap76e2ea9b-94: No such device
Feb 01 09:50:57 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap76e2ea9b-94: No such device
Feb 01 09:50:57 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap76e2ea9b-94: No such device
Feb 01 09:50:57 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap76e2ea9b-94: No such device
Feb 01 09:50:57 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v53: 177 pgs: 177 active+clean; 125 MiB data, 645 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Feb 01 09:50:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:58.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:50:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:58.102 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 01 09:50:58 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:50:58Z|00028|binding|INFO|Setting lport 76e2ea9b-94ac-478b-8fb0-863a2f63759c up in Southbound
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.281 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '192.168.199.3/24', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-7fc4cf94-32b4-4b60-bdef-3e96b4aa6c2f', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7fc4cf94-32b4-4b60-bdef-3e96b4aa6c2f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '29b63d789cd547019a15ada42140b6b4', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c2c0b803-4f99-4e3e-ab61-20b4c613d0ad, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=76e2ea9b-94ac-478b-8fb0-863a2f63759c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.283 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 76e2ea9b-94ac-478b-8fb0-863a2f63759c in datapath 7fc4cf94-32b4-4b60-bdef-3e96b4aa6c2f bound to our chassis
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.286 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port 8684fce5-bb3f-4eb0-bf35-2f3948d473ca IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.287 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7fc4cf94-32b4-4b60-bdef-3e96b4aa6c2f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.288 158655 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpejz9ttd8/privsep.sock']
Feb 01 09:50:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:50:58.506 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 01 09:50:58 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:50:58 np0005604215.localdomain podman[303106]: 
Feb 01 09:50:58 np0005604215.localdomain podman[303106]: 2026-02-01 09:50:58.79551548 +0000 UTC m=+0.041609858 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:50:58 np0005604215.localdomain podman[303118]: 2026-02-01 09:50:58.904663562 +0000 UTC m=+0.119649150 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:50:58 np0005604215.localdomain podman[303106]: 2026-02-01 09:50:58.921856169 +0000 UTC m=+0.167950507 container create 3a3e234999fe81c21d45fee4ad4a786614993ad042362467bfd117b0c09b08b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7fc4cf94-32b4-4b60-bdef-3e96b4aa6c2f, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:50:58 np0005604215.localdomain podman[303118]: 2026-02-01 09:50:58.943818052 +0000 UTC m=+0.158803650 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 09:50:58 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.970 158655 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.971 158655 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpejz9ttd8/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.849 303130 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.856 303130 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.859 303130 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.859 303130 INFO oslo.privsep.daemon [-] privsep daemon running as pid 303130
Feb 01 09:50:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:58.974 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[22d5d1a9-1d83-4a1a-b59e-bd0c994b62e7]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:50:58 np0005604215.localdomain systemd[1]: Started libpod-conmon-3a3e234999fe81c21d45fee4ad4a786614993ad042362467bfd117b0c09b08b8.scope.
Feb 01 09:50:58 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:50:59 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1832dca9d10799b404d2490d5c7e80ce7bd78dcf138fd23f5c49d32d39ab97fd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:50:59 np0005604215.localdomain podman[303106]: 2026-02-01 09:50:59.010423409 +0000 UTC m=+0.256517717 container init 3a3e234999fe81c21d45fee4ad4a786614993ad042362467bfd117b0c09b08b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7fc4cf94-32b4-4b60-bdef-3e96b4aa6c2f, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 01 09:50:59 np0005604215.localdomain ceph-mon[298604]: pgmap v53: 177 pgs: 177 active+clean; 125 MiB data, 645 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Feb 01 09:50:59 np0005604215.localdomain podman[303106]: 2026-02-01 09:50:59.019469481 +0000 UTC m=+0.265563809 container start 3a3e234999fe81c21d45fee4ad4a786614993ad042362467bfd117b0c09b08b8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-7fc4cf94-32b4-4b60-bdef-3e96b4aa6c2f, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:50:59 np0005604215.localdomain dnsmasq[303151]: started, version 2.85 cachesize 150
Feb 01 09:50:59 np0005604215.localdomain dnsmasq[303151]: DNS service limited to local subnets
Feb 01 09:50:59 np0005604215.localdomain dnsmasq[303151]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:50:59 np0005604215.localdomain dnsmasq[303151]: warning: no upstream servers configured
Feb 01 09:50:59 np0005604215.localdomain dnsmasq-dhcp[303151]: DHCP, static leases only on 192.168.199.0, lease time 1d
Feb 01 09:50:59 np0005604215.localdomain dnsmasq[303151]: read /var/lib/neutron/dhcp/7fc4cf94-32b4-4b60-bdef-3e96b4aa6c2f/addn_hosts - 0 addresses
Feb 01 09:50:59 np0005604215.localdomain dnsmasq-dhcp[303151]: read /var/lib/neutron/dhcp/7fc4cf94-32b4-4b60-bdef-3e96b4aa6c2f/host
Feb 01 09:50:59 np0005604215.localdomain dnsmasq-dhcp[303151]: read /var/lib/neutron/dhcp/7fc4cf94-32b4-4b60-bdef-3e96b4aa6c2f/opts
Feb 01 09:50:59 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:50:59.370 259225 INFO neutron.agent.dhcp.agent [None req-4160db1a-252e-4d10-b129-8d06cb2b1c89 - - - - - -] DHCP configuration for ports {'0d5f687b-71f8-4b2a-9e1c-1e5f246bb2a3'} is completed
Feb 01 09:50:59 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:59.431 303130 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:50:59 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:59.431 303130 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:50:59 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:59.431 303130 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:50:59 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:50:59.524 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[d917c27f-e2cb-4af4-b58c-205875afdb03]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:50:59 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v54: 177 pgs: 177 active+clean; 125 MiB data, 645 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Feb 01 09:51:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:51:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:51:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 e93: 6 total, 6 up, 6 in
Feb 01 09:51:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:51:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 09:51:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:51:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18268 "" "Go-http-client/1.1"
Feb 01 09:51:01 np0005604215.localdomain ceph-mon[298604]: pgmap v54: 177 pgs: 177 active+clean; 125 MiB data, 645 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Feb 01 09:51:01 np0005604215.localdomain ceph-mon[298604]: osdmap e93: 6 total, 6 up, 6 in
Feb 01 09:51:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:51:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:51:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:51:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:51:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:51:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:51:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v56: 177 pgs: 177 active+clean; 125 MiB data, 645 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 2.6 MiB/s wr, 23 op/s
Feb 01 09:51:02 np0005604215.localdomain ceph-mon[298604]: pgmap v56: 177 pgs: 177 active+clean; 125 MiB data, 645 MiB used, 41 GiB / 42 GiB avail; 16 KiB/s rd, 2.6 MiB/s wr, 23 op/s
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.406 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:51:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:51:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v57: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 5.0 MiB/s wr, 46 op/s
Feb 01 09:51:05 np0005604215.localdomain ceph-mon[298604]: pgmap v57: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 5.0 MiB/s wr, 46 op/s
Feb 01 09:51:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v58: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 4.1 MiB/s wr, 38 op/s
Feb 01 09:51:07 np0005604215.localdomain ceph-mon[298604]: pgmap v58: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 4.1 MiB/s wr, 38 op/s
Feb 01 09:51:07 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v59: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Feb 01 09:51:09 np0005604215.localdomain ceph-mon[298604]: pgmap v59: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Feb 01 09:51:09 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v60: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Feb 01 09:51:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:51:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:51:10 np0005604215.localdomain systemd[1]: tmp-crun.ZTKjpd.mount: Deactivated successfully.
Feb 01 09:51:10 np0005604215.localdomain podman[303154]: 2026-02-01 09:51:10.851711862 +0000 UTC m=+0.074413920 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 01 09:51:10 np0005604215.localdomain podman[303154]: 2026-02-01 09:51:10.922644653 +0000 UTC m=+0.145346721 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:51:10 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:51:10 np0005604215.localdomain podman[303155]: 2026-02-01 09:51:10.993174292 +0000 UTC m=+0.209896703 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:51:11 np0005604215.localdomain podman[303155]: 2026-02-01 09:51:11.005735073 +0000 UTC m=+0.222457504 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:51:11 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:51:11 np0005604215.localdomain ceph-mon[298604]: pgmap v60: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 MiB/s wr, 19 op/s
Feb 01 09:51:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v61: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 1.7 MiB/s wr, 16 op/s
Feb 01 09:51:12 np0005604215.localdomain ceph-mon[298604]: pgmap v61: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 1.7 MiB/s wr, 16 op/s
Feb 01 09:51:13 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v62: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 1.7 MiB/s wr, 15 op/s
Feb 01 09:51:15 np0005604215.localdomain ceph-mon[298604]: pgmap v62: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail; 11 KiB/s rd, 1.7 MiB/s wr, 15 op/s
Feb 01 09:51:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:51:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:51:15 np0005604215.localdomain podman[303201]: 2026-02-01 09:51:15.866682644 +0000 UTC m=+0.073749440 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 01 09:51:15 np0005604215.localdomain podman[303200]: 2026-02-01 09:51:15.924762574 +0000 UTC m=+0.135678120 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, build-date=2026-01-22T05:09:47Z, release=1769056855, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, version=9.7)
Feb 01 09:51:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v63: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:15 np0005604215.localdomain podman[303201]: 2026-02-01 09:51:15.947866304 +0000 UTC m=+0.154933100 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:51:15 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:51:15 np0005604215.localdomain podman[303200]: 2026-02-01 09:51:15.966801555 +0000 UTC m=+0.177717101 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-type=git, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, managed_by=edpm_ansible, name=ubi9/ubi-minimal, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 01 09:51:15 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:51:17 np0005604215.localdomain ceph-mon[298604]: pgmap v63: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v64: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:19 np0005604215.localdomain ceph-mon[298604]: pgmap v64: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:19 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v65: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:21 np0005604215.localdomain ceph-mon[298604]: pgmap v65: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:51:21
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['.mgr', 'images', 'backups', 'volumes', 'manila_data', 'vms', 'manila_metadata']
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.7263051367950866e-06 of space, bias 4.0, pg target 0.002170138888888889 quantized to 16 (current 16)
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:51:21 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v66: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:22 np0005604215.localdomain ceph-mon[298604]: pgmap v66: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:51:23 np0005604215.localdomain podman[303239]: 2026-02-01 09:51:23.862640067 +0000 UTC m=+0.080912814 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:51:23 np0005604215.localdomain podman[303239]: 2026-02-01 09:51:23.875747034 +0000 UTC m=+0.094019792 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, 
tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:51:23 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:51:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v67: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:25 np0005604215.localdomain ceph-mon[298604]: pgmap v67: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v68: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:27 np0005604215.localdomain ceph-mon[298604]: pgmap v68: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:27 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:51:27Z|00029|memory_trim|INFO|Detected inactivity (last active 30017 ms ago): trimming memory
Feb 01 09:51:27 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v69: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:29 np0005604215.localdomain ceph-mon[298604]: pgmap v69: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:29 np0005604215.localdomain sudo[303258]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:51:29 np0005604215.localdomain sudo[303258]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:51:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:51:29 np0005604215.localdomain sudo[303258]: pam_unix(sudo:session): session closed for user root
Feb 01 09:51:29 np0005604215.localdomain sudo[303282]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:51:29 np0005604215.localdomain sudo[303282]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:51:29 np0005604215.localdomain podman[303276]: 2026-02-01 09:51:29.833242606 +0000 UTC m=+0.084880387 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:51:29 np0005604215.localdomain podman[303276]: 2026-02-01 09:51:29.846812249 +0000 UTC m=+0.098450110 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:51:29 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:51:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v70: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:51:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:51:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:51:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 09:51:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:51:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18270 "" "Go-http-client/1.1"
Feb 01 09:51:30 np0005604215.localdomain sudo[303282]: pam_unix(sudo:session): session closed for user root
Feb 01 09:51:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:51:30 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:51:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:51:30 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:51:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:51:30 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 6adb7234-316a-43c5-a8de-c5b6e5f01375 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:51:30 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 6adb7234-316a-43c5-a8de-c5b6e5f01375 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:51:30 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 6adb7234-316a-43c5-a8de-c5b6e5f01375 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:51:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:51:30 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:51:30 np0005604215.localdomain sudo[303350]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:51:30 np0005604215.localdomain sudo[303350]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:51:30 np0005604215.localdomain sudo[303350]: pam_unix(sudo:session): session closed for user root
Feb 01 09:51:31 np0005604215.localdomain ceph-mon[298604]: pgmap v70: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:31 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:51:31 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:51:31 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:51:31 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:51:31 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:51:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:51:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:51:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:51:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:51:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:51:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:51:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:51:31 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v71: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:51:32 np0005604215.localdomain ceph-mon[298604]: pgmap v71: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v72: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:51:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2524954826' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:51:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:51:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2524954826' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:51:35 np0005604215.localdomain ceph-mon[298604]: pgmap v72: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2524954826' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:51:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2524954826' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:51:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:35 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v73: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:37 np0005604215.localdomain ceph-mon[298604]: pgmap v73: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v74: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:39 np0005604215.localdomain ceph-mon[298604]: pgmap v74: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v75: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:41 np0005604215.localdomain ceph-mon[298604]: pgmap v75: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:51:41.769 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:51:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:51:41.770 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:51:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:51:41.770 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:51:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:51:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:51:41 np0005604215.localdomain systemd[1]: tmp-crun.Y4H479.mount: Deactivated successfully.
Feb 01 09:51:41 np0005604215.localdomain podman[303368]: 2026-02-01 09:51:41.870090576 +0000 UTC m=+0.083851165 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:51:41 np0005604215.localdomain podman[303368]: 2026-02-01 09:51:41.914732898 +0000 UTC m=+0.128493457 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Feb 01 09:51:41 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:51:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v76: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:41 np0005604215.localdomain podman[303369]: 2026-02-01 09:51:41.916150621 +0000 UTC m=+0.128223198 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:51:41 np0005604215.localdomain podman[303369]: 2026-02-01 09:51:41.996232938 +0000 UTC m=+0.208305515 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:51:42 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:51:42 np0005604215.localdomain ceph-mon[298604]: pgmap v76: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v77: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:45 np0005604215.localdomain ceph-mon[298604]: pgmap v77: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:45.506 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:51:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:45.507 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:51:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:45.507 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:51:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:45.523 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:51:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v78: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:51:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:51:46 np0005604215.localdomain podman[303414]: 2026-02-01 09:51:46.867425487 +0000 UTC m=+0.078393205 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 01 09:51:46 np0005604215.localdomain podman[303414]: 2026-02-01 09:51:46.878566434 +0000 UTC m=+0.089534182 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, vcs-type=git, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vendor=Red Hat, Inc., distribution-scope=public, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7)
Feb 01 09:51:46 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:51:46 np0005604215.localdomain podman[303415]: 2026-02-01 09:51:46.920619745 +0000 UTC m=+0.130625972 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 01 09:51:46 np0005604215.localdomain podman[303415]: 2026-02-01 09:51:46.951091215 +0000 UTC m=+0.161097462 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:51:46 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:51:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:47.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:51:47 np0005604215.localdomain ceph-mon[298604]: pgmap v78: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:47.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:51:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:47.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:51:47 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v79: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:48.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:51:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:48.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:51:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:48.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:51:49 np0005604215.localdomain ceph-mon[298604]: pgmap v79: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:49 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:51:49.873 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:51:49 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:51:49.874 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:51:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v80: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.116 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.117 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:51:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3611657215' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.139 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.139 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.140 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.140 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.140 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:51:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:51:50 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2994643580' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.592 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.452s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.770 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.772 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11992MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.772 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.773 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.923 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.923 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:51:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:50.984 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:51:51 np0005604215.localdomain ceph-mon[298604]: pgmap v80: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1254538' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:51:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2994643580' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:51:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:51:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3487162458' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:51:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:51.408 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:51:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:51.413 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:51:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:51:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:51:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:51.439 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:51:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:51.441 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:51:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:51.442 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:51:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:51:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:51:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:51:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:51:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v81: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3487162458' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:51:52 np0005604215.localdomain ceph-mon[298604]: pgmap v81: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2242716200' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:51:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:51:53.425 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:51:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v82: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:51:54 np0005604215.localdomain ceph-mon[298604]: pgmap v82: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2886766219' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:51:54 np0005604215.localdomain podman[303496]: 2026-02-01 09:51:54.866375492 +0000 UTC m=+0.080891232 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute)
Feb 01 09:51:54 np0005604215.localdomain podman[303496]: 2026-02-01 09:51:54.876200398 +0000 UTC m=+0.090716138 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 09:51:54 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:51:54.877 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:51:54 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:51:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:51:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v83: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:56 np0005604215.localdomain ceph-mon[298604]: pgmap v83: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v84: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:51:59 np0005604215.localdomain ceph-mon[298604]: pgmap v84: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:52:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:52:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:52:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 09:52:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:52:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18273 "" "Go-http-client/1.1"
Feb 01 09:52:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v85: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:00 np0005604215.localdomain ceph-mon[298604]: pgmap v85: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:52:00 np0005604215.localdomain systemd[1]: tmp-crun.0SPMoT.mount: Deactivated successfully.
Feb 01 09:52:00 np0005604215.localdomain podman[303515]: 2026-02-01 09:52:00.87320184 +0000 UTC m=+0.087208539 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:52:00 np0005604215.localdomain podman[303515]: 2026-02-01 09:52:00.906614432 +0000 UTC m=+0.120621081 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:52:00 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:52:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:52:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:52:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:52:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:52:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:52:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:52:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v86: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:02 np0005604215.localdomain ceph-mon[298604]: pgmap v86: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:02 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:52:02.741 2 INFO neutron.agent.securitygroups_rpc [None req-3ef8af06-8ebd-433a-810c-d499d03d752f 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Security group member updated ['f05aaf36-904c-44ae-a203-34e61744db7d']
Feb 01 09:52:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v87: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:04 np0005604215.localdomain ceph-mon[298604]: pgmap v87: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:04 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:52:04.978 2 INFO neutron.agent.securitygroups_rpc [None req-48e73cc9-05d5-4151-9bdf-df0a5d68c81e 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Security group member updated ['f05aaf36-904c-44ae-a203-34e61744db7d']
Feb 01 09:52:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:05 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:05.497 259225 INFO neutron.agent.linux.ip_lib [None req-e9ae3fea-bfdb-4e14-8ee9-6ee644d63438 - - - - - -] Device tapc8e9dce8-3c cannot be used as it has no MAC address
Feb 01 09:52:05 np0005604215.localdomain kernel: device tapc8e9dce8-3c entered promiscuous mode
Feb 01 09:52:05 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:05Z|00030|binding|INFO|Claiming lport c8e9dce8-3cef-4d4b-8d3c-5d13d0890663 for this chassis.
Feb 01 09:52:05 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:05Z|00031|binding|INFO|c8e9dce8-3cef-4d4b-8d3c-5d13d0890663: Claiming unknown
Feb 01 09:52:05 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939525.5650] manager: (tapc8e9dce8-3c): new Generic device (/org/freedesktop/NetworkManager/Devices/14)
Feb 01 09:52:05 np0005604215.localdomain systemd-udevd[303549]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:52:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:05.584 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-9c0246b2-3507-4017-b8dd-01251187a6c3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c0246b2-3507-4017-b8dd-01251187a6c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ebe5e345d591408fa955b2e811bfaffb', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=722e7a10-7816-489f-9516-bc350daf9fce, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=c8e9dce8-3cef-4d4b-8d3c-5d13d0890663) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:05.589 158655 INFO neutron.agent.ovn.metadata.agent [-] Port c8e9dce8-3cef-4d4b-8d3c-5d13d0890663 in datapath 9c0246b2-3507-4017-b8dd-01251187a6c3 bound to our chassis
Feb 01 09:52:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:05.592 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2f157b64-12ad-48f6-bd1f-788194f131e8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:52:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:05.592 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c0246b2-3507-4017-b8dd-01251187a6c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:52:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:05.593 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[c92a974e-63e1-4f07-8d7c-c9b4c0784913]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:05 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:05Z|00032|binding|INFO|Setting lport c8e9dce8-3cef-4d4b-8d3c-5d13d0890663 ovn-installed in OVS
Feb 01 09:52:05 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:05Z|00033|binding|INFO|Setting lport c8e9dce8-3cef-4d4b-8d3c-5d13d0890663 up in Southbound
Feb 01 09:52:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v88: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:06 np0005604215.localdomain ceph-mon[298604]: pgmap v88: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:06 np0005604215.localdomain podman[303602]: 
Feb 01 09:52:06 np0005604215.localdomain podman[303602]: 2026-02-01 09:52:06.428244297 +0000 UTC m=+0.087317113 container create 23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9c0246b2-3507-4017-b8dd-01251187a6c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 01 09:52:06 np0005604215.localdomain systemd[1]: Started libpod-conmon-23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0.scope.
Feb 01 09:52:06 np0005604215.localdomain systemd[1]: tmp-crun.HVqGBr.mount: Deactivated successfully.
Feb 01 09:52:06 np0005604215.localdomain podman[303602]: 2026-02-01 09:52:06.383828992 +0000 UTC m=+0.042901838 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:52:06 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:52:06 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/53daeb5ecd744789a19f463b75866691ebb76af9c46948b934b77d0920c93713/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:52:06 np0005604215.localdomain podman[303602]: 2026-02-01 09:52:06.515376213 +0000 UTC m=+0.174449039 container init 23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9c0246b2-3507-4017-b8dd-01251187a6c3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:52:06 np0005604215.localdomain podman[303602]: 2026-02-01 09:52:06.523975981 +0000 UTC m=+0.183048807 container start 23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9c0246b2-3507-4017-b8dd-01251187a6c3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 01 09:52:06 np0005604215.localdomain dnsmasq[303619]: started, version 2.85 cachesize 150
Feb 01 09:52:06 np0005604215.localdomain dnsmasq[303619]: DNS service limited to local subnets
Feb 01 09:52:06 np0005604215.localdomain dnsmasq[303619]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:52:06 np0005604215.localdomain dnsmasq[303619]: warning: no upstream servers configured
Feb 01 09:52:06 np0005604215.localdomain dnsmasq-dhcp[303619]: DHCP, static leases only on 19.80.0.0, lease time 1d
Feb 01 09:52:06 np0005604215.localdomain dnsmasq[303619]: read /var/lib/neutron/dhcp/9c0246b2-3507-4017-b8dd-01251187a6c3/addn_hosts - 0 addresses
Feb 01 09:52:06 np0005604215.localdomain dnsmasq-dhcp[303619]: read /var/lib/neutron/dhcp/9c0246b2-3507-4017-b8dd-01251187a6c3/host
Feb 01 09:52:06 np0005604215.localdomain dnsmasq-dhcp[303619]: read /var/lib/neutron/dhcp/9c0246b2-3507-4017-b8dd-01251187a6c3/opts
Feb 01 09:52:06 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:06.567 259225 INFO neutron.agent.dhcp.agent [None req-d86e1c16-84a5-4dd8-a2ec-4aa9b3dd89ba - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:04Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323e62e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032366be0>], id=d16170e5-2dd1-4d5e-a380-5344cdba0aa7, ip_allocation=immediate, mac_address=fa:16:3e:db:2d:9c, name=tempest-subport-491001553, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:52:02Z, description=, dns_domain=, id=9c0246b2-3507-4017-b8dd-01251187a6c3, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-subport_net-910372982, port_security_enabled=True, project_id=ebe5e345d591408fa955b2e811bfaffb, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=95, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=271, status=ACTIVE, subnets=['4d0b3f04-7e3e-474c-9d7e-bf58f363cb51'], tags=[], tenant_id=ebe5e345d591408fa955b2e811bfaffb, updated_at=2026-02-01T09:52:03Z, vlan_transparent=None, network_id=9c0246b2-3507-4017-b8dd-01251187a6c3, port_security_enabled=True, project_id=ebe5e345d591408fa955b2e811bfaffb, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f05aaf36-904c-44ae-a203-34e61744db7d'], standard_attr_id=276, status=DOWN, tags=[], tenant_id=ebe5e345d591408fa955b2e811bfaffb, updated_at=2026-02-01T09:52:04Z on network 9c0246b2-3507-4017-b8dd-01251187a6c3
Feb 01 09:52:06 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:06.674 259225 INFO neutron.agent.dhcp.agent [None req-11cea9ce-82f8-4593-a73b-419f2d4bec5d - - - - - -] DHCP configuration for ports {'8e91955e-c3fb-4309-8605-7dae9ca4cd95'} is completed
Feb 01 09:52:06 np0005604215.localdomain dnsmasq[303619]: read /var/lib/neutron/dhcp/9c0246b2-3507-4017-b8dd-01251187a6c3/addn_hosts - 1 addresses
Feb 01 09:52:06 np0005604215.localdomain dnsmasq-dhcp[303619]: read /var/lib/neutron/dhcp/9c0246b2-3507-4017-b8dd-01251187a6c3/host
Feb 01 09:52:06 np0005604215.localdomain podman[303635]: 2026-02-01 09:52:06.825464659 +0000 UTC m=+0.059147425 container kill 23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9c0246b2-3507-4017-b8dd-01251187a6c3, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:52:06 np0005604215.localdomain dnsmasq-dhcp[303619]: read /var/lib/neutron/dhcp/9c0246b2-3507-4017-b8dd-01251187a6c3/opts
Feb 01 09:52:07 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:07.835 259225 INFO neutron.agent.dhcp.agent [None req-63c1010f-df59-4e27-bee2-d2dcf01f190a - - - - - -] DHCP configuration for ports {'d16170e5-2dd1-4d5e-a380-5344cdba0aa7'} is completed
Feb 01 09:52:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v89: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:08 np0005604215.localdomain ceph-mon[298604]: pgmap v89: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v90: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:10 np0005604215.localdomain ceph-mon[298604]: pgmap v90: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v91: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:12 np0005604215.localdomain ceph-mon[298604]: pgmap v91: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:52:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:52:12 np0005604215.localdomain podman[303657]: 2026-02-01 09:52:12.913003834 +0000 UTC m=+0.118946359 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:52:12 np0005604215.localdomain systemd[1]: tmp-crun.BdsBs5.mount: Deactivated successfully.
Feb 01 09:52:12 np0005604215.localdomain podman[303656]: 2026-02-01 09:52:12.934920498 +0000 UTC m=+0.141237194 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:52:12 np0005604215.localdomain podman[303657]: 2026-02-01 09:52:12.949707468 +0000 UTC m=+0.155649993 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:52:12 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:52:12 np0005604215.localdomain podman[303656]: 2026-02-01 09:52:12.975726829 +0000 UTC m=+0.182043565 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 01 09:52:12 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3624140201' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #19. Immutable memtables: 0.
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.442238) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 7] Flushing memtable with next log file: 19
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533442277, "job": 7, "event": "flush_started", "num_memtables": 1, "num_entries": 2518, "num_deletes": 252, "total_data_size": 6620019, "memory_usage": 6994848, "flush_reason": "Manual Compaction"}
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 7] Level-0 flush table #20: started
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533469189, "cf_name": "default", "job": 7, "event": "table_file_creation", "file_number": 20, "file_size": 4263172, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 14172, "largest_seqno": 16685, "table_properties": {"data_size": 4253747, "index_size": 5866, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2501, "raw_key_size": 21052, "raw_average_key_size": 21, "raw_value_size": 4234297, "raw_average_value_size": 4307, "num_data_blocks": 251, "num_entries": 983, "num_filter_entries": 983, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939360, "oldest_key_time": 1769939360, "file_creation_time": 1769939533, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 20, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 7] Flush lasted 27010 microseconds, and 8590 cpu microseconds.
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.469246) [db/flush_job.cc:967] [default] [JOB 7] Level-0 flush table #20: 4263172 bytes OK
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.469271) [db/memtable_list.cc:519] [default] Level-0 commit table #20 started
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.471144) [db/memtable_list.cc:722] [default] Level-0 commit table #20: memtable #1 done
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.471165) EVENT_LOG_v1 {"time_micros": 1769939533471160, "job": 7, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.471190) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 7] Try to delete WAL files size 6608621, prev total WAL file size 6608621, number of live WAL files 2.
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000016.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.472532) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131323935' seq:72057594037927935, type:22 .. '7061786F73003131353437' seq:0, type:0; will stop at (end)
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 8] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 7 Base level 0, inputs: [20(4163KB)], [18(19MB)]
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533472585, "job": 8, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [20], "files_L6": [18], "score": -1, "input_data_size": 24787287, "oldest_snapshot_seqno": -1}
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 8] Generated table #21: 12258 keys, 21729359 bytes, temperature: kUnknown
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533627180, "cf_name": "default", "job": 8, "event": "table_file_creation", "file_number": 21, "file_size": 21729359, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21659258, "index_size": 38384, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30661, "raw_key_size": 329844, "raw_average_key_size": 26, "raw_value_size": 21450111, "raw_average_value_size": 1749, "num_data_blocks": 1453, "num_entries": 12258, "num_filter_entries": 12258, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769939533, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 21, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.627565) [db/compaction/compaction_job.cc:1663] [default] [JOB 8] Compacted 1@0 + 1@6 files to L6 => 21729359 bytes
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.629397) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 160.2 rd, 140.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(4.1, 19.6 +0.0 blob) out(20.7 +0.0 blob), read-write-amplify(10.9) write-amplify(5.1) OK, records in: 12792, records dropped: 534 output_compression: NoCompression
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.629427) EVENT_LOG_v1 {"time_micros": 1769939533629415, "job": 8, "event": "compaction_finished", "compaction_time_micros": 154747, "compaction_time_cpu_micros": 50939, "output_level": 6, "num_output_files": 1, "total_output_size": 21729359, "num_input_records": 12792, "num_output_records": 12258, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000020.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533630533, "job": 8, "event": "table_file_deletion", "file_number": 20}
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000018.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939533633796, "job": 8, "event": "table_file_deletion", "file_number": 18}
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.472471) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.633980) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.633993) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.633999) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.634004) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:52:13 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:52:13.634011) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:52:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v92: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:14 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2868492340' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:14 np0005604215.localdomain ceph-mon[298604]: pgmap v92: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v93: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:16 np0005604215.localdomain ceph-mon[298604]: pgmap v93: 177 pgs: 177 active+clean; 145 MiB data, 707 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:52:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:52:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:52:17 np0005604215.localdomain podman[303705]: 2026-02-01 09:52:17.867926895 +0000 UTC m=+0.078701915 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., version=9.7, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, architecture=x86_64, release=1769056855)
Feb 01 09:52:17 np0005604215.localdomain podman[303705]: 2026-02-01 09:52:17.911626467 +0000 UTC m=+0.122401437 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, architecture=x86_64, version=9.7, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 09:52:17 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:52:17 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1489238524' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:52:17 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3628873771' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:52:17 np0005604215.localdomain podman[303706]: 2026-02-01 09:52:17.918208293 +0000 UTC m=+0.125979708 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 01 09:52:18 np0005604215.localdomain podman[303706]: 2026-02-01 09:52:18.000608261 +0000 UTC m=+0.208379696 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible)
Feb 01 09:52:18 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:52:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v94: 177 pgs: 177 active+clean; 238 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 3.5 MiB/s wr, 60 op/s
Feb 01 09:52:18 np0005604215.localdomain ceph-mon[298604]: pgmap v94: 177 pgs: 177 active+clean; 238 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 3.5 MiB/s wr, 60 op/s
Feb 01 09:52:18 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2363802296' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:52:18 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/942500886' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:52:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v95: 177 pgs: 177 active+clean; 238 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 3.5 MiB/s wr, 60 op/s
Feb 01 09:52:20 np0005604215.localdomain ceph-mon[298604]: pgmap v95: 177 pgs: 177 active+clean; 238 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 3.5 MiB/s wr, 60 op/s
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:52:21
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['manila_metadata', 'manila_data', 'vms', '.mgr', 'backups', 'volumes', 'images']
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.006291494364182021 of space, bias 1.0, pg target 1.2582988728364042 quantized to 32 (current 32)
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8555772569444443 quantized to 32 (current 32)
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.7263051367950866e-06 of space, bias 4.0, pg target 0.0021628687418574354 quantized to 16 (current 16)
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:52:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:52:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v96: 177 pgs: 177 active+clean; 238 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 3.5 MiB/s wr, 60 op/s
Feb 01 09:52:22 np0005604215.localdomain ceph-mon[298604]: pgmap v96: 177 pgs: 177 active+clean; 238 MiB data, 787 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 3.5 MiB/s wr, 60 op/s
Feb 01 09:52:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v97: 177 pgs: 177 active+clean; 238 MiB data, 839 MiB used, 41 GiB / 42 GiB avail; 4.7 MiB/s rd, 3.6 MiB/s wr, 179 op/s
Feb 01 09:52:24 np0005604215.localdomain ceph-mon[298604]: pgmap v97: 177 pgs: 177 active+clean; 238 MiB data, 839 MiB used, 41 GiB / 42 GiB avail; 4.7 MiB/s rd, 3.6 MiB/s wr, 179 op/s
Feb 01 09:52:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:24.272 274321 DEBUG nova.virt.libvirt.driver [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Creating tmpfile /var/lib/nova/instances/tmpm_4plr78 to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041
Feb 01 09:52:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:24.834 274321 DEBUG nova.compute.manager [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=13312,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm_4plr78',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476
Feb 01 09:52:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:24.860 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:52:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:24.861 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:52:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:24.872 274321 INFO nova.compute.rpcapi [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Feb 01 09:52:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:24.873 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:52:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:25 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:52:25 np0005604215.localdomain podman[303742]: 2026-02-01 09:52:25.867347253 +0000 UTC m=+0.082758120 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:52:25 np0005604215.localdomain podman[303742]: 2026-02-01 09:52:25.876439427 +0000 UTC m=+0.091850284 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:52:25 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:52:25 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:25.963 259225 INFO neutron.agent.linux.ip_lib [None req-f3b93c3d-7229-4877-93ce-2ef790c80c9d - - - - - -] Device tap9ba17182-29 cannot be used as it has no MAC address
Feb 01 09:52:25 np0005604215.localdomain kernel: device tap9ba17182-29 entered promiscuous mode
Feb 01 09:52:25 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939545.9898] manager: (tap9ba17182-29): new Generic device (/org/freedesktop/NetworkManager/Devices/15)
Feb 01 09:52:25 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:25Z|00034|binding|INFO|Claiming lport 9ba17182-297c-4dca-a0cf-d9bfe1422e70 for this chassis.
Feb 01 09:52:25 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:25Z|00035|binding|INFO|9ba17182-297c-4dca-a0cf-d9bfe1422e70: Claiming unknown
Feb 01 09:52:25 np0005604215.localdomain systemd-udevd[303772]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:52:25 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:25.998 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-1a91bc36-a078-4e5e-bd8f-3f791a7ad269', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a91bc36-a078-4e5e-bd8f-3f791a7ad269', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a75b32a03c2b49f0927f81d1bf3f53d7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c10e3803-7396-403d-9d9d-ba485ed9d9b4, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=9ba17182-297c-4dca-a0cf-d9bfe1422e70) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:25 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:25.999 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 9ba17182-297c-4dca-a0cf-d9bfe1422e70 in datapath 1a91bc36-a078-4e5e-bd8f-3f791a7ad269 bound to our chassis
Feb 01 09:52:26 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:26.003 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port f6ad9985-5263-405a-b8c4-55a01e5f2ffe IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:52:26 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:26.003 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a91bc36-a078-4e5e-bd8f-3f791a7ad269, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:52:26 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:26.003 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[5b3ef1b8-8919-4579-8ce9-b1460ef6fa02]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:26 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:26Z|00036|binding|INFO|Setting lport 9ba17182-297c-4dca-a0cf-d9bfe1422e70 ovn-installed in OVS
Feb 01 09:52:26 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:26Z|00037|binding|INFO|Setting lport 9ba17182-297c-4dca-a0cf-d9bfe1422e70 up in Southbound
Feb 01 09:52:26 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9ba17182-29: No such device
Feb 01 09:52:26 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9ba17182-29: No such device
Feb 01 09:52:26 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9ba17182-29: No such device
Feb 01 09:52:26 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9ba17182-29: No such device
Feb 01 09:52:26 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9ba17182-29: No such device
Feb 01 09:52:26 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9ba17182-29: No such device
Feb 01 09:52:26 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9ba17182-29: No such device
Feb 01 09:52:26 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9ba17182-29: No such device
Feb 01 09:52:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v98: 177 pgs: 177 active+clean; 238 MiB data, 839 MiB used, 41 GiB / 42 GiB avail; 4.7 MiB/s rd, 3.6 MiB/s wr, 179 op/s
Feb 01 09:52:26 np0005604215.localdomain ceph-mon[298604]: pgmap v98: 177 pgs: 177 active+clean; 238 MiB data, 839 MiB used, 41 GiB / 42 GiB avail; 4.7 MiB/s rd, 3.6 MiB/s wr, 179 op/s
Feb 01 09:52:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:26.795 274321 DEBUG nova.compute.manager [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=13312,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm_4plr78',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5aefea54-941a-48bf-ad9e-7f13fdfdb4ed',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604
Feb 01 09:52:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:26.824 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Acquiring lock "refresh_cache-5aefea54-941a-48bf-ad9e-7f13fdfdb4ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:52:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:26.825 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Acquired lock "refresh_cache-5aefea54-941a-48bf-ad9e-7f13fdfdb4ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:52:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:26.825 274321 DEBUG nova.network.neutron [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 01 09:52:26 np0005604215.localdomain podman[303844]: 
Feb 01 09:52:26 np0005604215.localdomain podman[303844]: 2026-02-01 09:52:26.856483326 +0000 UTC m=+0.091481872 container create 1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a91bc36-a078-4e5e-bd8f-3f791a7ad269, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 01 09:52:26 np0005604215.localdomain systemd[1]: Started libpod-conmon-1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38.scope.
Feb 01 09:52:26 np0005604215.localdomain podman[303844]: 2026-02-01 09:52:26.81203022 +0000 UTC m=+0.047028746 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:52:26 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:52:26 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/44b6a4c8167cba5b5fbdfbf9820bb6cd4a6fbd5e72379076f4e21cd139706606/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:52:26 np0005604215.localdomain podman[303844]: 2026-02-01 09:52:26.953247842 +0000 UTC m=+0.188246338 container init 1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a91bc36-a078-4e5e-bd8f-3f791a7ad269, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:52:26 np0005604215.localdomain podman[303844]: 2026-02-01 09:52:26.958995871 +0000 UTC m=+0.193994377 container start 1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a91bc36-a078-4e5e-bd8f-3f791a7ad269, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS)
Feb 01 09:52:26 np0005604215.localdomain dnsmasq[303862]: started, version 2.85 cachesize 150
Feb 01 09:52:26 np0005604215.localdomain dnsmasq[303862]: DNS service limited to local subnets
Feb 01 09:52:26 np0005604215.localdomain dnsmasq[303862]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:52:26 np0005604215.localdomain dnsmasq[303862]: warning: no upstream servers configured
Feb 01 09:52:26 np0005604215.localdomain dnsmasq-dhcp[303862]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:52:26 np0005604215.localdomain dnsmasq[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/addn_hosts - 0 addresses
Feb 01 09:52:26 np0005604215.localdomain dnsmasq-dhcp[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/host
Feb 01 09:52:26 np0005604215.localdomain dnsmasq-dhcp[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/opts
Feb 01 09:52:27 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:27.064 259225 INFO neutron.agent.dhcp.agent [None req-3479b879-d760-4820-af00-c8479ea8edce - - - - - -] DHCP configuration for ports {'a095506e-75b5-4165-a26f-acc54923bd6f'} is completed
Feb 01 09:52:27 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:27.159 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:26Z, description=, device_id=f4abece4-f6f1-47bb-a6bc-1160a4cf7739, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003236ca90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00324b4520>], id=37cf06b3-ab28-46e6-8f77-67f52e288c13, ip_allocation=immediate, mac_address=fa:16:3e:26:e3:7c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:52:23Z, description=, dns_domain=, id=1a91bc36-a078-4e5e-bd8f-3f791a7ad269, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-129983686-network, port_security_enabled=True, project_id=a75b32a03c2b49f0927f81d1bf3f53d7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=48951, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=431, status=ACTIVE, subnets=['b8fa8a7c-8ae5-460f-a6c7-4e652af56379'], tags=[], tenant_id=a75b32a03c2b49f0927f81d1bf3f53d7, updated_at=2026-02-01T09:52:24Z, vlan_transparent=None, network_id=1a91bc36-a078-4e5e-bd8f-3f791a7ad269, port_security_enabled=False, project_id=a75b32a03c2b49f0927f81d1bf3f53d7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=455, status=DOWN, tags=[], tenant_id=a75b32a03c2b49f0927f81d1bf3f53d7, updated_at=2026-02-01T09:52:27Z on network 1a91bc36-a078-4e5e-bd8f-3f791a7ad269
Feb 01 09:52:27 np0005604215.localdomain dnsmasq[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/addn_hosts - 1 addresses
Feb 01 09:52:27 np0005604215.localdomain podman[303881]: 2026-02-01 09:52:27.362583052 +0000 UTC m=+0.057736961 container kill 1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a91bc36-a078-4e5e-bd8f-3f791a7ad269, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:52:27 np0005604215.localdomain dnsmasq-dhcp[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/host
Feb 01 09:52:27 np0005604215.localdomain dnsmasq-dhcp[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/opts
Feb 01 09:52:27 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:27.633 259225 INFO neutron.agent.dhcp.agent [None req-130b50b6-1d53-46d0-a999-4088b2e200fe - - - - - -] DHCP configuration for ports {'37cf06b3-ab28-46e6-8f77-67f52e288c13'} is completed
Feb 01 09:52:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:27.974 274321 DEBUG nova.network.neutron [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Updating instance_info_cache with network_info: [{"id": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "address": "fa:16:3e:6e:4d:83", "network": {"id": "01cb494b-1310-460f-acbe-602aefea39c6", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1791362587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "ebe5e345d591408fa955b2e811bfaffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aeb3a2-ba", "ovs_interfaceid": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.039 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Releasing lock "refresh_cache-5aefea54-941a-48bf-ad9e-7f13fdfdb4ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.042 274321 DEBUG nova.virt.libvirt.driver [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=13312,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm_4plr78',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5aefea54-941a-48bf-ad9e-7f13fdfdb4ed',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.043 274321 DEBUG nova.virt.libvirt.driver [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Creating instance directory: /var/lib/nova/instances/5aefea54-941a-48bf-ad9e-7f13fdfdb4ed pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.044 274321 DEBUG nova.virt.libvirt.driver [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Ensure instance console log exists: /var/lib/nova/instances/5aefea54-941a-48bf-ad9e-7f13fdfdb4ed/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.045 274321 DEBUG nova.virt.libvirt.driver [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.046 274321 DEBUG nova.virt.libvirt.vif [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-01T09:52:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-328365138',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005604213.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-328365138',id=7,image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-01T09:52:21Z,launched_on='np0005604213.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005604213.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='ebe5e345d591408fa955b2e811bfaffb',ramdisk_id='',reservation_id='r-hz7zc7vw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1924784790',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1924784790-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-01T09:52:21Z,user_data=None,user_id='336655b6a22d4371b0a5cd24b959dc9a',uuid=5aefea54-941a-48bf-ad9e-7f13fdfdb4ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "address": "fa:16:3e:6e:4d:83", "network": {"id": "01cb494b-1310-460f-acbe-602aefea39c6", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1791362587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "ebe5e345d591408fa955b2e811bfaffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap96aeb3a2-ba", "ovs_interfaceid": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.047 274321 DEBUG nova.network.os_vif_util [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Converting VIF {"id": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "address": "fa:16:3e:6e:4d:83", "network": {"id": "01cb494b-1310-460f-acbe-602aefea39c6", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1791362587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "ebe5e345d591408fa955b2e811bfaffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap96aeb3a2-ba", "ovs_interfaceid": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.048 274321 DEBUG nova.network.os_vif_util [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:4d:83,bridge_name='br-int',has_traffic_filtering=True,id=96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa,network=Network(01cb494b-1310-460f-acbe-602aefea39c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96aeb3a2-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.049 274321 DEBUG os_vif [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:4d:83,bridge_name='br-int',has_traffic_filtering=True,id=96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa,network=Network(01cb494b-1310-460f-acbe-602aefea39c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96aeb3a2-ba') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.099 274321 DEBUG ovsdbapp.backend.ovs_idl [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.099 274321 DEBUG ovsdbapp.backend.ovs_idl [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.100 274321 DEBUG ovsdbapp.backend.ovs_idl [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.101 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.101 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [POLLOUT] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.102 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.103 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.105 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.108 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.130 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.130 274321 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.130 274321 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.131 274321 INFO oslo.privsep.daemon [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmp2vvhpzoa/privsep.sock']
Feb 01 09:52:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v99: 177 pgs: 177 active+clean; 238 MiB data, 839 MiB used, 41 GiB / 42 GiB avail; 5.6 MiB/s rd, 3.6 MiB/s wr, 208 op/s
Feb 01 09:52:28 np0005604215.localdomain ceph-mon[298604]: pgmap v99: 177 pgs: 177 active+clean; 238 MiB data, 839 MiB used, 41 GiB / 42 GiB avail; 5.6 MiB/s rd, 3.6 MiB/s wr, 208 op/s
Feb 01 09:52:28 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:28.431 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:26Z, description=, device_id=f4abece4-f6f1-47bb-a6bc-1160a4cf7739, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003236f760>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323351c0>], id=37cf06b3-ab28-46e6-8f77-67f52e288c13, ip_allocation=immediate, mac_address=fa:16:3e:26:e3:7c, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:52:23Z, description=, dns_domain=, id=1a91bc36-a078-4e5e-bd8f-3f791a7ad269, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServerGroupTestJSON-129983686-network, port_security_enabled=True, project_id=a75b32a03c2b49f0927f81d1bf3f53d7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=48951, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=431, status=ACTIVE, subnets=['b8fa8a7c-8ae5-460f-a6c7-4e652af56379'], tags=[], tenant_id=a75b32a03c2b49f0927f81d1bf3f53d7, updated_at=2026-02-01T09:52:24Z, vlan_transparent=None, network_id=1a91bc36-a078-4e5e-bd8f-3f791a7ad269, port_security_enabled=False, project_id=a75b32a03c2b49f0927f81d1bf3f53d7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=455, status=DOWN, tags=[], tenant_id=a75b32a03c2b49f0927f81d1bf3f53d7, updated_at=2026-02-01T09:52:27Z on network 1a91bc36-a078-4e5e-bd8f-3f791a7ad269
Feb 01 09:52:28 np0005604215.localdomain systemd[1]: tmp-crun.RMAzWF.mount: Deactivated successfully.
Feb 01 09:52:28 np0005604215.localdomain dnsmasq[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/addn_hosts - 1 addresses
Feb 01 09:52:28 np0005604215.localdomain dnsmasq-dhcp[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/host
Feb 01 09:52:28 np0005604215.localdomain dnsmasq-dhcp[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/opts
Feb 01 09:52:28 np0005604215.localdomain podman[303920]: 2026-02-01 09:52:28.657751104 +0000 UTC m=+0.071134058 container kill 1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a91bc36-a078-4e5e-bd8f-3f791a7ad269, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.731 274321 INFO oslo.privsep.daemon [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Spawned new privsep daemon via rootwrap
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.629 303931 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.635 303931 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.638 303931 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none
Feb 01 09:52:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:28.638 303931 INFO oslo.privsep.daemon [-] privsep daemon running as pid 303931
Feb 01 09:52:28 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:28.908 259225 INFO neutron.agent.dhcp.agent [None req-467565a1-31c4-4904-a19f-829412b5f22d - - - - - -] DHCP configuration for ports {'37cf06b3-ab28-46e6-8f77-67f52e288c13'} is completed
Feb 01 09:52:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:29.014 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:29.014 274321 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap96aeb3a2-ba, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:29.015 274321 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap96aeb3a2-ba, col_values=(('external_ids', {'iface-id': '96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6e:4d:83', 'vm-uuid': '5aefea54-941a-48bf-ad9e-7f13fdfdb4ed'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:29.068 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:29.071 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 01 09:52:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:29.074 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:29.076 274321 INFO os_vif [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:4d:83,bridge_name='br-int',has_traffic_filtering=True,id=96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa,network=Network(01cb494b-1310-460f-acbe-602aefea39c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96aeb3a2-ba')
Feb 01 09:52:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:29.077 274321 DEBUG nova.virt.libvirt.driver [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954
Feb 01 09:52:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:29.078 274321 DEBUG nova.compute.manager [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm_4plr78',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5aefea54-941a-48bf-ad9e-7f13fdfdb4ed',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668
Feb 01 09:52:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:52:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:52:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:52:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159002 "" "Go-http-client/1.1"
Feb 01 09:52:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:52:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19215 "" "Go-http-client/1.1"
Feb 01 09:52:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e93 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v100: 177 pgs: 177 active+clean; 238 MiB data, 839 MiB used, 41 GiB / 42 GiB avail; 3.8 MiB/s rd, 25 KiB/s wr, 147 op/s
Feb 01 09:52:30 np0005604215.localdomain ceph-mon[298604]: pgmap v100: 177 pgs: 177 active+clean; 238 MiB data, 839 MiB used, 41 GiB / 42 GiB avail; 3.8 MiB/s rd, 25 KiB/s wr, 147 op/s
Feb 01 09:52:30 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:30Z|00038|ovn_bfd|INFO|Enabled BFD on interface ovn-2186fb-0
Feb 01 09:52:30 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:30Z|00039|ovn_bfd|INFO|Enabled BFD on interface ovn-e1cc33-0
Feb 01 09:52:30 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:30Z|00040|ovn_bfd|INFO|Enabled BFD on interface ovn-45aa31-0
Feb 01 09:52:30 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:30.610 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:30 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:30.618 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:30 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:30.621 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:30 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:30.697 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:30 np0005604215.localdomain sudo[303949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:52:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:52:30 np0005604215.localdomain sudo[303949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:52:30 np0005604215.localdomain sudo[303949]: pam_unix(sudo:session): session closed for user root
Feb 01 09:52:31 np0005604215.localdomain sudo[303973]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:52:31 np0005604215.localdomain sudo[303973]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:52:31 np0005604215.localdomain podman[303966]: 2026-02-01 09:52:31.051125677 +0000 UTC m=+0.084534877 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:52:31 np0005604215.localdomain podman[303966]: 2026-02-01 09:52:31.060329824 +0000 UTC m=+0.093739014 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:52:31 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:52:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:31.194 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:31.469 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:52:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:52:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:52:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:52:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:52:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:52:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:31.590 274321 DEBUG nova.network.neutron [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Port 96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa updated with migration profile {'migrating_to': 'np0005604215.localdomain'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354
Feb 01 09:52:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:31.593 274321 DEBUG nova.compute.manager [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=13312,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmpm_4plr78',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='5aefea54-941a-48bf-ad9e-7f13fdfdb4ed',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723
Feb 01 09:52:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:31.612 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:31 np0005604215.localdomain sudo[303973]: pam_unix(sudo:session): session closed for user root
Feb 01 09:52:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:31.689 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:52:31 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:52:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:52:31 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:52:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:52:31 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 688c4493-3079-406c-84fe-5a7e2c84e7cc (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:52:31 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 688c4493-3079-406c-84fe-5a7e2c84e7cc (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:52:31 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 688c4493-3079-406c-84fe-5a7e2c84e7cc (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:52:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:52:31 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:52:31 np0005604215.localdomain sshd[304039]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:52:31 np0005604215.localdomain sshd[304039]: Accepted publickey for nova from 172.17.0.107 port 55008 ssh2: ECDSA SHA256:ro0RZDXj08pmWDFvYIfkoVUhow0JbR0p/CQtyJtA2KY
Feb 01 09:52:31 np0005604215.localdomain systemd[1]: Created slice User Slice of UID 42436.
Feb 01 09:52:31 np0005604215.localdomain systemd[1]: Starting User Runtime Directory /run/user/42436...
Feb 01 09:52:31 np0005604215.localdomain systemd-logind[761]: New session 76 of user nova.
Feb 01 09:52:31 np0005604215.localdomain systemd[1]: Finished User Runtime Directory /run/user/42436.
Feb 01 09:52:31 np0005604215.localdomain systemd[1]: Starting User Manager for UID 42436...
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: pam_unix(systemd-user:session): session opened for user nova(uid=42436) by (uid=0)
Feb 01 09:52:32 np0005604215.localdomain sudo[304045]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:52:32 np0005604215.localdomain sudo[304045]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:52:32 np0005604215.localdomain sudo[304045]: pam_unix(sudo:session): session closed for user root
Feb 01 09:52:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v101: 177 pgs: 177 active+clean; 238 MiB data, 839 MiB used, 41 GiB / 42 GiB avail; 3.8 MiB/s rd, 25 KiB/s wr, 147 op/s
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Queued start job for default target Main User Target.
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Created slice User Application Slice.
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Started Mark boot as successful after the user session has run 2 minutes.
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Started Daily Cleanup of User's Temporary Directories.
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Reached target Paths.
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Reached target Timers.
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Starting D-Bus User Message Bus Socket...
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Starting Create User's Volatile Files and Directories...
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Listening on D-Bus User Message Bus Socket.
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Reached target Sockets.
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Finished Create User's Volatile Files and Directories.
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Reached target Basic System.
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Reached target Main User Target.
Feb 01 09:52:32 np0005604215.localdomain systemd[304043]: Startup finished in 150ms.
Feb 01 09:52:32 np0005604215.localdomain systemd[1]: Started User Manager for UID 42436.
Feb 01 09:52:32 np0005604215.localdomain systemd[1]: Started Session 76 of User nova.
Feb 01 09:52:32 np0005604215.localdomain sshd[304039]: pam_unix(sshd:session): session opened for user nova(uid=42436) by (uid=0)
Feb 01 09:52:32 np0005604215.localdomain systemd[1]: Started libvirt secret daemon.
Feb 01 09:52:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:52:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:52:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:52:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:52:32 np0005604215.localdomain ceph-mon[298604]: pgmap v101: 177 pgs: 177 active+clean; 238 MiB data, 839 MiB used, 41 GiB / 42 GiB avail; 3.8 MiB/s rd, 25 KiB/s wr, 147 op/s
Feb 01 09:52:32 np0005604215.localdomain kernel: tun: Universal TUN/TAP device driver, 1.6
Feb 01 09:52:32 np0005604215.localdomain kernel: device tap96aeb3a2-ba entered promiscuous mode
Feb 01 09:52:32 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939552.4201] manager: (tap96aeb3a2-ba): new Tun device (/org/freedesktop/NetworkManager/Devices/16)
Feb 01 09:52:32 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:32Z|00041|binding|INFO|Claiming lport 96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa for this additional chassis.
Feb 01 09:52:32 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:32Z|00042|binding|INFO|96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa: Claiming fa:16:3e:6e:4d:83 10.100.0.11
Feb 01 09:52:32 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:32Z|00043|binding|INFO|Claiming lport d16170e5-2dd1-4d5e-a380-5344cdba0aa7 for this additional chassis.
Feb 01 09:52:32 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:32Z|00044|binding|INFO|d16170e5-2dd1-4d5e-a380-5344cdba0aa7: Claiming fa:16:3e:db:2d:9c 19.80.0.33
Feb 01 09:52:32 np0005604215.localdomain systemd-udevd[304112]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:52:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:32.424 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:32.432 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:32 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939552.4437] device (tap96aeb3a2-ba): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 01 09:52:32 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939552.4446] device (tap96aeb3a2-ba): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Feb 01 09:52:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:32.480 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:32 np0005604215.localdomain systemd-machined[202466]: New machine qemu-1-instance-00000007.
Feb 01 09:52:32 np0005604215.localdomain systemd[1]: Started Virtual Machine qemu-1-instance-00000007.
Feb 01 09:52:32 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:32Z|00045|binding|INFO|Setting lport 96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa ovn-installed in OVS
Feb 01 09:52:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:32.497 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:32.812 274321 DEBUG nova.virt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Emitting event <LifecycleEvent: 1769939552.8116102, 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 01 09:52:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:32.812 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] VM Started (Lifecycle Event)
Feb 01 09:52:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:33.001 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:52:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:33.032 274321 DEBUG nova.virt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Emitting event <LifecycleEvent: 1769939553.0323696, 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 01 09:52:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:33.033 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] VM Resumed (Lifecycle Event)
Feb 01 09:52:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:33.068 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:52:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:33.074 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 01 09:52:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:33.101 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] During the sync_power process the instance has moved from host np0005604213.localdomain to host np0005604215.localdomain
Feb 01 09:52:33 np0005604215.localdomain dnsmasq[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/addn_hosts - 0 addresses
Feb 01 09:52:33 np0005604215.localdomain dnsmasq-dhcp[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/host
Feb 01 09:52:33 np0005604215.localdomain dnsmasq-dhcp[303862]: read /var/lib/neutron/dhcp/1a91bc36-a078-4e5e-bd8f-3f791a7ad269/opts
Feb 01 09:52:33 np0005604215.localdomain podman[304182]: 2026-02-01 09:52:33.283624016 +0000 UTC m=+0.062927662 container kill 1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a91bc36-a078-4e5e-bd8f-3f791a7ad269, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:52:33 np0005604215.localdomain sshd[304076]: Received disconnect from 172.17.0.107 port 55008:11: disconnected by user
Feb 01 09:52:33 np0005604215.localdomain sshd[304076]: Disconnected from user nova 172.17.0.107 port 55008
Feb 01 09:52:33 np0005604215.localdomain sshd[304039]: pam_unix(sshd:session): session closed for user nova
Feb 01 09:52:33 np0005604215.localdomain systemd[1]: session-76.scope: Deactivated successfully.
Feb 01 09:52:33 np0005604215.localdomain systemd-logind[761]: Session 76 logged out. Waiting for processes to exit.
Feb 01 09:52:33 np0005604215.localdomain systemd-logind[761]: Removed session 76.
Feb 01 09:52:33 np0005604215.localdomain kernel: device tap9ba17182-29 left promiscuous mode
Feb 01 09:52:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:33.693 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:33 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:33Z|00046|binding|INFO|Releasing lport 9ba17182-297c-4dca-a0cf-d9bfe1422e70 from this chassis (sb_readonly=0)
Feb 01 09:52:33 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:33Z|00047|binding|INFO|Setting lport 9ba17182-297c-4dca-a0cf-d9bfe1422e70 down in Southbound
Feb 01 09:52:33 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:33.704 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-1a91bc36-a078-4e5e-bd8f-3f791a7ad269', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1a91bc36-a078-4e5e-bd8f-3f791a7ad269', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'a75b32a03c2b49f0927f81d1bf3f53d7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c10e3803-7396-403d-9d9d-ba485ed9d9b4, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=9ba17182-297c-4dca-a0cf-d9bfe1422e70) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:33 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:33.706 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 9ba17182-297c-4dca-a0cf-d9bfe1422e70 in datapath 1a91bc36-a078-4e5e-bd8f-3f791a7ad269 unbound from our chassis
Feb 01 09:52:33 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:33.710 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1a91bc36-a078-4e5e-bd8f-3f791a7ad269, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:52:33 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:33.711 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ae1428b3-8d02-4a0d-b9bd-5425f08ed35c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:33.722 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e94 e94: 6 total, 6 up, 6 in
Feb 01 09:52:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:34.104 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v103: 177 pgs: 177 active+clean; 292 MiB data, 987 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 5.0 MiB/s wr, 164 op/s
Feb 01 09:52:34 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:34Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6e:4d:83 10.100.0.11
Feb 01 09:52:34 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:34Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6e:4d:83 10.100.0.11
Feb 01 09:52:34 np0005604215.localdomain snmpd[67757]: empty variable list in _query
Feb 01 09:52:34 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:34Z|00048|binding|INFO|Claiming lport 96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa for this chassis.
Feb 01 09:52:34 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:34Z|00049|binding|INFO|96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa: Claiming fa:16:3e:6e:4d:83 10.100.0.11
Feb 01 09:52:34 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:34Z|00050|binding|INFO|Claiming lport d16170e5-2dd1-4d5e-a380-5344cdba0aa7 for this chassis.
Feb 01 09:52:34 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:34Z|00051|binding|INFO|d16170e5-2dd1-4d5e-a380-5344cdba0aa7: Claiming fa:16:3e:db:2d:9c 19.80.0.33
Feb 01 09:52:34 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:34Z|00052|binding|INFO|Setting lport 96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa up in Southbound
Feb 01 09:52:34 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:34Z|00053|binding|INFO|Setting lport d16170e5-2dd1-4d5e-a380-5344cdba0aa7 up in Southbound
Feb 01 09:52:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:34.551 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:2d:9c 19.80.0.33'], port_security=['fa:16:3e:db:2d:9c 19.80.0.33'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-491001553', 'neutron:cidrs': '19.80.0.33/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c0246b2-3507-4017-b8dd-01251187a6c3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-491001553', 'neutron:project_id': 'ebe5e345d591408fa955b2e811bfaffb', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'f05aaf36-904c-44ae-a203-34e61744db7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=722e7a10-7816-489f-9516-bc350daf9fce, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=d16170e5-2dd1-4d5e-a380-5344cdba0aa7) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:34.554 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:4d:83 10.100.0.11'], port_security=['fa:16:3e:6e:4d:83 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-377096059', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5aefea54-941a-48bf-ad9e-7f13fdfdb4ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01cb494b-1310-460f-acbe-602aefea39c6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-377096059', 'neutron:project_id': 'ebe5e345d591408fa955b2e811bfaffb', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'f05aaf36-904c-44ae-a203-34e61744db7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604213.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ae7d4c2f-1d19-4933-99fa-b8aa62feb38e, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:34.556 158655 INFO neutron.agent.ovn.metadata.agent [-] Port d16170e5-2dd1-4d5e-a380-5344cdba0aa7 in datapath 9c0246b2-3507-4017-b8dd-01251187a6c3 bound to our chassis
Feb 01 09:52:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:34.560 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2f157b64-12ad-48f6-bd1f-788194f131e8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:52:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:34.561 158655 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9c0246b2-3507-4017-b8dd-01251187a6c3
Feb 01 09:52:34 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:52:34.772 2 WARNING neutron.plugins.ml2.drivers.mech_sriov.agent.sriov_nic_agent [req-7ebd8498-3f44-4651-9102-d6c4dae99d3c req-210d032b-cdb2-4687-85bd-53137bd4893b 0156acb7bf9847849608ca90a8674720 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] This port is not SRIOV, skip binding for port 96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa.
Feb 01 09:52:34 np0005604215.localdomain ceph-mon[298604]: osdmap e94: 6 total, 6 up, 6 in
Feb 01 09:52:34 np0005604215.localdomain ceph-mon[298604]: pgmap v103: 177 pgs: 177 active+clean; 292 MiB data, 987 MiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 5.0 MiB/s wr, 164 op/s
Feb 01 09:52:34 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2572148168' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:52:34 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2572148168' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:52:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:34.972 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[14e41bcd-43c2-4679-99e7-973c272af255]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:34.974 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9c0246b2-31 in ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 01 09:52:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:34.975 303130 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9c0246b2-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 01 09:52:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:34.975 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[571caa3d-dfca-4ff5-b644-558f2f8425a7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:34.977 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[1d553cdf-dfe3-475b-8719-bbe284a0c0b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:34.998 158836 DEBUG oslo.privsep.daemon [-] privsep: reply[13e64cd0-b4ea-43c0-bd14-ffc38f3b64c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:35.009 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[edc1753c-5368-42c4-9cfd-284bb2424772]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:35.011 158655 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp27n_5egp/privsep.sock']
Feb 01 09:52:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:35.061 274321 INFO nova.compute.manager [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Post operation of migration started
Feb 01 09:52:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e94 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:35.228 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Acquiring lock "refresh_cache-5aefea54-941a-48bf-ad9e-7f13fdfdb4ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:52:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:35.229 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Acquired lock "refresh_cache-5aefea54-941a-48bf-ad9e-7f13fdfdb4ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:52:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:35.229 274321 DEBUG nova.network.neutron [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 01 09:52:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:35.625 158655 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap
Feb 01 09:52:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:35.626 158655 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp27n_5egp/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362
Feb 01 09:52:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:35.519 304214 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 01 09:52:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:35.525 304214 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 01 09:52:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:35.528 304214 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 01 09:52:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:35.529 304214 INFO oslo.privsep.daemon [-] privsep daemon running as pid 304214
Feb 01 09:52:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:35.630 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[4d59efa9-45c2-4067-b997-c11d80440add]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e95 e95: 6 total, 6 up, 6 in
Feb 01 09:52:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:35.857 274321 DEBUG nova.network.neutron [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Updating instance_info_cache with network_info: [{"id": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "address": "fa:16:3e:6e:4d:83", "network": {"id": "01cb494b-1310-460f-acbe-602aefea39c6", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1791362587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "ebe5e345d591408fa955b2e811bfaffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aeb3a2-ba", "ovs_interfaceid": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 01 09:52:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:35.986 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Releasing lock "refresh_cache-5aefea54-941a-48bf-ad9e-7f13fdfdb4ed" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:52:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:36.042 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:36.043 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:36.043 274321 DEBUG oslo_concurrency.lockutils [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:36.051 274321 INFO nova.virt.libvirt.driver [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Sending announce-self command to QEMU monitor. Attempt 1 of 3
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.051 304214 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.052 304214 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.052 304214 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:36 np0005604215.localdomain virtqemud[224673]: Domain id=1 name='instance-00000007' uuid=5aefea54-941a-48bf-ad9e-7f13fdfdb4ed is tainted: custom-monitor
Feb 01 09:52:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v105: 177 pgs: 177 active+clean; 292 MiB data, 987 MiB used, 41 GiB / 42 GiB avail; 852 KiB/s rd, 6.3 MiB/s wr, 161 op/s
Feb 01 09:52:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:36.226 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:36 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:52:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.545 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[e7eea43a-4de5-4d2d-bfad-9f059f5ac240]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:36 np0005604215.localdomain systemd-udevd[304224]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:52:36 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939556.5768] manager: (tap9c0246b2-30): new Veth device (/org/freedesktop/NetworkManager/Devices/17)
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.571 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[7626a49d-ed00-48c4-966a-074d73e1cfe5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.614 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[baa897f7-8275-4887-9a38-1a343785b2e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.618 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[c4f3573a-338a-4695-8ff7-7161b5154b30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:36 np0005604215.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9c0246b2-31: link becomes ready
Feb 01 09:52:36 np0005604215.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9c0246b2-30: link becomes ready
Feb 01 09:52:36 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939556.6387] device (tap9c0246b2-30): carrier: link connected
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.642 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[9bfa03a8-535c-4500-9e43-66f76ee3ad17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.662 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[5ee79063-000d-450c-9117-4a2c09da834d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c0246b2-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:90:aa:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164074, 'reachable_time': 15046, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304244, 'error': None, 'target': 'ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.678 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ebba61ea-027e-49b9-9332-1d942e8bf1cf]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe90:aa37'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1164074, 'tstamp': 1164074}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304245, 'error': None, 'target': 'ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.695 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[7fecdd7f-048a-4dfa-9b35-f6fe39545552]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9c0246b2-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:90:aa:37'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164074, 'reachable_time': 15046, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 304246, 'error': None, 'target': 'ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.722 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[68714008-38b9-42ec-8c3d-93ab475d3ac4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.765 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[651633e6-78d6-4c0b-9d5d-7a2568a62750]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.767 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c0246b2-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.769 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.770 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9c0246b2-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:36 np0005604215.localdomain kernel: device tap9c0246b2-30 entered promiscuous mode
Feb 01 09:52:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:36.776 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:36.780 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.780 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9c0246b2-30, col_values=(('external_ids', {'iface-id': '8e91955e-c3fb-4309-8605-7dae9ca4cd95'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:36.783 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:36Z|00054|binding|INFO|Releasing lport 8e91955e-c3fb-4309-8605-7dae9ca4cd95 from this chassis (sb_readonly=0)
Feb 01 09:52:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:36.787 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.788 158655 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9c0246b2-3507-4017-b8dd-01251187a6c3.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9c0246b2-3507-4017-b8dd-01251187a6c3.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.789 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[444abee7-bd02-4fef-b2cc-c5b31a65cc79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.790 158655 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: global
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     log         /dev/log local0 debug
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     log-tag     haproxy-metadata-proxy-9c0246b2-3507-4017-b8dd-01251187a6c3
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     user        root
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     group       root
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     maxconn     1024
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     pidfile     /var/lib/neutron/external/pids/9c0246b2-3507-4017-b8dd-01251187a6c3.pid.haproxy
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     daemon
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: defaults
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     log global
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     mode http
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     option httplog
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     option dontlognull
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     option http-server-close
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     option forwardfor
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     retries                 3
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout http-request    30s
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout connect         30s
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout client          32s
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout server          32s
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout http-keep-alive 30s
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: listen listener
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     bind 169.254.169.254:80
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     server metadata /var/lib/neutron/metadata_proxy
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:     http-request add-header X-OVN-Network-ID 9c0246b2-3507-4017-b8dd-01251187a6c3
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 01 09:52:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:36.790 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:36 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:36.791 158655 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3', 'env', 'PROCESS_TAG=haproxy-9c0246b2-3507-4017-b8dd-01251187a6c3', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9c0246b2-3507-4017-b8dd-01251187a6c3.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 01 09:52:36 np0005604215.localdomain ceph-mon[298604]: osdmap e95: 6 total, 6 up, 6 in
Feb 01 09:52:36 np0005604215.localdomain ceph-mon[298604]: pgmap v105: 177 pgs: 177 active+clean; 292 MiB data, 987 MiB used, 41 GiB / 42 GiB avail; 852 KiB/s rd, 6.3 MiB/s wr, 161 op/s
Feb 01 09:52:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:52:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:37.061 274321 INFO nova.virt.libvirt.driver [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Sending announce-self command to QEMU monitor. Attempt 2 of 3
Feb 01 09:52:37 np0005604215.localdomain podman[304279]: 
Feb 01 09:52:37 np0005604215.localdomain podman[304279]: 2026-02-01 09:52:37.265186705 +0000 UTC m=+0.089766069 container create 905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 01 09:52:37 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:37Z|00055|binding|INFO|Releasing lport 8e91955e-c3fb-4309-8605-7dae9ca4cd95 from this chassis (sb_readonly=0)
Feb 01 09:52:37 np0005604215.localdomain podman[304279]: 2026-02-01 09:52:37.223586088 +0000 UTC m=+0.048165502 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 01 09:52:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:37.344 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:37 np0005604215.localdomain systemd[1]: Started libpod-conmon-905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb.scope.
Feb 01 09:52:37 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:52:37 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b99c826e4ffdecd670f48080610e81d3245f462deda0b0580ae2ad15e879a9a8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:52:37 np0005604215.localdomain podman[304279]: 2026-02-01 09:52:37.377660521 +0000 UTC m=+0.202239895 container init 905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 09:52:37 np0005604215.localdomain podman[304279]: 2026-02-01 09:52:37.38761524 +0000 UTC m=+0.212194604 container start 905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:52:37 np0005604215.localdomain neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3[304293]: [NOTICE]   (304297) : New worker (304299) forked
Feb 01 09:52:37 np0005604215.localdomain neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3[304293]: [NOTICE]   (304297) : Loading success.
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.433 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa in datapath 01cb494b-1310-460f-acbe-602aefea39c6 unbound from our chassis
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.436 158655 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 01cb494b-1310-460f-acbe-602aefea39c6
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.446 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[376e2b97-12ed-40cc-9805-f3798ee446c8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.447 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap01cb494b-11 in ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.449 303130 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap01cb494b-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.449 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[5f0ca9c6-6b8b-4981-97b4-da4245e774fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.451 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[88712fa3-c102-490e-b6bc-04d3c2f66594]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.468 158836 DEBUG oslo.privsep.daemon [-] privsep: reply[8d4df313-be1f-461c-b5ba-1e582d1d6739]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.478 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[82598059-0d9b-4301-8235-8888046ea1ab]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.501 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[a1b368e4-b2c6-4eb9-a171-a924ff267b95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.506 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[326e019d-09d8-4052-9992-483de85fa64a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939557.5081] manager: (tap01cb494b-10): new Veth device (/org/freedesktop/NetworkManager/Devices/18)
Feb 01 09:52:37 np0005604215.localdomain systemd-udevd[304223]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.536 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[c3827b55-1cc3-4801-8d67-34eab1e5e586]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.539 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[91aa3261-a068-4745-8bfd-e16263dc866e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap01cb494b-10: link becomes ready
Feb 01 09:52:37 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939557.5575] device (tap01cb494b-10): carrier: link connected
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.562 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[7b9dd0d7-fe4d-4712-a0f6-ff605c625360]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.578 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[741e624d-414d-4a70-ae7c-af5b90887f87]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01cb494b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:32:98:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164166, 'reachable_time': 44924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304319, 'error': None, 'target': 'ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.593 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[41d40fef-2207-4cf3-a0ee-94fb99fd937f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe32:9873'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1164166, 'tstamp': 1164166}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 304321, 'error': None, 'target': 'ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.610 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[dd4050d4-7be4-4645-a22d-66f4c91dafbc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap01cb494b-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:32:98:73'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 18], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164166, 'reachable_time': 44924, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 304322, 'error': None, 'target': 'ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.637 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[b0be199a-26d4-4996-927d-5c797e758c2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.693 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[3f597552-4456-407b-b332-545f9e286f06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.694 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01cb494b-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.695 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.695 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap01cb494b-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:37.698 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:37 np0005604215.localdomain kernel: device tap01cb494b-10 entered promiscuous mode
Feb 01 09:52:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:37.701 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.702 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap01cb494b-10, col_values=(('external_ids', {'iface-id': '6efa26b8-94b4-4ffe-b212-c7bedef06410'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:37.704 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:37 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:37Z|00056|binding|INFO|Releasing lport 6efa26b8-94b4-4ffe-b212-c7bedef06410 from this chassis (sb_readonly=0)
Feb 01 09:52:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:37.714 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.716 158655 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/01cb494b-1310-460f-acbe-602aefea39c6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/01cb494b-1310-460f-acbe-602aefea39c6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.717 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf7c2f9-2105-47e9-8b0a-85a20806e78a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.718 158655 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: global
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     log         /dev/log local0 debug
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     log-tag     haproxy-metadata-proxy-01cb494b-1310-460f-acbe-602aefea39c6
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     user        root
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     group       root
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     maxconn     1024
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     pidfile     /var/lib/neutron/external/pids/01cb494b-1310-460f-acbe-602aefea39c6.pid.haproxy
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     daemon
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: defaults
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     log global
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     mode http
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     option httplog
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     option dontlognull
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     option http-server-close
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     option forwardfor
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     retries                 3
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout http-request    30s
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout connect         30s
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout client          32s
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout server          32s
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout http-keep-alive 30s
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: listen listener
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     bind 169.254.169.254:80
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     server metadata /var/lib/neutron/metadata_proxy
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:     http-request add-header X-OVN-Network-ID 01cb494b-1310-460f-acbe-602aefea39c6
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 01 09:52:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:37.719 158655 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6', 'env', 'PROCESS_TAG=haproxy-01cb494b-1310-460f-acbe-602aefea39c6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/01cb494b-1310-460f-acbe-602aefea39c6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 01 09:52:37 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e96 e96: 6 total, 6 up, 6 in
Feb 01 09:52:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:38.070 274321 INFO nova.virt.libvirt.driver [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Sending announce-self command to QEMU monitor. Attempt 3 of 3
Feb 01 09:52:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:38.077 274321 DEBUG nova.compute.manager [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:52:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:38.097 274321 DEBUG nova.objects.instance [None req-7ebd8498-3f44-4651-9102-d6c4dae99d3c 1c20a58be3994701970a12462e33ab8c 840578c6ea7d45ab96b8ea958c57962b - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032
Feb 01 09:52:38 np0005604215.localdomain podman[304369]: 
Feb 01 09:52:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v107: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 9.2 MiB/s rd, 16 MiB/s wr, 472 op/s
Feb 01 09:52:38 np0005604215.localdomain dnsmasq[303862]: exiting on receipt of SIGTERM
Feb 01 09:52:38 np0005604215.localdomain podman[304382]: 2026-02-01 09:52:38.161445772 +0000 UTC m=+0.061827028 container kill 1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a91bc36-a078-4e5e-bd8f-3f791a7ad269, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Feb 01 09:52:38 np0005604215.localdomain systemd[1]: libpod-1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38.scope: Deactivated successfully.
Feb 01 09:52:38 np0005604215.localdomain podman[304369]: 2026-02-01 09:52:38.203378269 +0000 UTC m=+0.136288929 container create b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:52:38 np0005604215.localdomain podman[304369]: 2026-02-01 09:52:38.108754279 +0000 UTC m=+0.041664959 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 01 09:52:38 np0005604215.localdomain podman[304397]: 2026-02-01 09:52:38.228621736 +0000 UTC m=+0.056705669 container died 1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a91bc36-a078-4e5e-bd8f-3f791a7ad269, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:52:38 np0005604215.localdomain systemd[1]: Started libpod-conmon-b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c.scope.
Feb 01 09:52:38 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:52:38 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ae23c46f93dd79275de77520a02b78359c7b354b0c2cd55b4d41f11cf0d08430/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:52:38 np0005604215.localdomain podman[304397]: 2026-02-01 09:52:38.260160109 +0000 UTC m=+0.088244032 container cleanup 1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a91bc36-a078-4e5e-bd8f-3f791a7ad269, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:52:38 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-44b6a4c8167cba5b5fbdfbf9820bb6cd4a6fbd5e72379076f4e21cd139706606-merged.mount: Deactivated successfully.
Feb 01 09:52:38 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38-userdata-shm.mount: Deactivated successfully.
Feb 01 09:52:38 np0005604215.localdomain systemd[1]: libpod-conmon-1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38.scope: Deactivated successfully.
Feb 01 09:52:38 np0005604215.localdomain podman[304404]: 2026-02-01 09:52:38.293497468 +0000 UTC m=+0.108141802 container remove 1b5f4b7c595945d1b4fdef3bd6ed5ad2e17597e59d7c07eff640617e98913e38 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-1a91bc36-a078-4e5e-bd8f-3f791a7ad269, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:52:38 np0005604215.localdomain podman[304369]: 2026-02-01 09:52:38.318364933 +0000 UTC m=+0.251275583 container init b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 01 09:52:38 np0005604215.localdomain systemd[1]: tmp-crun.PVuFqE.mount: Deactivated successfully.
Feb 01 09:52:38 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d1a91bc36\x2da078\x2d4e5e\x2dbd8f\x2d3f791a7ad269.mount: Deactivated successfully.
Feb 01 09:52:38 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:38.339 259225 INFO neutron.agent.dhcp.agent [None req-eb142ce4-748c-4425-8127-dae17b9bce2e - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:52:38 np0005604215.localdomain podman[304369]: 2026-02-01 09:52:38.340145362 +0000 UTC m=+0.273056012 container start b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 09:52:38 np0005604215.localdomain neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6[304425]: [NOTICE]   (304433) : New worker (304435) forked
Feb 01 09:52:38 np0005604215.localdomain neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6[304425]: [NOTICE]   (304433) : Loading success.
Feb 01 09:52:38 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:38.514 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:52:38 np0005604215.localdomain ceph-mon[298604]: osdmap e96: 6 total, 6 up, 6 in
Feb 01 09:52:38 np0005604215.localdomain ceph-mon[298604]: pgmap v107: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 9.2 MiB/s rd, 16 MiB/s wr, 472 op/s
Feb 01 09:52:38 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3835095047' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e97 e97: 6 total, 6 up, 6 in
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.108 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.311 274321 DEBUG nova.compute.manager [req-de846797-8fb6-4cb8-8194-1db4a623ccb8 req-69e52d04-804d-4ce3-bb97-16a7251e078e 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Received event network-vif-plugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.312 274321 DEBUG oslo_concurrency.lockutils [req-de846797-8fb6-4cb8-8194-1db4a623ccb8 req-69e52d04-804d-4ce3-bb97-16a7251e078e 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.312 274321 DEBUG oslo_concurrency.lockutils [req-de846797-8fb6-4cb8-8194-1db4a623ccb8 req-69e52d04-804d-4ce3-bb97-16a7251e078e 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.313 274321 DEBUG oslo_concurrency.lockutils [req-de846797-8fb6-4cb8-8194-1db4a623ccb8 req-69e52d04-804d-4ce3-bb97-16a7251e078e 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.313 274321 DEBUG nova.compute.manager [req-de846797-8fb6-4cb8-8194-1db4a623ccb8 req-69e52d04-804d-4ce3-bb97-16a7251e078e 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] No waiting events found dispatching network-vif-plugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.313 274321 WARNING nova.compute.manager [req-de846797-8fb6-4cb8-8194-1db4a623ccb8 req-69e52d04-804d-4ce3-bb97-16a7251e078e 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Received unexpected event network-vif-plugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa for instance with vm_state active and task_state None.
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.892 274321 DEBUG oslo_concurrency.lockutils [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Acquiring lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.892 274321 DEBUG oslo_concurrency.lockutils [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.893 274321 DEBUG oslo_concurrency.lockutils [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Acquiring lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.893 274321 DEBUG oslo_concurrency.lockutils [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.893 274321 DEBUG oslo_concurrency.lockutils [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.895 274321 INFO nova.compute.manager [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Terminating instance
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.896 274321 DEBUG nova.compute.manager [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 01 09:52:39 np0005604215.localdomain ceph-mon[298604]: osdmap e97: 6 total, 6 up, 6 in
Feb 01 09:52:39 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3976685752' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:39 np0005604215.localdomain kernel: device tap96aeb3a2-ba left promiscuous mode
Feb 01 09:52:39 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939559.9794] device (tap96aeb3a2-ba): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.987 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:39 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:39Z|00057|binding|INFO|Releasing lport 96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa from this chassis (sb_readonly=0)
Feb 01 09:52:39 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:39Z|00058|binding|INFO|Setting lport 96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa down in Southbound
Feb 01 09:52:39 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:39Z|00059|binding|INFO|Releasing lport d16170e5-2dd1-4d5e-a380-5344cdba0aa7 from this chassis (sb_readonly=0)
Feb 01 09:52:39 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:39Z|00060|binding|INFO|Setting lport d16170e5-2dd1-4d5e-a380-5344cdba0aa7 down in Southbound
Feb 01 09:52:39 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:39Z|00061|binding|INFO|Removing iface tap96aeb3a2-ba ovn-installed in OVS
Feb 01 09:52:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:39.990 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:39 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:39Z|00062|binding|INFO|Releasing lport 6efa26b8-94b4-4ffe-b212-c7bedef06410 from this chassis (sb_readonly=0)
Feb 01 09:52:39 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:39Z|00063|binding|INFO|Releasing lport 8e91955e-c3fb-4309-8605-7dae9ca4cd95 from this chassis (sb_readonly=0)
Feb 01 09:52:40 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:40Z|00064|ovn_bfd|INFO|Disabled BFD on interface ovn-2186fb-0
Feb 01 09:52:40 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:40Z|00065|ovn_bfd|INFO|Disabled BFD on interface ovn-e1cc33-0
Feb 01 09:52:40 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:40Z|00066|ovn_bfd|INFO|Disabled BFD on interface ovn-45aa31-0
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.006 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:39.999 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:db:2d:9c 19.80.0.33'], port_security=['fa:16:3e:db:2d:9c 19.80.0.33'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-491001553', 'neutron:cidrs': '19.80.0.33/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c0246b2-3507-4017-b8dd-01251187a6c3', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-491001553', 'neutron:project_id': 'ebe5e345d591408fa955b2e811bfaffb', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'f05aaf36-904c-44ae-a203-34e61744db7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=722e7a10-7816-489f-9516-bc350daf9fce, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[], logical_port=d16170e5-2dd1-4d5e-a380-5344cdba0aa7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.002 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:4d:83 10.100.0.11'], port_security=['fa:16:3e:6e:4d:83 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-377096059', 'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '5aefea54-941a-48bf-ad9e-7f13fdfdb4ed', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-01cb494b-1310-460f-acbe-602aefea39c6', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-377096059', 'neutron:project_id': 'ebe5e345d591408fa955b2e811bfaffb', 'neutron:revision_number': '12', 'neutron:security_group_ids': 'f05aaf36-904c-44ae-a203-34e61744db7d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ae7d4c2f-1d19-4933-99fa-b8aa62feb38e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.004 158655 INFO neutron.agent.ovn.metadata.agent [-] Port d16170e5-2dd1-4d5e-a380-5344cdba0aa7 in datapath 9c0246b2-3507-4017-b8dd-01251187a6c3 unbound from our chassis
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.008 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port 2f157b64-12ad-48f6-bd1f-788194f131e8 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.008 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c0246b2-3507-4017-b8dd-01251187a6c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.009 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[4f3b292a-8ebb-4cf5-9937-bfb38adf361d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.010 158655 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3 namespace which is not needed anymore
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.012 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:40 np0005604215.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000007.scope: Deactivated successfully.
Feb 01 09:52:40 np0005604215.localdomain systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000007.scope: Consumed 1.490s CPU time.
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.046 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:40 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:40Z|00067|binding|INFO|Releasing lport 6efa26b8-94b4-4ffe-b212-c7bedef06410 from this chassis (sb_readonly=0)
Feb 01 09:52:40 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:40Z|00068|binding|INFO|Releasing lport 8e91955e-c3fb-4309-8605-7dae9ca4cd95 from this chassis (sb_readonly=0)
Feb 01 09:52:40 np0005604215.localdomain systemd-machined[202466]: Machine qemu-1-instance-00000007 terminated.
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.052 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e97 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.132 274321 INFO nova.virt.libvirt.driver [-] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Instance destroyed successfully.
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.133 274321 DEBUG nova.objects.instance [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Lazy-loading 'resources' on Instance uuid 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:52:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v109: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 8.0 MiB/s rd, 7.9 MiB/s wr, 256 op/s
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.150 274321 DEBUG nova.virt.libvirt.vif [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2026-02-01T09:52:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-328365138',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005604215.localdomain',hostname='tempest-liveautoblockmigrationv225test-server-328365138',id=7,image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-01T09:52:21Z,launched_on='np0005604213.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='np0005604215.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='ebe5e345d591408fa955b2e811bfaffb',ramdisk_id='',reservation_id='r-hz7zc7vw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image
_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1924784790',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1924784790-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-01T09:52:38Z,user_data=None,user_id='336655b6a22d4371b0a5cd24b959dc9a',uuid=5aefea54-941a-48bf-ad9e-7f13fdfdb4ed,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "address": "fa:16:3e:6e:4d:83", "network": {"id": "01cb494b-1310-460f-acbe-602aefea39c6", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1791362587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "ebe5e345d591408fa955b2e811bfaffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aeb3a2-ba", "ovs_interfaceid": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.150 274321 DEBUG nova.network.os_vif_util [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Converting VIF {"id": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "address": "fa:16:3e:6e:4d:83", "network": {"id": "01cb494b-1310-460f-acbe-602aefea39c6", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-1791362587-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "ebe5e345d591408fa955b2e811bfaffb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap96aeb3a2-ba", "ovs_interfaceid": "96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.151 274321 DEBUG nova.network.os_vif_util [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:4d:83,bridge_name='br-int',has_traffic_filtering=True,id=96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa,network=Network(01cb494b-1310-460f-acbe-602aefea39c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96aeb3a2-ba') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.152 274321 DEBUG os_vif [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:4d:83,bridge_name='br-int',has_traffic_filtering=True,id=96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa,network=Network(01cb494b-1310-460f-acbe-602aefea39c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96aeb3a2-ba') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.154 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.155 274321 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap96aeb3a2-ba, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.157 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.158 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.161 274321 INFO os_vif [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:4d:83,bridge_name='br-int',has_traffic_filtering=True,id=96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa,network=Network(01cb494b-1310-460f-acbe-602aefea39c6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap96aeb3a2-ba')
Feb 01 09:52:40 np0005604215.localdomain neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3[304293]: [NOTICE]   (304297) : haproxy version is 2.8.14-c23fe91
Feb 01 09:52:40 np0005604215.localdomain neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3[304293]: [NOTICE]   (304297) : path to executable is /usr/sbin/haproxy
Feb 01 09:52:40 np0005604215.localdomain neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3[304293]: [WARNING]  (304297) : Exiting Master process...
Feb 01 09:52:40 np0005604215.localdomain neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3[304293]: [WARNING]  (304297) : Exiting Master process...
Feb 01 09:52:40 np0005604215.localdomain neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3[304293]: [ALERT]    (304297) : Current worker (304299) exited with code 143 (Terminated)
Feb 01 09:52:40 np0005604215.localdomain neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3[304293]: [WARNING]  (304297) : All workers exited. Exiting... (0)
Feb 01 09:52:40 np0005604215.localdomain systemd[1]: libpod-905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb.scope: Deactivated successfully.
Feb 01 09:52:40 np0005604215.localdomain podman[304469]: 2026-02-01 09:52:40.203607979 +0000 UTC m=+0.081603486 container died 905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:52:40 np0005604215.localdomain podman[304469]: 2026-02-01 09:52:40.244551424 +0000 UTC m=+0.122546891 container cleanup 905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 01 09:52:40 np0005604215.localdomain podman[304508]: 2026-02-01 09:52:40.282491737 +0000 UTC m=+0.066441102 container cleanup 905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:52:40 np0005604215.localdomain systemd[1]: libpod-conmon-905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb.scope: Deactivated successfully.
Feb 01 09:52:40 np0005604215.localdomain podman[304525]: 2026-02-01 09:52:40.356771632 +0000 UTC m=+0.090951765 container remove 905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.362 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[2b1c2b3e-8c0f-46b9-a708-6c9e22bc782b]: (4, ('Sun Feb  1 09:52:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3 (905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb)\n905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb\nSun Feb  1 09:52:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3 (905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb)\n905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.363 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[fb899725-e239-4edd-b40b-881d60ced2b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.364 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9c0246b2-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.366 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:40 np0005604215.localdomain kernel: device tap9c0246b2-30 left promiscuous mode
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.376 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.380 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[139ae1d7-6502-40ff-8b99-4f1127d2edf3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.390 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[7f52f8d9-76bb-40e8-a621-06af7cd46539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.391 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[1406f483-0344-4366-8c6d-c2cb10d9090c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.408 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[8233a994-7653-4bcb-8e21-396d817e2c4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164064, 'reachable_time': 41213, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304543, 'error': None, 'target': 'ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.418 158836 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9c0246b2-3507-4017-b8dd-01251187a6c3 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.419 158836 DEBUG oslo.privsep.daemon [-] privsep: reply[6d96ccd3-a7b5-4b46-b667-8a7bb80d469c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.419 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa in datapath 01cb494b-1310-460f-acbe-602aefea39c6 unbound from our chassis
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.421 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 01cb494b-1310-460f-acbe-602aefea39c6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.422 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[d72da791-354a-4918-b555-262f9aa6a035]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.422 158655 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6 namespace which is not needed anymore
Feb 01 09:52:40 np0005604215.localdomain neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6[304425]: [NOTICE]   (304433) : haproxy version is 2.8.14-c23fe91
Feb 01 09:52:40 np0005604215.localdomain neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6[304425]: [NOTICE]   (304433) : path to executable is /usr/sbin/haproxy
Feb 01 09:52:40 np0005604215.localdomain neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6[304425]: [WARNING]  (304433) : Exiting Master process...
Feb 01 09:52:40 np0005604215.localdomain neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6[304425]: [ALERT]    (304433) : Current worker (304435) exited with code 143 (Terminated)
Feb 01 09:52:40 np0005604215.localdomain neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6[304425]: [WARNING]  (304433) : All workers exited. Exiting... (0)
Feb 01 09:52:40 np0005604215.localdomain systemd[1]: libpod-b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c.scope: Deactivated successfully.
Feb 01 09:52:40 np0005604215.localdomain podman[304562]: 2026-02-01 09:52:40.619999367 +0000 UTC m=+0.076367091 container died b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:52:40 np0005604215.localdomain podman[304562]: 2026-02-01 09:52:40.667171758 +0000 UTC m=+0.123539452 container cleanup b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:52:40 np0005604215.localdomain podman[304576]: 2026-02-01 09:52:40.687223073 +0000 UTC m=+0.065879804 container cleanup b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 01 09:52:40 np0005604215.localdomain systemd[1]: libpod-conmon-b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c.scope: Deactivated successfully.
Feb 01 09:52:40 np0005604215.localdomain podman[304591]: 2026-02-01 09:52:40.766750112 +0000 UTC m=+0.078261520 container remove b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.771 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[d21b1b89-d9eb-484d-ab43-f1c67eda1b96]: (4, ('Sun Feb  1 09:52:40 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6 (b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c)\nb9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c\nSun Feb  1 09:52:40 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6 (b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c)\nb9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.773 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[3a15d205-ff39-43b7-b7e2-cf05220b0e08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.774 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap01cb494b-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.776 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:40 np0005604215.localdomain kernel: device tap01cb494b-10 left promiscuous mode
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.783 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.786 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[054674a1-652a-4cb2-84e8-05c3fd7e3d2e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.800 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[728bfe85-1294-4326-8ebc-906e073e0df1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.802 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[c21b413b-3622-4c0b-8a65-ccd500d97c26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.819 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[f9d57a2b-0763-4ab7-b3bd-119fdbbf1a7c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1164160, 'reachable_time': 27069, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 304608, 'error': None, 'target': 'ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.820 158836 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-01cb494b-1310-460f-acbe-602aefea39c6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 01 09:52:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:40.821 158836 DEBUG oslo.privsep.daemon [-] privsep: reply[6d42a85c-dcab-4b23-a14d-5a89d8e54f5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.931 274321 INFO nova.virt.libvirt.driver [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Deleting instance files /var/lib/nova/instances/5aefea54-941a-48bf-ad9e-7f13fdfdb4ed_del
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.934 274321 INFO nova.virt.libvirt.driver [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Deletion of /var/lib/nova/instances/5aefea54-941a-48bf-ad9e-7f13fdfdb4ed_del complete
Feb 01 09:52:40 np0005604215.localdomain ceph-mon[298604]: pgmap v109: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 8.0 MiB/s rd, 7.9 MiB/s wr, 256 op/s
Feb 01 09:52:40 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/4064514450' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.992 274321 DEBUG nova.virt.libvirt.host [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.993 274321 INFO nova.virt.libvirt.host [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] UEFI support detected
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.994 274321 INFO nova.compute.manager [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Took 1.10 seconds to destroy the instance on the hypervisor.
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.995 274321 DEBUG oslo.service.loopingcall [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.996 274321 DEBUG nova.compute.manager [-] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 01 09:52:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:40.996 274321 DEBUG nova.network.neutron [-] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 01 09:52:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ae23c46f93dd79275de77520a02b78359c7b354b0c2cd55b4d41f11cf0d08430-merged.mount: Deactivated successfully.
Feb 01 09:52:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9c3ff5d69e08cd7b2074869c3a668f42c533f574c855ca03e98330e4906679c-userdata-shm.mount: Deactivated successfully.
Feb 01 09:52:41 np0005604215.localdomain systemd[1]: run-netns-ovnmeta\x2d01cb494b\x2d1310\x2d460f\x2dacbe\x2d602aefea39c6.mount: Deactivated successfully.
Feb 01 09:52:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-b99c826e4ffdecd670f48080610e81d3245f462deda0b0580ae2ad15e879a9a8-merged.mount: Deactivated successfully.
Feb 01 09:52:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-905eac83e0bd4743dba3e7c52cadf071fcd55b92c98c70803043438547792bbb-userdata-shm.mount: Deactivated successfully.
Feb 01 09:52:41 np0005604215.localdomain systemd[1]: run-netns-ovnmeta\x2d9c0246b2\x2d3507\x2d4017\x2db8dd\x2d01251187a6c3.mount: Deactivated successfully.
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.256 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.351 274321 DEBUG nova.compute.manager [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Received event network-vif-plugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.352 274321 DEBUG oslo_concurrency.lockutils [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.352 274321 DEBUG oslo_concurrency.lockutils [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.353 274321 DEBUG oslo_concurrency.lockutils [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.353 274321 DEBUG nova.compute.manager [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] No waiting events found dispatching network-vif-plugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.353 274321 WARNING nova.compute.manager [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Received unexpected event network-vif-plugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa for instance with vm_state active and task_state deleting.
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.353 274321 DEBUG nova.compute.manager [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Received event network-vif-unplugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.354 274321 DEBUG oslo_concurrency.lockutils [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.354 274321 DEBUG oslo_concurrency.lockutils [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.354 274321 DEBUG oslo_concurrency.lockutils [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.355 274321 DEBUG nova.compute.manager [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] No waiting events found dispatching network-vif-unplugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.355 274321 DEBUG nova.compute.manager [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Received event network-vif-unplugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.355 274321 DEBUG nova.compute.manager [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Received event network-vif-plugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.356 274321 DEBUG oslo_concurrency.lockutils [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.356 274321 DEBUG oslo_concurrency.lockutils [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.356 274321 DEBUG oslo_concurrency.lockutils [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.357 274321 DEBUG nova.compute.manager [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] No waiting events found dispatching network-vif-plugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 01 09:52:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:41.357 274321 WARNING nova.compute.manager [req-1ef95c59-cca0-475a-b8eb-6da6125e9b91 req-d00d56f1-388f-4854-a762-91e8ef9504ba 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Received unexpected event network-vif-plugged-96aeb3a2-ba77-4c7e-afdb-ef57beaf09fa for instance with vm_state active and task_state deleting.
Feb 01 09:52:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e98 e98: 6 total, 6 up, 6 in
Feb 01 09:52:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:41.771 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:41.771 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:41.771 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v111: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 8.0 MiB/s rd, 7.9 MiB/s wr, 256 op/s
Feb 01 09:52:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:42.392 274321 DEBUG nova.network.neutron [-] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 01 09:52:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:42.423 274321 INFO nova.compute.manager [-] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Took 1.43 seconds to deallocate network for instance.
Feb 01 09:52:42 np0005604215.localdomain ceph-mon[298604]: osdmap e98: 6 total, 6 up, 6 in
Feb 01 09:52:42 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2017372018' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:42 np0005604215.localdomain ceph-mon[298604]: pgmap v111: 177 pgs: 177 active+clean; 383 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 8.0 MiB/s rd, 7.9 MiB/s wr, 256 op/s
Feb 01 09:52:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:42.483 274321 DEBUG oslo_concurrency.lockutils [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:42.484 274321 DEBUG oslo_concurrency.lockutils [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:42.487 274321 DEBUG oslo_concurrency.lockutils [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:42.527 274321 INFO nova.scheduler.client.report [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Deleted allocations for instance 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed
Feb 01 09:52:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:42.599 274321 DEBUG oslo_concurrency.lockutils [None req-37ef1d53-7a9b-45f3-b855-6e641547f320 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Lock "5aefea54-941a-48bf-ad9e-7f13fdfdb4ed" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:42 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:52:42.623 2 INFO neutron.agent.securitygroups_rpc [None req-c376665f-557d-4fc0-a2ce-3b9dbb425e99 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Security group member updated ['3c3daae5-f0f3-42a8-b893-8c534dcb0055']
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: Stopping User Manager for UID 42436...
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Activating special unit Exit the Session...
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Stopped target Main User Target.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Stopped target Basic System.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Stopped target Paths.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Stopped target Sockets.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Stopped target Timers.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Stopped Mark boot as successful after the user session has run 2 minutes.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Stopped Daily Cleanup of User's Temporary Directories.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Closed D-Bus User Message Bus Socket.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Stopped Create User's Volatile Files and Directories.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Removed slice User Application Slice.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Reached target Shutdown.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Finished Exit the Session.
Feb 01 09:52:43 np0005604215.localdomain systemd[304043]: Reached target Exit the Session.
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: user@42436.service: Deactivated successfully.
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: Stopped User Manager for UID 42436.
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: Stopping User Runtime Directory /run/user/42436...
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: run-user-42436.mount: Deactivated successfully.
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: user-runtime-dir@42436.service: Deactivated successfully.
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: Stopped User Runtime Directory /run/user/42436.
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: Removed slice User Slice of UID 42436.
Feb 01 09:52:43 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/67460668' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:43 np0005604215.localdomain podman[304609]: 2026-02-01 09:52:43.622740158 +0000 UTC m=+0.082462781 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:52:43 np0005604215.localdomain podman[304610]: 2026-02-01 09:52:43.685316038 +0000 UTC m=+0.139982045 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:52:43 np0005604215.localdomain podman[304609]: 2026-02-01 09:52:43.688808087 +0000 UTC m=+0.148530701 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:52:43 np0005604215.localdomain podman[304610]: 2026-02-01 09:52:43.697731345 +0000 UTC m=+0.152397352 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:52:43 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:52:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v112: 177 pgs: 177 active+clean; 224 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 98 KiB/s rd, 29 KiB/s wr, 140 op/s
Feb 01 09:52:44 np0005604215.localdomain ceph-mon[298604]: pgmap v112: 177 pgs: 177 active+clean; 224 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 98 KiB/s rd, 29 KiB/s wr, 140 op/s
Feb 01 09:52:44 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:52:44.876 2 INFO neutron.agent.securitygroups_rpc [None req-651e6fa7-c546-4929-9315-764ba9e33bc3 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Security group member updated ['f05aaf36-904c-44ae-a203-34e61744db7d']
Feb 01 09:52:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e98 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:45.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:52:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:45.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:52:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:45.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:52:45 np0005604215.localdomain podman[304673]: 2026-02-01 09:52:45.108937495 +0000 UTC m=+0.033278059 container kill 23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9c0246b2-3507-4017-b8dd-01251187a6c3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 01 09:52:45 np0005604215.localdomain dnsmasq[303619]: read /var/lib/neutron/dhcp/9c0246b2-3507-4017-b8dd-01251187a6c3/addn_hosts - 0 addresses
Feb 01 09:52:45 np0005604215.localdomain dnsmasq-dhcp[303619]: read /var/lib/neutron/dhcp/9c0246b2-3507-4017-b8dd-01251187a6c3/host
Feb 01 09:52:45 np0005604215.localdomain dnsmasq-dhcp[303619]: read /var/lib/neutron/dhcp/9c0246b2-3507-4017-b8dd-01251187a6c3/opts
Feb 01 09:52:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:45.115 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:52:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:45.157 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:46 np0005604215.localdomain dnsmasq[303619]: exiting on receipt of SIGTERM
Feb 01 09:52:46 np0005604215.localdomain podman[304710]: 2026-02-01 09:52:46.064392487 +0000 UTC m=+0.065794792 container kill 23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9c0246b2-3507-4017-b8dd-01251187a6c3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:52:46 np0005604215.localdomain systemd[1]: libpod-23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0.scope: Deactivated successfully.
Feb 01 09:52:46 np0005604215.localdomain podman[304726]: 2026-02-01 09:52:46.128369262 +0000 UTC m=+0.046056228 container died 23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9c0246b2-3507-4017-b8dd-01251187a6c3, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:52:46 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3768220585' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:52:46 np0005604215.localdomain systemd[1]: tmp-crun.oCeIkm.mount: Deactivated successfully.
Feb 01 09:52:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v113: 177 pgs: 177 active+clean; 224 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 76 KiB/s rd, 23 KiB/s wr, 109 op/s
Feb 01 09:52:46 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0-userdata-shm.mount: Deactivated successfully.
Feb 01 09:52:46 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-53daeb5ecd744789a19f463b75866691ebb76af9c46948b934b77d0920c93713-merged.mount: Deactivated successfully.
Feb 01 09:52:46 np0005604215.localdomain podman[304726]: 2026-02-01 09:52:46.176690518 +0000 UTC m=+0.094377464 container remove 23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9c0246b2-3507-4017-b8dd-01251187a6c3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 01 09:52:46 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:46Z|00069|binding|INFO|Releasing lport c8e9dce8-3cef-4d4b-8d3c-5d13d0890663 from this chassis (sb_readonly=0)
Feb 01 09:52:46 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:46Z|00070|binding|INFO|Setting lport c8e9dce8-3cef-4d4b-8d3c-5d13d0890663 down in Southbound
Feb 01 09:52:46 np0005604215.localdomain kernel: device tapc8e9dce8-3c left promiscuous mode
Feb 01 09:52:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:46.185 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:46 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:46.202 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '19.80.0.3/24', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-9c0246b2-3507-4017-b8dd-01251187a6c3', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9c0246b2-3507-4017-b8dd-01251187a6c3', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ebe5e345d591408fa955b2e811bfaffb', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=722e7a10-7816-489f-9516-bc350daf9fce, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=c8e9dce8-3cef-4d4b-8d3c-5d13d0890663) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:46 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:46.203 158655 INFO neutron.agent.ovn.metadata.agent [-] Port c8e9dce8-3cef-4d4b-8d3c-5d13d0890663 in datapath 9c0246b2-3507-4017-b8dd-01251187a6c3 unbound from our chassis
Feb 01 09:52:46 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:46.206 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9c0246b2-3507-4017-b8dd-01251187a6c3, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:52:46 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:46.207 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ef1acf6a-cc22-4993-afaa-c6d6faa318fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:46.211 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:46 np0005604215.localdomain systemd[1]: libpod-conmon-23f4520b0a973c50c64041e4f473fe8222e37918b934ba053a5f01d9377a58c0.scope: Deactivated successfully.
Feb 01 09:52:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:46.258 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e99 e99: 6 total, 6 up, 6 in
Feb 01 09:52:46 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:46.464 259225 INFO neutron.agent.dhcp.agent [None req-5b53c1c1-32b1-4ba9-9916-e803cb5df1f1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:52:46 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:46.465 259225 INFO neutron.agent.dhcp.agent [None req-5b53c1c1-32b1-4ba9-9916-e803cb5df1f1 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:52:46 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:46.529 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:52:46 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:52:46.683 2 INFO neutron.agent.securitygroups_rpc [None req-2d42c471-7b15-4400-a993-fcf3849484f7 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Security group member updated ['3c3daae5-f0f3-42a8-b893-8c534dcb0055']
Feb 01 09:52:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:46.742 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:47.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:52:47 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d9c0246b2\x2d3507\x2d4017\x2db8dd\x2d01251187a6c3.mount: Deactivated successfully.
Feb 01 09:52:47 np0005604215.localdomain ceph-mon[298604]: pgmap v113: 177 pgs: 177 active+clean; 224 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 76 KiB/s rd, 23 KiB/s wr, 109 op/s
Feb 01 09:52:47 np0005604215.localdomain ceph-mon[298604]: osdmap e99: 6 total, 6 up, 6 in
Feb 01 09:52:47 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2797850007' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:52:47 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:52:47.421 2 INFO neutron.agent.securitygroups_rpc [None req-0fc1e61c-d2d8-4451-b527-5803b4fad28d 336655b6a22d4371b0a5cd24b959dc9a ebe5e345d591408fa955b2e811bfaffb - - default default] Security group member updated ['f05aaf36-904c-44ae-a203-34e61744db7d']
Feb 01 09:52:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:48.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:52:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v115: 177 pgs: 177 active+clean; 304 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 5.8 MiB/s wr, 231 op/s
Feb 01 09:52:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e100 e100: 6 total, 6 up, 6 in
Feb 01 09:52:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:52:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:52:48 np0005604215.localdomain podman[304754]: 2026-02-01 09:52:48.872980403 +0000 UTC m=+0.082955806 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, managed_by=edpm_ansible, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, 
config_id=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, release=1769056855, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git)
Feb 01 09:52:48 np0005604215.localdomain podman[304754]: 2026-02-01 09:52:48.889120446 +0000 UTC m=+0.099095859 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, 
name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:52:48 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:52:48 np0005604215.localdomain podman[304755]: 2026-02-01 09:52:48.976390836 +0000 UTC m=+0.184451250 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 01 09:52:49 np0005604215.localdomain podman[304755]: 2026-02-01 09:52:49.012615646 +0000 UTC m=+0.220676100 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:52:49 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:52:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:49.097 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:52:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:49.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:52:49 np0005604215.localdomain ceph-mon[298604]: pgmap v115: 177 pgs: 177 active+clean; 304 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 5.8 MiB/s wr, 231 op/s
Feb 01 09:52:49 np0005604215.localdomain ceph-mon[298604]: osdmap e100: 6 total, 6 up, 6 in
Feb 01 09:52:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:49.877 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Acquiring lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:49.878 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:49.898 274321 DEBUG nova.compute.manager [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Feb 01 09:52:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:49.984 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:49.985 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:49.990 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 01 09:52:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:49.991 274321 INFO nova.compute.claims [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Claim successful on node np0005604215.localdomain
Feb 01 09:52:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.097 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.110 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.111 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.111 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.138 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v117: 177 pgs: 177 active+clean; 304 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 5.8 MiB/s wr, 231 op/s
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.160 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:50.472 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:50.474 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.499 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:52:50 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/641399154' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.553 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.455s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.559 274321 DEBUG nova.compute.provider_tree [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.581 274321 DEBUG nova.scheduler.client.report [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.602 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.618s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.603 274321 DEBUG nova.compute.manager [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.607 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.469s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.608 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.608 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.608 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.653 274321 DEBUG nova.compute.manager [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.654 274321 DEBUG nova.network.neutron [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.667 274321 INFO nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.684 274321 DEBUG nova.compute.manager [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.773 274321 DEBUG nova.compute.manager [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.777 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.778 274321 INFO nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Creating image(s)
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.820 274321 DEBUG nova.storage.rbd_utils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] rbd image aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.863 274321 DEBUG nova.storage.rbd_utils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] rbd image aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.903 274321 DEBUG nova.storage.rbd_utils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] rbd image aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.911 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Acquiring lock "f978c6f71b922ff24c45ca010751fdcbed665c95" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.912 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lock "f978c6f71b922ff24c45ca010751fdcbed665c95" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:50.967 274321 DEBUG nova.virt.libvirt.imagebackend [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Image locations are: [{'url': 'rbd://33fac0b9-80c7-560f-918a-c92d3021ca1e/images/a223c2d3-3df7-4d82-921c-31ace200d43c/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://33fac0b9-80c7-560f-918a-c92d3021ca1e/images/a223c2d3-3df7-4d82-921c-31ace200d43c/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Feb 01 09:52:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:52:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1672829836' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.051 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.443s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.137 274321 WARNING oslo_policy.policy [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.138 274321 WARNING oslo_policy.policy [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.144 274321 DEBUG nova.policy [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Feb 01 09:52:51 np0005604215.localdomain ceph-mon[298604]: pgmap v117: 177 pgs: 177 active+clean; 304 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 5.8 MiB/s wr, 231 op/s
Feb 01 09:52:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/641399154' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1672829836' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.261 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.290 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.292 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11745MB free_disk=41.70050811767578GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.293 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.293 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.355 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Instance aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.355 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.356 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=640MB phys_disk=41GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.418 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:52:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:52:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:52:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:52:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:52:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:52:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:52:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/750954759' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:52:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2325372919' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.853 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f978c6f71b922ff24c45ca010751fdcbed665c95.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.872 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.454s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.881 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.915 274321 ERROR nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] [req-9d1e416f-0fea-4fd5-b5be-e8ecaf324aa0] Failed to update inventory to [{'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID d5eeed9a-e4d0-4244-8d4e-39e5c8263590.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-9d1e416f-0fea-4fd5-b5be-e8ecaf324aa0"}]}
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.931 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f978c6f71b922ff24c45ca010751fdcbed665c95.part --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.932 274321 DEBUG nova.virt.images [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] a223c2d3-3df7-4d82-921c-31ace200d43c was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.934 274321 DEBUG nova.privsep.utils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.935 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f978c6f71b922ff24c45ca010751fdcbed665c95.part /var/lib/nova/instances/_base/f978c6f71b922ff24c45ca010751fdcbed665c95.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.951 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.972 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.973 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 0, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:52:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:51.991 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.030 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AVX2,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AMD_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.085 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.107 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/f978c6f71b922ff24c45ca010751fdcbed665c95.part /var/lib/nova/instances/_base/f978c6f71b922ff24c45ca010751fdcbed665c95.converted" returned: 0 in 0.172s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.112 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f978c6f71b922ff24c45ca010751fdcbed665c95.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v118: 177 pgs: 177 active+clean; 304 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 5.8 MiB/s wr, 121 op/s
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.186 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/f978c6f71b922ff24c45ca010751fdcbed665c95.converted --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.188 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lock "f978c6f71b922ff24c45ca010751fdcbed665c95" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.276s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.221 274321 DEBUG nova.storage.rbd_utils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] rbd image aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.226 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/_base/f978c6f71b922ff24c45ca010751fdcbed665c95 aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/750954759' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2325372919' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:52 np0005604215.localdomain ceph-mon[298604]: pgmap v118: 177 pgs: 177 active+clean; 304 MiB data, 1007 MiB used, 41 GiB / 42 GiB avail; 5.9 MiB/s rd, 5.8 MiB/s wr, 121 op/s
Feb 01 09:52:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2480888757' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:52:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2846492519' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.553 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.559 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.617 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updated inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with generation 8 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 15738, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0, 'reserved': 0}, 'DISK_GB': {'total': 41, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.617 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 generation from 8 to 9 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.618 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.643 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.643 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.350s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.724 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/_base/f978c6f71b922ff24c45ca010751fdcbed665c95 aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.498s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.831 274321 DEBUG nova.storage.rbd_utils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] resizing rbd image aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk to 1073741824 resize /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:288
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.980 274321 DEBUG nova.objects.instance [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lazy-loading 'migration_context' on Instance uuid aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.998 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 01 09:52:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.999 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Ensure instance console log exists: /var/lib/nova/instances/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:52.999 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.000 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.000 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:53.224 259225 INFO neutron.agent.linux.ip_lib [None req-32273a97-523d-4b8f-b365-237d9402098e - - - - - -] Device tap7ad39b92-32 cannot be used as it has no MAC address
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.227 274321 DEBUG nova.network.neutron [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Successfully updated port: 3c861704-c594-42f8-a5b3-a274ec84650f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.244 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Acquiring lock "refresh_cache-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.244 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Acquired lock "refresh_cache-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.245 274321 DEBUG nova.network.neutron [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.246 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:53 np0005604215.localdomain kernel: device tap7ad39b92-32 entered promiscuous mode
Feb 01 09:52:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:53Z|00071|binding|INFO|Claiming lport 7ad39b92-32e7-4263-9dbc-bbb8eeb03c9f for this chassis.
Feb 01 09:52:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:53Z|00072|binding|INFO|7ad39b92-32e7-4263-9dbc-bbb8eeb03c9f: Claiming unknown
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.251 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:53 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939573.2524] manager: (tap7ad39b92-32): new Generic device (/org/freedesktop/NetworkManager/Devices/19)
Feb 01 09:52:53 np0005604215.localdomain systemd-udevd[305072]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:52:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:53.260 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-fdca6946-14e8-4692-9d79-41002e703846', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fdca6946-14e8-4692-9d79-41002e703846', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41697a815dfa4c5aaae37b529f6303e1', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=013c5d80-ec0c-4f6b-91c1-a2283198de95, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=7ad39b92-32e7-4263-9dbc-bbb8eeb03c9f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:53.262 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad39b92-32e7-4263-9dbc-bbb8eeb03c9f in datapath fdca6946-14e8-4692-9d79-41002e703846 bound to our chassis
Feb 01 09:52:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:53.264 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fdca6946-14e8-4692-9d79-41002e703846 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:52:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:53.265 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[7be5b6a9-dae6-4f93-a6cc-b187122fd5f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:53Z|00073|binding|INFO|Setting lport 7ad39b92-32e7-4263-9dbc-bbb8eeb03c9f ovn-installed in OVS
Feb 01 09:52:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:53Z|00074|binding|INFO|Setting lport 7ad39b92-32e7-4263-9dbc-bbb8eeb03c9f up in Southbound
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.271 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap7ad39b92-32: No such device
Feb 01 09:52:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap7ad39b92-32: No such device
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.288 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap7ad39b92-32: No such device
Feb 01 09:52:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap7ad39b92-32: No such device
Feb 01 09:52:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap7ad39b92-32: No such device
Feb 01 09:52:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap7ad39b92-32: No such device
Feb 01 09:52:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap7ad39b92-32: No such device
Feb 01 09:52:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap7ad39b92-32: No such device
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.304 274321 DEBUG nova.compute.manager [req-19a75de5-1935-487b-96d7-ffbfaccf042b req-785ea1dd-a71c-4232-b521-7c041566ee25 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-changed-3c861704-c594-42f8-a5b3-a274ec84650f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.304 274321 DEBUG nova.compute.manager [req-19a75de5-1935-487b-96d7-ffbfaccf042b req-785ea1dd-a71c-4232-b521-7c041566ee25 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Refreshing instance network info cache due to event network-changed-3c861704-c594-42f8-a5b3-a274ec84650f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.305 274321 DEBUG oslo_concurrency.lockutils [req-19a75de5-1935-487b-96d7-ffbfaccf042b req-785ea1dd-a71c-4232-b521-7c041566ee25 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "refresh_cache-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.314 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.336 274321 DEBUG nova.network.neutron [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.342 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2846492519' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.634 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:52:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:53.636 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:52:54 np0005604215.localdomain podman[305143]: 2026-02-01 09:52:54.10627568 +0000 UTC m=+0.081447490 container create a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca6946-14e8-4692-9d79-41002e703846, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.111 274321 DEBUG nova.network.neutron [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Updating instance_info_cache with network_info: [{"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.128 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Releasing lock "refresh_cache-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.128 274321 DEBUG nova.compute.manager [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Instance network_info: |[{"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.129 274321 DEBUG oslo_concurrency.lockutils [req-19a75de5-1935-487b-96d7-ffbfaccf042b req-785ea1dd-a71c-4232-b521-7c041566ee25 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquired lock "refresh_cache-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.129 274321 DEBUG nova.network.neutron [req-19a75de5-1935-487b-96d7-ffbfaccf042b req-785ea1dd-a71c-4232-b521-7c041566ee25 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Refreshing network info cache for port 3c861704-c594-42f8-a5b3-a274ec84650f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.135 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Start _get_guest_xml network_info=[{"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-01T09:50:54Z,direct_url=<?>,disk_format='qcow2',id=a223c2d3-3df7-4d82-921c-31ace200d43c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='79df39cba1c14309b68e8b61518619fd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-01T09:50:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'image_id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.139 274321 WARNING nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:52:54 np0005604215.localdomain systemd[1]: Started libpod-conmon-a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551.scope.
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.147 274321 DEBUG nova.virt.libvirt.host [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Searching host: 'np0005604215.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.148 274321 DEBUG nova.virt.libvirt.host [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.150 274321 DEBUG nova.virt.libvirt.host [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Searching host: 'np0005604215.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.151 274321 DEBUG nova.virt.libvirt.host [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.152 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.153 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-01T09:50:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='04b6d75f-0335-413a-b9d6-dfe49d77feaf',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2026-02-01T09:50:54Z,direct_url=<?>,disk_format='qcow2',id=a223c2d3-3df7-4d82-921c-31ace200d43c,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='79df39cba1c14309b68e8b61518619fd',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2026-02-01T09:50:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 01 09:52:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v119: 177 pgs: 177 active+clean; 225 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 11 MiB/s rd, 5.8 MiB/s wr, 274 op/s
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.153 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.154 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.154 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.155 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.156 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.156 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.157 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.158 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 01 09:52:54 np0005604215.localdomain systemd[1]: tmp-crun.aDVGmy.mount: Deactivated successfully.
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.158 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.158 274321 DEBUG nova.virt.hardware [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 01 09:52:54 np0005604215.localdomain podman[305143]: 2026-02-01 09:52:54.064376194 +0000 UTC m=+0.039548004 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.164 274321 DEBUG nova.privsep.utils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.165 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:54 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:52:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68b1613e4d327631643e09ac0edff96b26245deaec5c43c3419a3ce4c98fd9cd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:52:54 np0005604215.localdomain podman[305143]: 2026-02-01 09:52:54.186904214 +0000 UTC m=+0.162075994 container init a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca6946-14e8-4692-9d79-41002e703846, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:52:54 np0005604215.localdomain podman[305143]: 2026-02-01 09:52:54.197192835 +0000 UTC m=+0.172364635 container start a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca6946-14e8-4692-9d79-41002e703846, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 01 09:52:54 np0005604215.localdomain dnsmasq[305163]: started, version 2.85 cachesize 150
Feb 01 09:52:54 np0005604215.localdomain dnsmasq[305163]: DNS service limited to local subnets
Feb 01 09:52:54 np0005604215.localdomain dnsmasq[305163]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:52:54 np0005604215.localdomain dnsmasq[305163]: warning: no upstream servers configured
Feb 01 09:52:54 np0005604215.localdomain dnsmasq-dhcp[305163]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:52:54 np0005604215.localdomain dnsmasq[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/addn_hosts - 0 addresses
Feb 01 09:52:54 np0005604215.localdomain dnsmasq-dhcp[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/host
Feb 01 09:52:54 np0005604215.localdomain dnsmasq-dhcp[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/opts
Feb 01 09:52:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:54.389 259225 INFO neutron.agent.dhcp.agent [None req-68b7025a-8cf1-4e5e-9031-f4a37aba59a8 - - - - - -] DHCP configuration for ports {'0b5f5605-6b74-496a-a8dc-57cf160bde76'} is completed
Feb 01 09:52:54 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:54.476 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2238660147' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:54 np0005604215.localdomain ceph-mon[298604]: pgmap v119: 177 pgs: 177 active+clean; 225 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 11 MiB/s rd, 5.8 MiB/s wr, 274 op/s
Feb 01 09:52:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 01 09:52:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/776839631' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.569 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.404s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.603 274321 DEBUG nova.storage.rbd_utils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] rbd image aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:52:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:54.608 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 01 09:52:55 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2689872368' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.058 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.451s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.061 274321 DEBUG nova.virt.libvirt.vif [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-01T09:52:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1216472824',display_name='tempest-LiveMigrationTest-server-1216472824',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005604215.localdomain',hostname='tempest-livemigrationtest-server-1216472824',id=8,image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005604215.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005604215.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8e4b0fb12f14fbaa248291aa43aacee',ramdisk_id='',reservation_id='r-w7wsdj02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-266774784',owner_user_name='tempest-LiveMigrationTest-266774784-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-01T09:52:50Z,user_data=None,user_id='0416f10a8d4f4da2a6dc6cbd271a3010',uuid=aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.061 274321 DEBUG nova.network.os_vif_util [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Converting VIF {"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.062 274321 DEBUG nova.network.os_vif_util [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:5a:4a,bridge_name='br-int',has_traffic_filtering=True,id=3c861704-c594-42f8-a5b3-a274ec84650f,network=Network(9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3c861704-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.065 274321 DEBUG nova.objects.instance [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lazy-loading 'pci_devices' on Instance uuid aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.085 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] End _get_guest_xml xml=<domain type="kvm">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   <uuid>aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469</uuid>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   <name>instance-00000008</name>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   <memory>131072</memory>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   <vcpu>1</vcpu>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   <metadata>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <nova:name>tempest-LiveMigrationTest-server-1216472824</nova:name>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <nova:creationTime>2026-02-01 09:52:54</nova:creationTime>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <nova:flavor name="m1.nano">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <nova:memory>128</nova:memory>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <nova:disk>1</nova:disk>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <nova:swap>0</nova:swap>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <nova:ephemeral>0</nova:ephemeral>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <nova:vcpus>1</nova:vcpus>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       </nova:flavor>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <nova:owner>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <nova:user uuid="0416f10a8d4f4da2a6dc6cbd271a3010">tempest-LiveMigrationTest-266774784-project-member</nova:user>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <nova:project uuid="d8e4b0fb12f14fbaa248291aa43aacee">tempest-LiveMigrationTest-266774784</nova:project>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       </nova:owner>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <nova:root type="image" uuid="a223c2d3-3df7-4d82-921c-31ace200d43c"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <nova:ports>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <nova:port uuid="3c861704-c594-42f8-a5b3-a274ec84650f">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:           <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         </nova:port>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       </nova:ports>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     </nova:instance>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   </metadata>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   <sysinfo type="smbios">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <system>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <entry name="manufacturer">RDO</entry>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <entry name="product">OpenStack Compute</entry>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <entry name="serial">aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469</entry>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <entry name="uuid">aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469</entry>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <entry name="family">Virtual Machine</entry>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     </system>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   </sysinfo>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   <os>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <boot dev="hd"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <smbios mode="sysinfo"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   </os>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   <features>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <acpi/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <apic/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <vmcoreinfo/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   </features>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   <clock offset="utc">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <timer name="pit" tickpolicy="delay"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <timer name="hpet" present="no"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   </clock>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   <cpu mode="host-model" match="exact">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <topology sockets="1" cores="1" threads="1"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   </cpu>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   <devices>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <disk type="network" device="disk">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <driver type="raw" cache="none"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <source protocol="rbd" name="vms/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.103" port="6789"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.104" port="6789"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.105" port="6789"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       </source>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <auth username="openstack">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <secret type="ceph" uuid="33fac0b9-80c7-560f-918a-c92d3021ca1e"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       </auth>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <target dev="vda" bus="virtio"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     </disk>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <disk type="network" device="cdrom">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <driver type="raw" cache="none"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <source protocol="rbd" name="vms/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk.config">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.103" port="6789"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.104" port="6789"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.105" port="6789"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       </source>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <auth username="openstack">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:         <secret type="ceph" uuid="33fac0b9-80c7-560f-918a-c92d3021ca1e"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       </auth>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <target dev="sda" bus="sata"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     </disk>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <interface type="ethernet">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <mac address="fa:16:3e:c4:5a:4a"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <model type="virtio"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <driver name="vhost" rx_queue_size="512"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <mtu size="1442"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <target dev="tap3c861704-c5"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     </interface>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <serial type="pty">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <log file="/var/lib/nova/instances/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/console.log" append="off"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     </serial>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <video>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <model type="virtio"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     </video>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <input type="tablet" bus="usb"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <rng model="virtio">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <backend model="random">/dev/urandom</backend>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     </rng>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <controller type="usb" index="0"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     <memballoon model="virtio">
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:       <stats period="10"/>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:     </memballoon>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:   </devices>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: </domain>
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.087 274321 DEBUG nova.compute.manager [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Preparing to wait for external event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.087 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Acquiring lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.088 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.088 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e100 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.089 274321 DEBUG nova.virt.libvirt.vif [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2026-02-01T09:52:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1216472824',display_name='tempest-LiveMigrationTest-server-1216472824',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005604215.localdomain',hostname='tempest-livemigrationtest-server-1216472824',id=8,image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='np0005604215.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005604215.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='d8e4b0fb12f14fbaa248291aa43aacee',ramdisk_id='',reservation_id='r-w7wsdj02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-LiveMigrationTest-266774784',owner_user_name='tempest-LiveMigrationTest-266774784-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2026-02-01T09:52:50Z,user_data=None,user_id='0416f10a8d4f4da2a6dc6cbd271a3010',uuid=aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.090 274321 DEBUG nova.network.os_vif_util [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Converting VIF {"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.090 274321 DEBUG nova.network.os_vif_util [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:5a:4a,bridge_name='br-int',has_traffic_filtering=True,id=3c861704-c594-42f8-a5b3-a274ec84650f,network=Network(9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3c861704-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.091 274321 DEBUG os_vif [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:5a:4a,bridge_name='br-int',has_traffic_filtering=True,id=3c861704-c594-42f8-a5b3-a274ec84650f,network=Network(9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3c861704-c5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.097 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.097 274321 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.098 274321 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.101 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.101 274321 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c861704-c5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.102 274321 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap3c861704-c5, col_values=(('external_ids', {'iface-id': '3c861704-c594-42f8-a5b3-a274ec84650f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:5a:4a', 'vm-uuid': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.129 274321 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769939560.128999, 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.129 274321 INFO nova.compute.manager [-] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] VM Stopped (Lifecycle Event)
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.145 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.148 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.150 274321 DEBUG nova.compute.manager [None req-b071a4b8-fd0b-42af-bb18-6c522ddefb5e - - - - - -] [instance: 5aefea54-941a-48bf-ad9e-7f13fdfdb4ed] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.151 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.152 274321 INFO os_vif [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:5a:4a,bridge_name='br-int',has_traffic_filtering=True,id=3c861704-c594-42f8-a5b3-a274ec84650f,network=Network(9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3c861704-c5')
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.291 274321 DEBUG nova.network.neutron [req-19a75de5-1935-487b-96d7-ffbfaccf042b req-785ea1dd-a71c-4232-b521-7c041566ee25 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Updated VIF entry in instance network info cache for port 3c861704-c594-42f8-a5b3-a274ec84650f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.292 274321 DEBUG nova.network.neutron [req-19a75de5-1935-487b-96d7-ffbfaccf042b req-785ea1dd-a71c-4232-b521-7c041566ee25 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Updating instance_info_cache with network_info: [{"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.297 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.297 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.297 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] No VIF found with MAC fa:16:3e:c4:5a:4a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.298 274321 INFO nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Using config drive
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.328 274321 DEBUG nova.storage.rbd_utils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] rbd image aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.334 274321 DEBUG oslo_concurrency.lockutils [req-19a75de5-1935-487b-96d7-ffbfaccf042b req-785ea1dd-a71c-4232-b521-7c041566ee25 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Releasing lock "refresh_cache-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:52:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/776839631' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:52:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3534695093' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:52:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2689872368' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.502 274321 INFO nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Creating config drive at /var/lib/nova/instances/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.config
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.509 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp55raenok execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.636 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmp55raenok" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.677 274321 DEBUG nova.storage.rbd_utils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] rbd image aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.682 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.config aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.905 274321 DEBUG oslo_concurrency.processutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.config aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.906 274321 INFO nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Deleting local config drive /var/lib/nova/instances/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.config because it was imported into RBD.
Feb 01 09:52:55 np0005604215.localdomain kernel: device tap3c861704-c5 entered promiscuous mode
Feb 01 09:52:55 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939575.9446] manager: (tap3c861704-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/20)
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.947 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.951 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:55Z|00075|binding|INFO|Claiming lport 3c861704-c594-42f8-a5b3-a274ec84650f for this chassis.
Feb 01 09:52:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:55Z|00076|binding|INFO|3c861704-c594-42f8-a5b3-a274ec84650f: Claiming fa:16:3e:c4:5a:4a 10.100.0.12
Feb 01 09:52:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:55Z|00077|binding|INFO|Claiming lport 9adda630-e8be-4f28-9d6e-88decd53d5c0 for this chassis.
Feb 01 09:52:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:55Z|00078|binding|INFO|9adda630-e8be-4f28-9d6e-88decd53d5c0: Claiming fa:16:3e:87:8a:c3 19.80.0.117
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.954 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:55 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939575.9583] device (tap3c861704-c5): state change: unmanaged -> unavailable (reason 'connection-assumed', sys-iface-state: 'external')
Feb 01 09:52:55 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939575.9595] device (tap3c861704-c5): state change: unavailable -> disconnected (reason 'none', sys-iface-state: 'external')
Feb 01 09:52:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:52:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:55Z|00079|ovn_bfd|INFO|Enabled BFD on interface ovn-2186fb-0
Feb 01 09:52:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:55Z|00080|ovn_bfd|INFO|Enabled BFD on interface ovn-e1cc33-0
Feb 01 09:52:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:55Z|00081|ovn_bfd|INFO|Enabled BFD on interface ovn-45aa31-0
Feb 01 09:52:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:55.975 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:8a:c3 19.80.0.117'], port_security=['fa:16:3e:87:8a:c3 19.80.0.117'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['3c861704-c594-42f8-a5b3-a274ec84650f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-599288938', 'neutron:cidrs': '19.80.0.117/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f10af3d7-b861-4585-95de-68162ae73827', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-599288938', 'neutron:project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c3daae5-f0f3-42a8-b893-8c534dcb0055', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=13e91b2c-4ccc-47a7-a97e-5773902dea41, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=9adda630-e8be-4f28-9d6e-88decd53d5c0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:55.976 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:5a:4a 10.100.0.12'], port_security=['fa:16:3e:c4:5a:4a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1236294281', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1236294281', 'neutron:project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c3daae5-f0f3-42a8-b893-8c534dcb0055', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49493626-0ffa-4ff3-a83b-4e74511666de, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=3c861704-c594-42f8-a5b3-a274ec84650f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:52:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:55.977 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 9adda630-e8be-4f28-9d6e-88decd53d5c0 in datapath f10af3d7-b861-4585-95de-68162ae73827 bound to our chassis
Feb 01 09:52:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:55.979 158655 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network f10af3d7-b861-4585-95de-68162ae73827
Feb 01 09:52:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:55.979 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:55.988 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[c0d8eaf9-8bd2-4869-9bb2-d0116fb025e5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:55.989 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapf10af3d7-b1 in ovnmeta-f10af3d7-b861-4585-95de-68162ae73827 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 01 09:52:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:55.991 303130 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapf10af3d7-b0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 01 09:52:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:55.992 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ee3e87b7-f8ba-428c-af72-d12356bcdfe0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:55.992 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[d4a33982-fd12-4707-a604-5de8550540f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:55 np0005604215.localdomain systemd-machined[202466]: New machine qemu-2-instance-00000008.
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.000 158836 DEBUG oslo.privsep.daemon [-] privsep: reply[e19e21bb-336b-44c3-8a19-871286fa9ab1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.015 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:56 np0005604215.localdomain systemd[1]: Started Virtual Machine qemu-2-instance-00000008.
Feb 01 09:52:56 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:56Z|00082|binding|INFO|Setting lport 3c861704-c594-42f8-a5b3-a274ec84650f ovn-installed in OVS
Feb 01 09:52:56 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:56Z|00083|binding|INFO|Setting lport 3c861704-c594-42f8-a5b3-a274ec84650f up in Southbound
Feb 01 09:52:56 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:56Z|00084|binding|INFO|Setting lport 9adda630-e8be-4f28-9d6e-88decd53d5c0 up in Southbound
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.030 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.031 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[a74da3ff-5d86-4342-81fc-96b9bc64eaa5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.035 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.056 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[80378d7c-bd98-4e04-bb8e-2cf682007020]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.060 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[75ca2b2a-bd93-47f9-9fb0-568d411baebf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939576.0635] manager: (tapf10af3d7-b0): new Veth device (/org/freedesktop/NetworkManager/Devices/21)
Feb 01 09:52:56 np0005604215.localdomain podman[305295]: 2026-02-01 09:52:56.070199717 +0000 UTC m=+0.086380032 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.078 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:56 np0005604215.localdomain podman[305295]: 2026-02-01 09:52:56.082982596 +0000 UTC m=+0.099162921 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.092 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[340320f1-f27a-4767-a12b-226fcf9dfb17]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.094 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[d9094efb-0195-473c-afad-d748073e9496]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:52:56 np0005604215.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapf10af3d7-b1: link becomes ready
Feb 01 09:52:56 np0005604215.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tapf10af3d7-b0: link becomes ready
Feb 01 09:52:56 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939576.1145] device (tapf10af3d7-b0): carrier: link connected
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.118 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[674254f4-3113-44c3-9c42-2a9067c5e1f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.133 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[5f42c68c-e95a-48ac-8563-9ae08fc5ced0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf10af3d7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:7d:a7:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1166021, 'reachable_time': 43965, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305346, 'error': None, 'target': 'ovnmeta-f10af3d7-b861-4585-95de-68162ae73827', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.144 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[f0112b23-2185-4bf1-a773-9443b72e0f1c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7d:a738'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1166021, 'tstamp': 1166021}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 305351, 'error': None, 'target': 'ovnmeta-f10af3d7-b861-4585-95de-68162ae73827', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v120: 177 pgs: 177 active+clean; 225 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 9.3 MiB/s rd, 4.8 MiB/s wr, 226 op/s
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.162 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[5816c518-c589-4257-a0b2-d05a56520c25]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapf10af3d7-b1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:7d:a7:38'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 22], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1166021, 'reachable_time': 43965, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 305356, 'error': None, 'target': 'ovnmeta-f10af3d7-b861-4585-95de-68162ae73827', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.180 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[d0340481-3e5f-4320-97d2-77b679e3e137]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.227 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[84c00111-c16e-4750-aa61-295551990670]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.229 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf10af3d7-b0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.230 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.231 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf10af3d7-b0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.276 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:56 np0005604215.localdomain kernel: device tapf10af3d7-b0 entered promiscuous mode
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.283 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapf10af3d7-b0, col_values=(('external_ids', {'iface-id': '2795e61c-14bf-4981-8534-106e0ef1f6ea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.284 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:56 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:56Z|00085|binding|INFO|Releasing lport 2795e61c-14bf-4981-8534-106e0ef1f6ea from this chassis (sb_readonly=0)
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.286 158655 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/f10af3d7-b861-4585-95de-68162ae73827.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/f10af3d7-b861-4585-95de-68162ae73827.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.286 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.288 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ab15ac58-fe11-49fb-b0f1-a6ceaba9f12d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.289 158655 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: global
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     log         /dev/log local0 debug
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     log-tag     haproxy-metadata-proxy-f10af3d7-b861-4585-95de-68162ae73827
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     user        root
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     group       root
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     maxconn     1024
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     pidfile     /var/lib/neutron/external/pids/f10af3d7-b861-4585-95de-68162ae73827.pid.haproxy
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     daemon
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: defaults
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     log global
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     mode http
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     option httplog
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     option dontlognull
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     option http-server-close
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     option forwardfor
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     retries                 3
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout http-request    30s
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout connect         30s
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout client          32s
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout server          32s
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout http-keep-alive 30s
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: listen listener
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     bind 169.254.169.254:80
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     server metadata /var/lib/neutron/metadata_proxy
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:     http-request add-header X-OVN-Network-ID f10af3d7-b861-4585-95de-68162ae73827
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.292 158655 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-f10af3d7-b861-4585-95de-68162ae73827', 'env', 'PROCESS_TAG=haproxy-f10af3d7-b861-4585-95de-68162ae73827', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/f10af3d7-b861-4585-95de-68162ae73827.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.295 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.398 274321 DEBUG nova.virt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Emitting event <LifecycleEvent: 1769939576.3975906, aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.398 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] VM Started (Lifecycle Event)
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.406 274321 DEBUG nova.compute.manager [req-69da1a6f-bec4-4b06-bf43-a949b5378ef3 req-de015c3c-5117-41a8-98de-7ea5a251312b 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.407 274321 DEBUG oslo_concurrency.lockutils [req-69da1a6f-bec4-4b06-bf43-a949b5378ef3 req-de015c3c-5117-41a8-98de-7ea5a251312b 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.407 274321 DEBUG oslo_concurrency.lockutils [req-69da1a6f-bec4-4b06-bf43-a949b5378ef3 req-de015c3c-5117-41a8-98de-7ea5a251312b 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.408 274321 DEBUG oslo_concurrency.lockutils [req-69da1a6f-bec4-4b06-bf43-a949b5378ef3 req-de015c3c-5117-41a8-98de-7ea5a251312b 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.408 274321 DEBUG nova.compute.manager [req-69da1a6f-bec4-4b06-bf43-a949b5378ef3 req-de015c3c-5117-41a8-98de-7ea5a251312b 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Processing event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.409 274321 DEBUG nova.compute.manager [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.418 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.422 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.426 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 01 09:52:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e101 e101: 6 total, 6 up, 6 in
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.429 274321 INFO nova.virt.libvirt.driver [-] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Instance spawned successfully.
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.430 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.450 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.451 274321 DEBUG nova.virt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Emitting event <LifecycleEvent: 1769939576.4015412, aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.451 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] VM Paused (Lifecycle Event)
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.457 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.457 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.458 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.459 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.459 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.460 274321 DEBUG nova.virt.libvirt.driver [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.466 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.470 274321 DEBUG nova.virt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Emitting event <LifecycleEvent: 1769939576.4119885, aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.470 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] VM Resumed (Lifecycle Event)
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.496 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.499 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 01 09:52:56 np0005604215.localdomain ceph-mon[298604]: pgmap v120: 177 pgs: 177 active+clean; 225 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 9.3 MiB/s rd, 4.8 MiB/s wr, 226 op/s
Feb 01 09:52:56 np0005604215.localdomain ceph-mon[298604]: osdmap e101: 6 total, 6 up, 6 in
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.518 274321 INFO nova.compute.manager [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Took 5.74 seconds to spawn the instance on the hypervisor.
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.519 274321 DEBUG nova.compute.manager [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.527 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.587 274321 INFO nova.compute.manager [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Took 6.64 seconds to build instance.
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.615 274321 DEBUG oslo_concurrency.lockutils [None req-3f432c8d-f909-45bd-ad41-3a2b87f40070 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.737s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:56 np0005604215.localdomain podman[305424]: 
Feb 01 09:52:56 np0005604215.localdomain podman[305424]: 2026-02-01 09:52:56.700701832 +0000 UTC m=+0.089219993 container create 18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:52:56 np0005604215.localdomain systemd[1]: Started libpod-conmon-18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd.scope.
Feb 01 09:52:56 np0005604215.localdomain podman[305424]: 2026-02-01 09:52:56.659105835 +0000 UTC m=+0.047624066 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 01 09:52:56 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:52:56 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19e3833c7d8d2b2c1fabb013d6f217a0b7dde45ed475f41dc07e52f74eb93e56/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:52:56 np0005604215.localdomain podman[305424]: 2026-02-01 09:52:56.786041041 +0000 UTC m=+0.174559192 container init 18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:52:56 np0005604215.localdomain podman[305424]: 2026-02-01 09:52:56.796088375 +0000 UTC m=+0.184606526 container start 18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 01 09:52:56 np0005604215.localdomain neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827[305437]: [NOTICE]   (305441) : New worker (305443) forked
Feb 01 09:52:56 np0005604215.localdomain neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827[305437]: [NOTICE]   (305441) : Loading success.
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.855 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 3c861704-c594-42f8-a5b3-a274ec84650f in datapath 9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8 unbound from our chassis
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.858 158655 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.868 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[21855e73-97c1-466e-8b29-3409d726e1b9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.868 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap9acb9cb3-f1 in ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.871 303130 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap9acb9cb3-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.872 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[0ba19668-d160-4d96-9ad7-d5112e5f4bd2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.873 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[d2916b0a-7ed7-482a-ba0e-87246f3d6af3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.880 158836 DEBUG oslo.privsep.daemon [-] privsep: reply[8d6dd96e-0960-4aa9-9a71-68f777645630]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.893 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[629a1e60-85f7-4d6e-b2ce-15aba309f732]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:56Z|00086|binding|INFO|Releasing lport 2795e61c-14bf-4981-8534-106e0ef1f6ea from this chassis (sb_readonly=0)
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.899 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:56 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:56Z|00087|binding|INFO|Releasing lport 2795e61c-14bf-4981-8534-106e0ef1f6ea from this chassis (sb_readonly=0)
Feb 01 09:52:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:56.910 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.922 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[a8200184-0f60-4d7b-a43c-e188bff322ca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939576.9320] manager: (tap9acb9cb3-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.930 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[71451b2a-ff68-4260-aa2b-b991675f2389]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain systemd-udevd[305332]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.970 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[c0599f5c-f1ee-4633-8967-3df41800fa7a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:56 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:56.979 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[8cf0b51a-450b-4348-a0b2-c8b3f82dec73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:57 np0005604215.localdomain kernel: IPv6: ADDRCONF(NETDEV_CHANGE): tap9acb9cb3-f0: link becomes ready
Feb 01 09:52:57 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939577.0036] device (tap9acb9cb3-f0): carrier: link connected
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.009 304214 DEBUG oslo.privsep.daemon [-] privsep: reply[c6f27ee9-e3b6-4638-8915-2e93ffb3fba9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.029 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[1c42d1fa-610a-485d-a4c0-424f31530c1d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9acb9cb3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:3c:11:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1166110, 'reachable_time': 27192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305464, 'error': None, 'target': 'ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:57 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:57Z|00088|binding|INFO|Releasing lport 2795e61c-14bf-4981-8534-106e0ef1f6ea from this chassis (sb_readonly=0)
Feb 01 09:52:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:57.043 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.052 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ddeb8f83-394d-42c3-8d03-aa01db9990b0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3c:1150'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 1166110, 'tstamp': 1166110}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 305465, 'error': None, 'target': 'ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.073 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[301ffce0-aa8c-4cec-92b9-bc237d244f78]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap9acb9cb3-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', 'fa:16:3e:3c:11:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', 
{'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 23], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1166110, 'reachable_time': 27192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 
'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1400, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 305466, 'error': None, 'target': 'ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.103 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[bd3d3c33-e78a-45d9-8e13-89e8b9ed11e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.165 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[736bf7d0-c6dc-4460-8fd5-446be0007abb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.167 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9acb9cb3-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.167 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.168 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9acb9cb3-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:57 np0005604215.localdomain kernel: device tap9acb9cb3-f0 entered promiscuous mode
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.177 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap9acb9cb3-f0, col_values=(('external_ids', {'iface-id': '82d12955-5666-45d9-bcd4-64e768a2aca1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:52:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:57.170 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:57 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:57Z|00089|binding|INFO|Releasing lport 82d12955-5666-45d9-bcd4-64e768a2aca1 from this chassis (sb_readonly=0)
Feb 01 09:52:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:57.193 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.194 158655 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.196 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[dd438c51-97c3-4071-bfd6-17e67ff0da3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.197 158655 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: global
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     log         /dev/log local0 debug
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     log-tag     haproxy-metadata-proxy-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     user        root
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     group       root
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     maxconn     1024
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     pidfile     /var/lib/neutron/external/pids/9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8.pid.haproxy
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     daemon
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: defaults
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     log global
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     mode http
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     option httplog
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     option dontlognull
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     option http-server-close
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     option forwardfor
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     retries                 3
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout http-request    30s
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout connect         30s
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout client          32s
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout server          32s
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     timeout http-keep-alive 30s
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: listen listener
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     bind 169.254.169.254:80
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     server metadata /var/lib/neutron/metadata_proxy
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:     http-request add-header X-OVN-Network-ID 9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]:  create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Feb 01 09:52:57 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:52:57.199 158655 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8', 'env', 'PROCESS_TAG=haproxy-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Feb 01 09:52:57 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:57.343 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:57Z, description=, device_id=a5140f30-05dc-4871-8e32-f21b0cfb774b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323186d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032318f40>], id=1f774af9-27f9-4f7f-a2be-1d66f28cfc73, ip_allocation=immediate, mac_address=fa:16:3e:c2:d0:66, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:52:50Z, description=, dns_domain=, id=fdca6946-14e8-4692-9d79-41002e703846, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1230844200-network, port_security_enabled=True, project_id=41697a815dfa4c5aaae37b529f6303e1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7344, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=598, status=ACTIVE, subnets=['2ecf4d67-5d58-4ddd-8fc7-11233acff6bf'], tags=[], tenant_id=41697a815dfa4c5aaae37b529f6303e1, updated_at=2026-02-01T09:52:52Z, vlan_transparent=None, network_id=fdca6946-14e8-4692-9d79-41002e703846, port_security_enabled=False, project_id=41697a815dfa4c5aaae37b529f6303e1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=630, status=DOWN, tags=[], tenant_id=41697a815dfa4c5aaae37b529f6303e1, updated_at=2026-02-01T09:52:57Z on network fdca6946-14e8-4692-9d79-41002e703846
Feb 01 09:52:57 np0005604215.localdomain dnsmasq[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/addn_hosts - 1 addresses
Feb 01 09:52:57 np0005604215.localdomain dnsmasq-dhcp[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/host
Feb 01 09:52:57 np0005604215.localdomain dnsmasq-dhcp[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/opts
Feb 01 09:52:57 np0005604215.localdomain podman[305506]: 2026-02-01 09:52:57.533130409 +0000 UTC m=+0.060608381 container kill a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca6946-14e8-4692-9d79-41002e703846, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 01 09:52:57 np0005604215.localdomain podman[305527]: 
Feb 01 09:52:57 np0005604215.localdomain podman[305527]: 2026-02-01 09:52:57.600393476 +0000 UTC m=+0.075372821 container create 20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 01 09:52:57 np0005604215.localdomain systemd[1]: Started libpod-conmon-20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac.scope.
Feb 01 09:52:57 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:52:57 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/39afd0bb396a392dfd50d36fe6caf2b1c9a1e9797d65ee8ff3803b1095d1a5f1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:52:57 np0005604215.localdomain podman[305527]: 2026-02-01 09:52:57.662496381 +0000 UTC m=+0.137475746 container init 20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:52:57 np0005604215.localdomain podman[305527]: 2026-02-01 09:52:57.566727006 +0000 UTC m=+0.041706361 image pull  quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Feb 01 09:52:57 np0005604215.localdomain podman[305527]: 2026-02-01 09:52:57.673186665 +0000 UTC m=+0.148166010 container start 20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:52:57 np0005604215.localdomain neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8[305550]: [NOTICE]   (305555) : New worker (305557) forked
Feb 01 09:52:57 np0005604215.localdomain neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8[305550]: [NOTICE]   (305555) : Loading success.
Feb 01 09:52:57 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:57.746 259225 INFO neutron.agent.dhcp.agent [None req-64c0cadf-4783-4e2c-97ad-47f8c4c4f09c - - - - - -] DHCP configuration for ports {'1f774af9-27f9-4f7f-a2be-1d66f28cfc73'} is completed
Feb 01 09:52:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v122: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 2.2 MiB/s wr, 191 op/s
Feb 01 09:52:58 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:58.194 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:52:57Z, description=, device_id=a5140f30-05dc-4871-8e32-f21b0cfb774b, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032366a30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323663a0>], id=1f774af9-27f9-4f7f-a2be-1d66f28cfc73, ip_allocation=immediate, mac_address=fa:16:3e:c2:d0:66, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:52:50Z, description=, dns_domain=, id=fdca6946-14e8-4692-9d79-41002e703846, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ImagesNegativeTestJSON-1230844200-network, port_security_enabled=True, project_id=41697a815dfa4c5aaae37b529f6303e1, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7344, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=598, status=ACTIVE, subnets=['2ecf4d67-5d58-4ddd-8fc7-11233acff6bf'], tags=[], tenant_id=41697a815dfa4c5aaae37b529f6303e1, updated_at=2026-02-01T09:52:52Z, vlan_transparent=None, network_id=fdca6946-14e8-4692-9d79-41002e703846, port_security_enabled=False, project_id=41697a815dfa4c5aaae37b529f6303e1, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=630, status=DOWN, tags=[], tenant_id=41697a815dfa4c5aaae37b529f6303e1, updated_at=2026-02-01T09:52:57Z on network fdca6946-14e8-4692-9d79-41002e703846
Feb 01 09:52:58 np0005604215.localdomain ceph-mon[298604]: pgmap v122: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 2.2 MiB/s wr, 191 op/s
Feb 01 09:52:58 np0005604215.localdomain dnsmasq[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/addn_hosts - 1 addresses
Feb 01 09:52:58 np0005604215.localdomain dnsmasq-dhcp[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/host
Feb 01 09:52:58 np0005604215.localdomain podman[305582]: 2026-02-01 09:52:58.403084476 +0000 UTC m=+0.061622842 container kill a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca6946-14e8-4692-9d79-41002e703846, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:52:58 np0005604215.localdomain dnsmasq-dhcp[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/opts
Feb 01 09:52:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:58.466 274321 DEBUG nova.compute.manager [req-e6a913f1-11f9-4c84-b43e-7d6da1f11800 req-5ce59bf2-6e97-481c-b3a9-de5f6a93d356 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:52:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:58.467 274321 DEBUG oslo_concurrency.lockutils [req-e6a913f1-11f9-4c84-b43e-7d6da1f11800 req-5ce59bf2-6e97-481c-b3a9-de5f6a93d356 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:52:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:58.468 274321 DEBUG oslo_concurrency.lockutils [req-e6a913f1-11f9-4c84-b43e-7d6da1f11800 req-5ce59bf2-6e97-481c-b3a9-de5f6a93d356 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:52:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:58.468 274321 DEBUG oslo_concurrency.lockutils [req-e6a913f1-11f9-4c84-b43e-7d6da1f11800 req-5ce59bf2-6e97-481c-b3a9-de5f6a93d356 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:52:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:58.468 274321 DEBUG nova.compute.manager [req-e6a913f1-11f9-4c84-b43e-7d6da1f11800 req-5ce59bf2-6e97-481c-b3a9-de5f6a93d356 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] No waiting events found dispatching network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 01 09:52:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:58.469 274321 WARNING nova.compute.manager [req-e6a913f1-11f9-4c84-b43e-7d6da1f11800 req-5ce59bf2-6e97-481c-b3a9-de5f6a93d356 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received unexpected event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f for instance with vm_state active and task_state None.
Feb 01 09:52:58 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:58Z|00090|binding|INFO|Releasing lport 2795e61c-14bf-4981-8534-106e0ef1f6ea from this chassis (sb_readonly=0)
Feb 01 09:52:58 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:52:58Z|00091|binding|INFO|Releasing lport 82d12955-5666-45d9-bcd4-64e768a2aca1 from this chassis (sb_readonly=0)
Feb 01 09:52:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:58.578 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:58 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:52:58.632 259225 INFO neutron.agent.dhcp.agent [None req-13614b67-4779-40c7-8e30-40a340638019 - - - - - -] DHCP configuration for ports {'1f774af9-27f9-4f7f-a2be-1d66f28cfc73'} is completed
Feb 01 09:52:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:59.414 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:52:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:59.755 274321 DEBUG nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Check if temp file /var/lib/nova/instances/tmp58hd61t0 exists to indicate shared storage is being used for migration. Exists? False _check_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10065
Feb 01 09:52:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:52:59.755 274321 DEBUG nova.compute.manager [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] source check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp58hd61t0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_source /usr/lib/python3.9/site-packages/nova/compute/manager.py:8587
Feb 01 09:53:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:53:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:53:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:53:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159554 "" "Go-http-client/1.1"
Feb 01 09:53:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:53:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19704 "" "Go-http-client/1.1"
Feb 01 09:53:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e101 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v123: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 2.1 MiB/s wr, 191 op/s
Feb 01 09:53:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:00.170 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:00 np0005604215.localdomain ceph-mon[298604]: pgmap v123: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 2.1 MiB/s wr, 191 op/s
Feb 01 09:53:01 np0005604215.localdomain dnsmasq[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/addn_hosts - 0 addresses
Feb 01 09:53:01 np0005604215.localdomain dnsmasq-dhcp[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/host
Feb 01 09:53:01 np0005604215.localdomain dnsmasq-dhcp[305163]: read /var/lib/neutron/dhcp/fdca6946-14e8-4692-9d79-41002e703846/opts
Feb 01 09:53:01 np0005604215.localdomain podman[305621]: 2026-02-01 09:53:01.164459721 +0000 UTC m=+0.055448780 container kill a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca6946-14e8-4692-9d79-41002e703846, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:53:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:53:01 np0005604215.localdomain systemd[1]: tmp-crun.691ndm.mount: Deactivated successfully.
Feb 01 09:53:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:01.308 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:01 np0005604215.localdomain podman[305634]: 2026-02-01 09:53:01.312194226 +0000 UTC m=+0.126941678 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:53:01 np0005604215.localdomain podman[305634]: 2026-02-01 09:53:01.318632987 +0000 UTC m=+0.133380429 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:53:01 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:53:01 np0005604215.localdomain kernel: device tap7ad39b92-32 left promiscuous mode
Feb 01 09:53:01 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:01Z|00092|binding|INFO|Releasing lport 7ad39b92-32e7-4263-9dbc-bbb8eeb03c9f from this chassis (sb_readonly=0)
Feb 01 09:53:01 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:01Z|00093|binding|INFO|Setting lport 7ad39b92-32e7-4263-9dbc-bbb8eeb03c9f down in Southbound
Feb 01 09:53:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:01.447 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:01 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:01.458 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-fdca6946-14e8-4692-9d79-41002e703846', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fdca6946-14e8-4692-9d79-41002e703846', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '41697a815dfa4c5aaae37b529f6303e1', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=013c5d80-ec0c-4f6b-91c1-a2283198de95, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=7ad39b92-32e7-4263-9dbc-bbb8eeb03c9f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:53:01 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:01.460 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 7ad39b92-32e7-4263-9dbc-bbb8eeb03c9f in datapath fdca6946-14e8-4692-9d79-41002e703846 unbound from our chassis
Feb 01 09:53:01 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:01.464 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fdca6946-14e8-4692-9d79-41002e703846, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:53:01 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:01.465 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[fb167eaa-06f5-4722-960f-a6882b40f940]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:01.465 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:01.466 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:53:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:53:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:53:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:53:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:53:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:53:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:02.081 274321 DEBUG nova.compute.manager [req-e96ac434-6161-40f5-9851-d0f1a0fc83e7 req-b8854354-eb8c-4756-adb8-5563a2e78b07 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-vif-unplugged-3c861704-c594-42f8-a5b3-a274ec84650f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:53:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:02.082 274321 DEBUG oslo_concurrency.lockutils [req-e96ac434-6161-40f5-9851-d0f1a0fc83e7 req-b8854354-eb8c-4756-adb8-5563a2e78b07 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:02.082 274321 DEBUG oslo_concurrency.lockutils [req-e96ac434-6161-40f5-9851-d0f1a0fc83e7 req-b8854354-eb8c-4756-adb8-5563a2e78b07 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:02.082 274321 DEBUG oslo_concurrency.lockutils [req-e96ac434-6161-40f5-9851-d0f1a0fc83e7 req-b8854354-eb8c-4756-adb8-5563a2e78b07 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:02.083 274321 DEBUG nova.compute.manager [req-e96ac434-6161-40f5-9851-d0f1a0fc83e7 req-b8854354-eb8c-4756-adb8-5563a2e78b07 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] No waiting events found dispatching network-vif-unplugged-3c861704-c594-42f8-a5b3-a274ec84650f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 01 09:53:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:02.083 274321 DEBUG nova.compute.manager [req-e96ac434-6161-40f5-9851-d0f1a0fc83e7 req-b8854354-eb8c-4756-adb8-5563a2e78b07 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-vif-unplugged-3c861704-c594-42f8-a5b3-a274ec84650f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 01 09:53:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v124: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 2.1 MiB/s wr, 191 op/s
Feb 01 09:53:02 np0005604215.localdomain ceph-mon[298604]: pgmap v124: 177 pgs: 177 active+clean; 271 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 5.2 MiB/s rd, 2.1 MiB/s wr, 191 op/s
Feb 01 09:53:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:02.987 274321 INFO nova.compute.manager [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Took 2.76 seconds for pre_live_migration on destination host np0005604213.localdomain.
Feb 01 09:53:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:02.988 274321 DEBUG nova.compute.manager [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.001 274321 DEBUG nova.compute.manager [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=False,disk_available_mb=12288,disk_over_commit=False,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp58hd61t0',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='rbd',instance_relative_path='aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469',is_shared_block_storage=True,is_shared_instance_path=False,is_volume_backed=False,migration=Migration(ea09c78d-8a1e-497d-978c-c737a6e34821),old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) _do_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8939
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.003 274321 DEBUG nova.objects.instance [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Lazy-loading 'migration_context' on Instance uuid aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.004 274321 DEBUG nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Starting monitoring of live migration _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10639
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.006 274321 DEBUG nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Operation thread is still running _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10440
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.006 274321 DEBUG nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Migration not running yet _live_migration_monitor /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10449
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.015 274321 DEBUG nova.virt.libvirt.vif [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-01T09:52:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1216472824',display_name='tempest-LiveMigrationTest-server-1216472824',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005604215.localdomain',hostname='tempest-livemigrationtest-server-1216472824',id=8,image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-01T09:52:56Z,launched_on='np0005604215.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005604215.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d8e4b0fb12f14fbaa248291aa43aacee',ramdisk_id='',reservation_id='r-w7wsdj02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-266774784',owner_user_name='tempest-LiveMigrationTest-266774784-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-01T09:52:56Z,user_data=None,user_id='0416f10a8d4f4da2a6dc6cbd271a3010',uuid=aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.015 274321 DEBUG nova.network.os_vif_util [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Converting VIF {"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.016 274321 DEBUG nova.network.os_vif_util [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:5a:4a,bridge_name='br-int',has_traffic_filtering=True,id=3c861704-c594-42f8-a5b3-a274ec84650f,network=Network(9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3c861704-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.016 274321 DEBUG nova.virt.libvirt.migration [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Updating guest XML with vif config: <interface type="ethernet">
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]:   <mac address="fa:16:3e:c4:5a:4a"/>
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]:   <model type="virtio"/>
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]:   <driver name="vhost" rx_queue_size="512"/>
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]:   <mtu size="1442"/>
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]:   <target dev="tap3c861704-c5"/>
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: </interface>
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]:  _update_vif_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:388
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.017 274321 DEBUG nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] About to invoke the migrate API _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10272
Feb 01 09:53:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:03Z|00094|binding|INFO|Releasing lport 2795e61c-14bf-4981-8534-106e0ef1f6ea from this chassis (sb_readonly=0)
Feb 01 09:53:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:03Z|00095|binding|INFO|Releasing lport 82d12955-5666-45d9-bcd4-64e768a2aca1 from this chassis (sb_readonly=0)
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.338 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:03 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e102 e102: 6 total, 6 up, 6 in
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.508 274321 DEBUG nova.virt.libvirt.migration [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Current None elapsed 0 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.509 274321 INFO nova.virt.libvirt.migration [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Increasing downtime to 50 ms after 0 sec elapsed time
Feb 01 09:53:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:03.585 274321 INFO nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Migration running for 0 secs, memory 100% remaining (bytes processed=0, remaining=0, total=0); disk 100% remaining (bytes processed=0, remaining=0, total=0).
Feb 01 09:53:03 np0005604215.localdomain systemd[1]: tmp-crun.bQwlM9.mount: Deactivated successfully.
Feb 01 09:53:03 np0005604215.localdomain dnsmasq[305163]: exiting on receipt of SIGTERM
Feb 01 09:53:03 np0005604215.localdomain podman[305685]: 2026-02-01 09:53:03.779006139 +0000 UTC m=+0.082273156 container kill a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca6946-14e8-4692-9d79-41002e703846, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:53:03 np0005604215.localdomain systemd[1]: libpod-a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551.scope: Deactivated successfully.
Feb 01 09:53:03 np0005604215.localdomain podman[305698]: 2026-02-01 09:53:03.857049541 +0000 UTC m=+0.063348756 container died a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca6946-14e8-4692-9d79-41002e703846, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3)
Feb 01 09:53:03 np0005604215.localdomain podman[305698]: 2026-02-01 09:53:03.896237803 +0000 UTC m=+0.102536998 container cleanup a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca6946-14e8-4692-9d79-41002e703846, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:53:03 np0005604215.localdomain systemd[1]: libpod-conmon-a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551.scope: Deactivated successfully.
Feb 01 09:53:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:03.913 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b2254b33e02cd88b333d7a2648b2a1c4e56223d8c3a05b6047a2f46f1c9b1e9f" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Feb 01 09:53:03 np0005604215.localdomain podman[305700]: 2026-02-01 09:53:03.957857444 +0000 UTC m=+0.157395807 container remove a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca6946-14e8-4692-9d79-41002e703846, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 01 09:53:03 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:53:03.993 259225 INFO neutron.agent.dhcp.agent [None req-aab5f074-fbc2-408b-94c6-8297f84ed382 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:53:04 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:53:04.004 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.087 274321 DEBUG nova.virt.libvirt.migration [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Current 50 elapsed 1 steps [(0, 50), (150, 95), (300, 140), (450, 185), (600, 230), (750, 275), (900, 320), (1050, 365), (1200, 410), (1350, 455), (1500, 500)] update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:512
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.088 274321 DEBUG nova.virt.libvirt.migration [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Downtime does not need to change update_downtime /usr/lib/python3.9/site-packages/nova/virt/libvirt/migration.py:525
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.131 274321 DEBUG nova.compute.manager [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.131 274321 DEBUG oslo_concurrency.lockutils [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.132 274321 DEBUG oslo_concurrency.lockutils [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.132 274321 DEBUG oslo_concurrency.lockutils [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.132 274321 DEBUG nova.compute.manager [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] No waiting events found dispatching network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.133 274321 WARNING nova.compute.manager [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received unexpected event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f for instance with vm_state active and task_state migrating.
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.133 274321 DEBUG nova.compute.manager [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-changed-3c861704-c594-42f8-a5b3-a274ec84650f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.133 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 954 Content-Type: application/json Date: Sun, 01 Feb 2026 09:53:03 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-ac906038-eb2a-46f7-b146-08f2a10e1e76 x-openstack-request-id: req-ac906038-eb2a-46f7-b146-08f2a10e1e76 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.133 274321 DEBUG nova.compute.manager [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Refreshing instance network info cache due to event network-changed-3c861704-c594-42f8-a5b3-a274ec84650f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.134 274321 DEBUG oslo_concurrency.lockutils [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "refresh_cache-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.134 274321 DEBUG oslo_concurrency.lockutils [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquired lock "refresh_cache-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.135 274321 DEBUG nova.network.neutron [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Refreshing network info cache for port 3c861704-c594-42f8-a5b3-a274ec84650f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.135 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "04b6d75f-0335-413a-b9d6-dfe49d77feaf", "name": "m1.nano", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/04b6d75f-0335-413a-b9d6-dfe49d77feaf"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/04b6d75f-0335-413a-b9d6-dfe49d77feaf"}]}, {"id": "371ff7cc-43c7-4354-b1ce-55c23740c8c8", "name": "m1.small", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/371ff7cc-43c7-4354-b1ce-55c23740c8c8"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/371ff7cc-43c7-4354-b1ce-55c23740c8c8"}]}, {"id": "d824a107-9738-4ab8-b2ca-4ac633695018", "name": "m1.micro", "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/d824a107-9738-4ab8-b2ca-4ac633695018"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/d824a107-9738-4ab8-b2ca-4ac633695018"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.136 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-ac906038-eb2a-46f7-b146-08f2a10e1e76 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.141 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET http://nova-internal.openstack.svc:8774/v2.1/flavors/04b6d75f-0335-413a-b9d6-dfe49d77feaf -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}b2254b33e02cd88b333d7a2648b2a1c4e56223d8c3a05b6047a2f46f1c9b1e9f" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Feb 01 09:53:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v126: 177 pgs: 177 active+clean; 273 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 2.7 MiB/s wr, 215 op/s
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.157 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 493 Content-Type: application/json Date: Sun, 01 Feb 2026 09:53:04 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-c644d01c-69d5-4b53-b4bc-f8743be220ca x-openstack-request-id: req-c644d01c-69d5-4b53-b4bc-f8743be220ca _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.157 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "04b6d75f-0335-413a-b9d6-dfe49d77feaf", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "http://nova-internal.openstack.svc:8774/v2.1/flavors/04b6d75f-0335-413a-b9d6-dfe49d77feaf"}, {"rel": "bookmark", "href": "http://nova-internal.openstack.svc:8774/flavors/04b6d75f-0335-413a-b9d6-dfe49d77feaf"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.158 12 DEBUG novaclient.v2.client [-] GET call to compute for http://nova-internal.openstack.svc:8774/v2.1/flavors/04b6d75f-0335-413a-b9d6-dfe49d77feaf used request id req-c644d01c-69d5-4b53-b4bc-f8743be220ca request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.160 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'name': 'tempest-LiveMigrationTest-server-1216472824', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000008', 'OS-EXT-SRV-ATTR:host': 'np0005604215.localdomain', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'hostId': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.161 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.194 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.196 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5528538a-368c-4c7d-9d75-add6c9fb4342', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-vda', 'timestamp': '2026-02-01T09:53:04.161479', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cd675d96-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': '468acf47711a1fc14d210938698e9dbc603ab512f200bb64f7574fe47737939c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 
'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-sda', 'timestamp': '2026-02-01T09:53:04.161479', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cd678370-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': '5a20c2fdc2eb8f524278d2f88a476a7b8741106bae07acd7ec24f4f24f8e6cde'}]}, 'timestamp': '2026-02-01 09:53:04.197590', '_unique_id': '838aaabc62cf4218ac5c4234d7c0020e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.204 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.215 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469 / tap3c861704-c5 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.216 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '74abe0bb-1ba9-46fd-afbc-4da676f63708', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'instance-00000008-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-tap3c861704-c5', 'timestamp': '2026-02-01T09:53:04.212219', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'tap3c861704-c5', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:5a:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c861704-c5'}, 'message_id': 'cd6a984e-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.391224413, 'message_signature': 'c1bdd51776f50bab757749a0b0937cbe56d945eb140045f07f77218b70178eda'}]}, 'timestamp': '2026-02-01 09:53:04.217163', '_unique_id': '82f76ac227b14c45afed15d87236e108'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.218 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.223 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.224 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64467bae-b96d-4e5a-9806-d1c0f9e48230', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-vda', 'timestamp': '2026-02-01T09:53:04.223414', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cd6ba9fa-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': '7bd4ea2e1003a957a1f59d10624a2a6269d823331f77be7dfa4d06ec0f3e6b1e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-sda', 'timestamp': '2026-02-01T09:53:04.223414', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cd6bc494-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': 'be76554aaa01d17ed8dca31ae100ed6fe83d3076f33053ee192b439595284d10'}]}, 'timestamp': '2026-02-01 09:53:04.224875', '_unique_id': '78d938bfc6b6469c8775d0955edfd178'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.232 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.233 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.261 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/cpu volume: 7270000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '81de59f9-9754-4bd0-9bad-e705b8cabdf9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 7270000000, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'timestamp': '2026-02-01T09:53:04.234036', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'cd71768c-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.439842278, 'message_signature': '4b3144f5f03408e0f58a42e1fe954631a3d62006026100399012f717f8cd1b1d'}]}, 'timestamp': '2026-02-01 09:53:04.262186', '_unique_id': 'c7eb0e8b1e6d40ec9a87df95eaffa599'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.263 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.264 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.264 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.264 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469: ceilometer.compute.pollsters.NoVolumeException
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.264 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.265 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f6692a3-8708-41ee-84f8-9ff5a50da426', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'instance-00000008-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-tap3c861704-c5', 'timestamp': '2026-02-01T09:53:04.264998', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'tap3c861704-c5', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:5a:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c861704-c5'}, 'message_id': 'cd71f86e-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.391224413, 'message_signature': '57011b6aff5cfa9a68141fab76f50014fbef1c404896a44f8f717e0bc731d852'}]}, 'timestamp': '2026-02-01 09:53:04.265440', '_unique_id': '5f280fc992354ae38ed3d395a1b318f7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.266 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.267 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.280 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.allocation volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.281 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.allocation volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2f400fe-3bba-4f38-8ecf-5ac39d14a1fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-vda', 'timestamp': '2026-02-01T09:53:04.267126', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cd7470d0-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.446037592, 'message_signature': '799fa6377774f37bc9701c6401c5b28e3b0c6b1e29d65b068c86470f4d49c51d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-sda', 'timestamp': '2026-02-01T09:53:04.267126', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cd7489f8-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.446037592, 'message_signature': 'd911dd07f48fa8129e137d6097f9657e824c26f93b65f5712c754fddad40d28a'}]}, 'timestamp': '2026-02-01 09:53:04.282593', '_unique_id': '7ba7089bff8240558562841a8c636786'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.284 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.286 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.287 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.287 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59a941c0-e733-48d5-a361-da8aaff70f18', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-vda', 'timestamp': '2026-02-01T09:53:04.287230', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cd756274-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': '72d026008701cda96b188d8eb7d34fbe741d7ac37618791cdd4a99d7753f9df9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 
'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-sda', 'timestamp': '2026-02-01T09:53:04.287230', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cd757a0c-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': '6862f9dff4d98d821ee4b2da85bf9f8605eedffd2dee63d1923e2e511c7c2e6a'}]}, 'timestamp': '2026-02-01 09:53:04.288547', '_unique_id': 'f162552d305742a7983c9fc9a06215dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.290 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.292 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.292 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.293 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8b9c1338-e2b2-4944-9116-9952d43e5cea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-vda', 'timestamp': '2026-02-01T09:53:04.292904', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cd763f14-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.446037592, 'message_signature': 'f961a6392974dfcac3cb0614937948a3b91ed2efa7198bc28b66a43f06d25d56'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 
'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-sda', 'timestamp': '2026-02-01T09:53:04.292904', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cd765d00-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.446037592, 'message_signature': 'f8a6f606fd2ba17b9bc284cf81e18f71dde47be1640197506904b60fce33acb7'}]}, 'timestamp': '2026-02-01 09:53:04.294281', '_unique_id': 'e6bc695eddfb4354a17233cfb0bddb85'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.295 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.297 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.298 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.298 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1216472824>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1216472824>]
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.299 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.300 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.300 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1216472824>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1216472824>]
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.301 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.301 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5de2ed71-b664-46a8-80d3-b8d58e4591fb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'instance-00000008-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-tap3c861704-c5', 'timestamp': '2026-02-01T09:53:04.301533', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'tap3c861704-c5', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:5a:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c861704-c5'}, 'message_id': 'cd77a278-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.391224413, 'message_signature': '11b87d519f6d74a9f283fd2553f4425b7a8fb190817e97785b8590bf5aada41c'}]}, 'timestamp': '2026-02-01 09:53:04.302634', '_unique_id': '5554b36b7e6d4cfb946ad26327cfc9dd'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.303 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.306 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.307 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5579acb3-7299-4643-935e-2f1cb5a62318', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'instance-00000008-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-tap3c861704-c5', 'timestamp': '2026-02-01T09:53:04.307885', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'tap3c861704-c5', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:5a:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c861704-c5'}, 'message_id': 'cd789322-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.391224413, 'message_signature': '08ff07aebad11774bf0787592d45441eec91e8fe8274cdfdcca69f1aa4010527'}]}, 'timestamp': '2026-02-01 09:53:04.308804', '_unique_id': '9c10451b948a4ebf81458b2398ae1a36'}: kombu.exceptions.OperationalError: [Errno 111] 
Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.312 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.315 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.316 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.317 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3d5319c4-c281-4654-8058-ce89a91d4a2a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-vda', 'timestamp': '2026-02-01T09:53:04.316071', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cd79e6be-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': '79a5d4a4de8559cd740d3057d75626b8afed5bc8a91fcaa92764226722eca2ec'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-sda', 'timestamp': '2026-02-01T09:53:04.316071', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cd7a13b4-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': 'b78d4f8853ac31393d138861a4294908aca3b4712eaedf8af68c4cb33a7e1bf8'}]}, 'timestamp': '2026-02-01 09:53:04.319168', '_unique_id': 'd997cd6ca05c4890ace277d44b5195f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.320 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.322 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2a02e14-8ee5-419c-acc1-30dbdaa12530', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'instance-00000008-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-tap3c861704-c5', 'timestamp': '2026-02-01T09:53:04.322920', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'tap3c861704-c5', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:5a:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c861704-c5'}, 'message_id': 'cd7ad268-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.391224413, 'message_signature': '3d9edf7634ad03a6a3ef746cbe2a4dfb618c08eb06a86f300c36234c9fc8d398'}]}, 'timestamp': '2026-02-01 09:53:04.323415', '_unique_id': 'e50dd316d2e84eee9190fb05baf0c490'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.324 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.326 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.329 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.328 274321 DEBUG nova.virt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Emitting event <LifecycleEvent: 1769939584.3271024, aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.329 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] VM Paused (Lifecycle Event)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '210a017c-36a5-48d3-94bf-ed32bf1221ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'instance-00000008-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-tap3c861704-c5', 'timestamp': '2026-02-01T09:53:04.329055', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'tap3c861704-c5', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:5a:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c861704-c5'}, 'message_id': 'cd7bc7d6-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.391224413, 'message_signature': '9380d86c7a9aeab58f9542fd7ec1e6b091c97028df172d0d86eca583d2ef2f99'}]}, 'timestamp': '2026-02-01 09:53:04.329720', '_unique_id': '29f1143dc4a04cd88f88f7bae3d1c114'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.331 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4131749-ecfe-4bb3-93c5-0d6a21f257df', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'instance-00000008-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-tap3c861704-c5', 'timestamp': '2026-02-01T09:53:04.331974', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'tap3c861704-c5', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:5a:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c861704-c5'}, 'message_id': 'cd7c2fc8-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.391224413, 'message_signature': '59b1d7bc23f861eac2a345c4a13af723ce06200c275609028cdf2dd5a497aad4'}]}, 'timestamp': '2026-02-01 09:53:04.332361', '_unique_id': '3c1012ab3d2b4a4bb86819192819e06a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.332 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.333 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.333 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0db7823-ca3b-4137-9649-d445b7e9ea1e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'instance-00000008-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-tap3c861704-c5', 'timestamp': '2026-02-01T09:53:04.333837', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'tap3c861704-c5', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:5a:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c861704-c5'}, 'message_id': 'cd7c76e0-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.391224413, 'message_signature': '5940d431c6fd92047d75a36a6f4bc6315ccf026240e3af66e19cedb48459a636'}]}, 'timestamp': '2026-02-01 09:53:04.334213', '_unique_id': '5b7fb62455e943438d6f132966b518b4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.334 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.335 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.335 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f6fa4010-4c43-4638-ba12-2ce79951d57e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'instance-00000008-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-tap3c861704-c5', 'timestamp': '2026-02-01T09:53:04.335610', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'tap3c861704-c5', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:5a:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c861704-c5'}, 'message_id': 'cd7cbc90-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.391224413, 'message_signature': '88b8ae1f0528a561f7397a953ba85608385af3d90338f2107b2e19e0be731814'}]}, 'timestamp': '2026-02-01 09:53:04.335942', '_unique_id': '5d26c13140124ebcbe15db2054ac24c9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.336 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.337 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.337 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.337 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9c10bc0-926e-49a3-b353-1a89443e4504', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-vda', 'timestamp': '2026-02-01T09:53:04.337364', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cd7d0056-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': '48d4702e892061bef399841a36a879d962949411b0f22afcb729e0fa8b7f5401'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-sda', 'timestamp': '2026-02-01T09:53:04.337364', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cd7d0d4e-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': '9146fce042392c4d6ef5770c1dfc49618b5a21fc13a1d265ebc85213f44b9ba4'}]}, 'timestamp': '2026-02-01 09:53:04.337995', '_unique_id': '9d3b0ca748874959a390ba0502b57f66'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.338 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.339 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.339 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.read.latency volume: 1011941075 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.339 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.read.latency volume: 1643661 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '10a31374-40e7-4515-ab58-4828367536e1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1011941075, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-vda', 'timestamp': '2026-02-01T09:53:04.339459', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cd7d520e-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': '67a45de5643a4d032cdabc649707393b37ee965de51a2179ace48941653baf43'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 1643661, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-sda', 'timestamp': '2026-02-01T09:53:04.339459', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cd7d5dee-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.340424279, 'message_signature': '4197c0ad6419b387260c237ce80dc603d4ec07bc04c9ff37ff76846fd13fa1eb'}]}, 'timestamp': '2026-02-01 09:53:04.340056', '_unique_id': '1dfe80af780940999750529660a03cbe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.340 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.341 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.341 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.341 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1216472824>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1216472824>]
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.341 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.341 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e5b2f498-73e9-4d3a-ace2-6f8d671a2440', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'instance-00000008-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-tap3c861704-c5', 'timestamp': '2026-02-01T09:53:04.341967', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'tap3c861704-c5', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:c4:5a:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap3c861704-c5'}, 'message_id': 'cd7db442-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.391224413, 'message_signature': 'fe3b79d8a8843a52ec350807bc83903fb66ae183d460dfcde6d397336fe12c64'}]}, 'timestamp': '2026-02-01 09:53:04.342301', '_unique_id': '7a1bf14e883c4540b221c39a7319422d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.342 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.343 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.343 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.343 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-LiveMigrationTest-server-1216472824>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-LiveMigrationTest-server-1216472824>]
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.344 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.344 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.usage volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.344 12 DEBUG ceilometer.compute.pollsters [-] aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '731498f3-34e3-4a5f-b3be-b2df51e784d3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-vda', 'timestamp': '2026-02-01T09:53:04.344243', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cd7e0ed8-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.446037592, 'message_signature': 'ca658ef0dba6a457ab19454ebcca764d72eaac3006cb7ebfaeed84cea4bb6724'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '0416f10a8d4f4da2a6dc6cbd271a3010', 'user_name': None, 'project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'project_name': None, 'resource_id': 
'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-sda', 'timestamp': '2026-02-01T09:53:04.344243', 'resource_metadata': {'display_name': 'tempest-LiveMigrationTest-server-1216472824', 'name': 'instance-00000008', 'instance_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'instance_type': 'm1.nano', 'host': '8174b57508600476c08099301c2932d0aaf5f05709fcea5b89529be1', 'instance_host': 'np0005604215.localdomain', 'flavor': {'id': '04b6d75f-0335-413a-b9d6-dfe49d77feaf', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}, 'image_ref': 'a223c2d3-3df7-4d82-921c-31ace200d43c', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cd7e19e6-ff53-11f0-b4d6-fa163ed0c8c4', 'monotonic_time': 11668.446037592, 'message_signature': 'b9467638dba4edf3ef018b30aa0c835e2b83837f75a8ed7149952192ea4c0bc4'}]}, 'timestamp': '2026-02-01 09:53:04.344868', '_unique_id': 'b5bfa140c0264b85b835764c2e27a09e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     yield
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Feb 01 09:53:04 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:53:04.346 12 ERROR oslo_messaging.notify.messaging 
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.350 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.353 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.377 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] During sync_power_state the instance has a pending task (migrating). Skip.
Feb 01 09:53:04 np0005604215.localdomain kernel: device tap3c861704-c5 left promiscuous mode
Feb 01 09:53:04 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939584.4831] device (tap3c861704-c5): state change: disconnected -> unmanaged (reason 'unmanaged', sys-iface-state: 'removed')
Feb 01 09:53:04 np0005604215.localdomain ceph-mon[298604]: osdmap e102: 6 total, 6 up, 6 in
Feb 01 09:53:04 np0005604215.localdomain ceph-mon[298604]: pgmap v126: 177 pgs: 177 active+clean; 273 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 2.7 MiB/s wr, 215 op/s
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.493 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00096|binding|INFO|Releasing lport 3c861704-c594-42f8-a5b3-a274ec84650f from this chassis (sb_readonly=0)
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00097|binding|INFO|Setting lport 3c861704-c594-42f8-a5b3-a274ec84650f down in Southbound
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00098|binding|INFO|Releasing lport 9adda630-e8be-4f28-9d6e-88decd53d5c0 from this chassis (sb_readonly=0)
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00099|binding|INFO|Setting lport 9adda630-e8be-4f28-9d6e-88decd53d5c0 down in Southbound
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00100|binding|INFO|Removing iface tap3c861704-c5 ovn-installed in OVS
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00101|binding|INFO|Releasing lport 2795e61c-14bf-4981-8534-106e0ef1f6ea from this chassis (sb_readonly=0)
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00102|binding|INFO|Releasing lport 82d12955-5666-45d9-bcd4-64e768a2aca1 from this chassis (sb_readonly=0)
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.503 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:87:8a:c3 19.80.0.117'], port_security=['fa:16:3e:87:8a:c3 19.80.0.117'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['3c861704-c594-42f8-a5b3-a274ec84650f'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-599288938', 'neutron:cidrs': '19.80.0.117/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f10af3d7-b861-4585-95de-68162ae73827', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-599288938', 'neutron:project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'neutron:revision_number': '3', 'neutron:security_group_ids': '3c3daae5-f0f3-42a8-b893-8c534dcb0055', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=13e91b2c-4ccc-47a7-a97e-5773902dea41, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[], logical_port=9adda630-e8be-4f28-9d6e-88decd53d5c0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.507 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:5a:4a 10.100.0.12'], port_security=['fa:16:3e:c4:5a:4a 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain,np0005604213.localdomain', 'activation-strategy': 'rarp', 'additional-chassis-activated': '6d5b1744-6b18-45d1-b363-5f956c1e98d7'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-1236294281', 'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-1236294281', 'neutron:project_id': 'd8e4b0fb12f14fbaa248291aa43aacee', 'neutron:revision_number': '8', 'neutron:security_group_ids': '3c3daae5-f0f3-42a8-b893-8c534dcb0055', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=49493626-0ffa-4ff3-a83b-4e74511666de, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=3c861704-c594-42f8-a5b3-a274ec84650f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.508 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00103|ovn_bfd|INFO|Disabled BFD on interface ovn-2186fb-0
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00104|ovn_bfd|INFO|Disabled BFD on interface ovn-e1cc33-0
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00105|ovn_bfd|INFO|Disabled BFD on interface ovn-45aa31-0
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.514 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.509 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 9adda630-e8be-4f28-9d6e-88decd53d5c0 in datapath f10af3d7-b861-4585-95de-68162ae73827 unbound from our chassis
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.512 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f10af3d7-b861-4585-95de-68162ae73827, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.512 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[60ece4d5-04a0-460e-9d05-77a9017aab43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.513 158655 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-f10af3d7-b861-4585-95de-68162ae73827 namespace which is not needed anymore
Feb 01 09:53:04 np0005604215.localdomain systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000008.scope: Deactivated successfully.
Feb 01 09:53:04 np0005604215.localdomain systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000008.scope: Consumed 8.380s CPU time.
Feb 01 09:53:04 np0005604215.localdomain systemd-machined[202466]: Machine qemu-2-instance-00000008 terminated.
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.538 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00106|binding|INFO|Releasing lport 2795e61c-14bf-4981-8534-106e0ef1f6ea from this chassis (sb_readonly=0)
Feb 01 09:53:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:04Z|00107|binding|INFO|Releasing lport 82d12955-5666-45d9-bcd4-64e768a2aca1 from this chassis (sb_readonly=0)
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.542 274321 DEBUG nova.network.neutron [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Updated VIF entry in instance network info cache for port 3c861704-c594-42f8-a5b3-a274ec84650f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.542 274321 DEBUG nova.network.neutron [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Updating instance_info_cache with network_info: [{"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"migrating_to": "np0005604213.localdomain"}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.544 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.566 274321 DEBUG oslo_concurrency.lockutils [req-2041bad4-b838-4660-9c17-42fd2a1f7647 req-0927f14c-811a-4976-9a22-ff204b93c533 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Releasing lock "refresh_cache-aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:53:04 np0005604215.localdomain virtqemud[224673]: Unable to get XATTR trusted.libvirt.security.ref_selinux on vms/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk: No such file or directory
Feb 01 09:53:04 np0005604215.localdomain virtqemud[224673]: Unable to get XATTR trusted.libvirt.security.ref_dac on vms/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_disk: No such file or directory
Feb 01 09:53:04 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939584.6519] manager: (tap3c861704-c5): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.665 274321 DEBUG nova.virt.libvirt.guest [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Domain has shutdown/gone away: Requested operation is not valid: domain is not running get_job_info /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:688
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.666 274321 INFO nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Migration operation has completed
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.666 274321 INFO nova.compute.manager [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] _post_live_migration() is started..
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.677 274321 DEBUG nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Migrate API has completed _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10279
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.678 274321 DEBUG nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Migration operation thread has finished _live_migration_operation /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10327
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.678 274321 DEBUG nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Migration operation thread notification thread_finished /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10630
Feb 01 09:53:04 np0005604215.localdomain neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827[305437]: [NOTICE]   (305441) : haproxy version is 2.8.14-c23fe91
Feb 01 09:53:04 np0005604215.localdomain neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827[305437]: [NOTICE]   (305441) : path to executable is /usr/sbin/haproxy
Feb 01 09:53:04 np0005604215.localdomain neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827[305437]: [WARNING]  (305441) : Exiting Master process...
Feb 01 09:53:04 np0005604215.localdomain neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827[305437]: [ALERT]    (305441) : Current worker (305443) exited with code 143 (Terminated)
Feb 01 09:53:04 np0005604215.localdomain neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827[305437]: [WARNING]  (305441) : All workers exited. Exiting... (0)
Feb 01 09:53:04 np0005604215.localdomain systemd[1]: libpod-18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd.scope: Deactivated successfully.
Feb 01 09:53:04 np0005604215.localdomain podman[305755]: 2026-02-01 09:53:04.701739132 +0000 UTC m=+0.073020887 container died 18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:53:04 np0005604215.localdomain podman[305755]: 2026-02-01 09:53:04.743978778 +0000 UTC m=+0.115260553 container cleanup 18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:53:04 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-19e3833c7d8d2b2c1fabb013d6f217a0b7dde45ed475f41dc07e52f74eb93e56-merged.mount: Deactivated successfully.
Feb 01 09:53:04 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd-userdata-shm.mount: Deactivated successfully.
Feb 01 09:53:04 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-68b1613e4d327631643e09ac0edff96b26245deaec5c43c3419a3ce4c98fd9cd-merged.mount: Deactivated successfully.
Feb 01 09:53:04 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a8b5538e4c67fb653899080c7d2e2ae7ee1acaa153b2d1f9c71087a7568aa551-userdata-shm.mount: Deactivated successfully.
Feb 01 09:53:04 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2dfdca6946\x2d14e8\x2d4692\x2d9d79\x2d41002e703846.mount: Deactivated successfully.
Feb 01 09:53:04 np0005604215.localdomain podman[305777]: 2026-02-01 09:53:04.821940897 +0000 UTC m=+0.119525786 container cleanup 18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:53:04 np0005604215.localdomain systemd[1]: libpod-conmon-18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd.scope: Deactivated successfully.
Feb 01 09:53:04 np0005604215.localdomain podman[305792]: 2026-02-01 09:53:04.89258686 +0000 UTC m=+0.125393259 container remove 18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.897 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[144cbff2-6f36-4e85-9b83-55f704daa563]: (4, ('Sun Feb  1 09:53:04 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827 (18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd)\n18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd\nSun Feb  1 09:53:04 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-f10af3d7-b861-4585-95de-68162ae73827 (18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd)\n18dc1643a3318e3bd3500e8b99d71786f9c3ff5685f82f7ae225b36ee835fddd\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.900 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ab76b243-6c81-4d15-8c53-0008b5396b5d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.901 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf10af3d7-b0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.903 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:04 np0005604215.localdomain kernel: device tapf10af3d7-b0 left promiscuous mode
Feb 01 09:53:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:04.915 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.918 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[58bdf574-9076-4ae7-bb75-60cf1b6c87f9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.936 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[078ed8d1-b5a2-4bb7-abfd-e04a16411659]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.937 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[157c70f0-7414-4069-bcab-f2af33925df2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.958 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[34f3ad12-693d-4802-ae76-5bb6436da710]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1166015, 'reachable_time': 16855, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305814, 'error': None, 'target': 'ovnmeta-f10af3d7-b861-4585-95de-68162ae73827', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.961 158836 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-f10af3d7-b861-4585-95de-68162ae73827 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.961 158836 DEBUG oslo.privsep.daemon [-] privsep: reply[8b3de44b-6d39-435f-b73f-3baa424b4c15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:04 np0005604215.localdomain systemd[1]: run-netns-ovnmeta\x2df10af3d7\x2db861\x2d4585\x2d95de\x2d68162ae73827.mount: Deactivated successfully.
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.963 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 3c861704-c594-42f8-a5b3-a274ec84650f in datapath 9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8 unbound from our chassis
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.966 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.967 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[c9390e4d-6dce-4812-90fc-a573d72760c8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:04.967 158655 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8 namespace which is not needed anymore
Feb 01 09:53:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e102 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:05 np0005604215.localdomain neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8[305550]: [NOTICE]   (305555) : haproxy version is 2.8.14-c23fe91
Feb 01 09:53:05 np0005604215.localdomain neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8[305550]: [NOTICE]   (305555) : path to executable is /usr/sbin/haproxy
Feb 01 09:53:05 np0005604215.localdomain neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8[305550]: [WARNING]  (305555) : Exiting Master process...
Feb 01 09:53:05 np0005604215.localdomain neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8[305550]: [ALERT]    (305555) : Current worker (305557) exited with code 143 (Terminated)
Feb 01 09:53:05 np0005604215.localdomain neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8[305550]: [WARNING]  (305555) : All workers exited. Exiting... (0)
Feb 01 09:53:05 np0005604215.localdomain systemd[1]: libpod-20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac.scope: Deactivated successfully.
Feb 01 09:53:05 np0005604215.localdomain podman[305831]: 2026-02-01 09:53:05.158096646 +0000 UTC m=+0.074753141 container died 20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.209 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:05 np0005604215.localdomain podman[305831]: 2026-02-01 09:53:05.230331528 +0000 UTC m=+0.146988013 container cleanup 20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:53:05 np0005604215.localdomain podman[305843]: 2026-02-01 09:53:05.259978782 +0000 UTC m=+0.095241830 container cleanup 20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:53:05 np0005604215.localdomain systemd[1]: libpod-conmon-20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac.scope: Deactivated successfully.
Feb 01 09:53:05 np0005604215.localdomain podman[305858]: 2026-02-01 09:53:05.323038007 +0000 UTC m=+0.075476593 container remove 20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 01 09:53:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:05.327 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[22a65303-60ee-42e1-bd7f-3b23abced818]: (4, ('Sun Feb  1 09:53:05 AM UTC 2026 Stopping container neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8 (20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac)\n20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac\nSun Feb  1 09:53:05 AM UTC 2026 Deleting container neutron-haproxy-ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8 (20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac)\n20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:05.329 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[d5c18e24-98bc-4b0d-8954-f0df45b92103]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:05.331 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9acb9cb3-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.333 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:05 np0005604215.localdomain kernel: device tap9acb9cb3-f0 left promiscuous mode
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.346 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:05.353 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[01d31c1c-c913-4498-ab70-ee92f524e399]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:05.371 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ca80c0b1-0045-4b05-8523-1849d1c528f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:05.373 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[a0e5992d-3c9d-460d-a759-e29afe00c6ff]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:05.391 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[c28f1971-825a-4bf0-9036-198a5099474d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_QDISC', 'noqueue'], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': 
[['IFLA_XDP_ATTACHED', None]]}], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 1, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 1166101, 'reachable_time': 40281, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 37, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 
'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}]], 'header': {'length': 1356, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 305882, 'error': None, 'target': 'ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:05.393 158836 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Feb 01 09:53:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:05.393 158836 DEBUG oslo.privsep.daemon [-] privsep: reply[868bf49f-7f2d-44ba-9dfd-4311c0b04256]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e103 e103: 6 total, 6 up, 6 in
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.517 274321 DEBUG nova.network.neutron [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Activated binding for port 3c861704-c594-42f8-a5b3-a274ec84650f and host np0005604213.localdomain migrate_instance_start /usr/lib/python3.9/site-packages/nova/network/neutron.py:3181
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.518 274321 DEBUG nova.compute.manager [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Calling driver.post_live_migration_at_source with original source VIFs from migrate_data: [{"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9326
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.519 274321 DEBUG nova.virt.libvirt.vif [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2026-02-01T09:52:49Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-LiveMigrationTest-server-1216472824',display_name='tempest-LiveMigrationTest-server-1216472824',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(5),hidden=False,host='np0005604215.localdomain',hostname='tempest-livemigrationtest-server-1216472824',id=8,image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',info_cache=InstanceInfoCache,instance_type_id=5,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2026-02-01T09:52:56Z,launched_on='np0005604215.localdomain',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='np0005604215.localdomain',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='d8e4b0fb12f14fbaa248291aa43aacee',ramdisk_id='',reservation_id='r-w7wsdj02',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='a223c2d3-3df7-4d82-921c-31ace200d43c',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_mi
n_disk='1',image_min_ram='0',owner_project_name='tempest-LiveMigrationTest-266774784',owner_user_name='tempest-LiveMigrationTest-266774784-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=<?>,updated_at=2026-02-01T09:52:59Z,user_data=None,user_id='0416f10a8d4f4da2a6dc6cbd271a3010',uuid=aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.520 274321 DEBUG nova.network.os_vif_util [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Converting VIF {"id": "3c861704-c594-42f8-a5b3-a274ec84650f", "address": "fa:16:3e:c4:5a:4a", "network": {"id": "9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8", "bridge": "br-int", "label": "tempest-LiveMigrationTest-1381927866-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true, "dhcp_server": "10.100.0.3"}}], "meta": {"injected": false, "tenant_id": "d8e4b0fb12f14fbaa248291aa43aacee", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap3c861704-c5", "ovs_interfaceid": "3c861704-c594-42f8-a5b3-a274ec84650f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.521 274321 DEBUG nova.network.os_vif_util [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:5a:4a,bridge_name='br-int',has_traffic_filtering=True,id=3c861704-c594-42f8-a5b3-a274ec84650f,network=Network(9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3c861704-c5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.522 274321 DEBUG os_vif [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:5a:4a,bridge_name='br-int',has_traffic_filtering=True,id=3c861704-c594-42f8-a5b3-a274ec84650f,network=Network(9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3c861704-c5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.525 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 22 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.525 274321 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c861704-c5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.527 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.529 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.533 274321 INFO os_vif [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:5a:4a,bridge_name='br-int',has_traffic_filtering=True,id=3c861704-c594-42f8-a5b3-a274ec84650f,network=Network(9acb9cb3-fbe8-4ec2-bc71-dc5c4af33bf8),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tap3c861704-c5')
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.533 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.534 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.534 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.free_pci_device_allocations_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.535 274321 DEBUG nova.compute.manager [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Calling driver.cleanup from _post_live_migration _post_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:9349
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.535 274321 INFO nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Deleting instance files /var/lib/nova/instances/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_del
Feb 01 09:53:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:05.536 274321 INFO nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Deletion of /var/lib/nova/instances/aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469_del complete
Feb 01 09:53:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-39afd0bb396a392dfd50d36fe6caf2b1c9a1e9797d65ee8ff3803b1095d1a5f1-merged.mount: Deactivated successfully.
Feb 01 09:53:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-20b10dafd04dcdac1f900a4509924987511c6ee1e0225d7a81833bc4be8c96ac-userdata-shm.mount: Deactivated successfully.
Feb 01 09:53:05 np0005604215.localdomain systemd[1]: run-netns-ovnmeta\x2d9acb9cb3\x2dfbe8\x2d4ec2\x2dbc71\x2ddc5c4af33bf8.mount: Deactivated successfully.
Feb 01 09:53:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v128: 177 pgs: 177 active+clean; 273 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 43 KiB/s wr, 129 op/s
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.181 274321 DEBUG nova.compute.manager [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-vif-unplugged-3c861704-c594-42f8-a5b3-a274ec84650f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.182 274321 DEBUG oslo_concurrency.lockutils [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.182 274321 DEBUG oslo_concurrency.lockutils [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.182 274321 DEBUG oslo_concurrency.lockutils [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.183 274321 DEBUG nova.compute.manager [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] No waiting events found dispatching network-vif-unplugged-3c861704-c594-42f8-a5b3-a274ec84650f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.183 274321 DEBUG nova.compute.manager [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-vif-unplugged-3c861704-c594-42f8-a5b3-a274ec84650f for instance with task_state migrating. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.183 274321 DEBUG nova.compute.manager [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.183 274321 DEBUG oslo_concurrency.lockutils [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.184 274321 DEBUG oslo_concurrency.lockutils [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.184 274321 DEBUG oslo_concurrency.lockutils [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.184 274321 DEBUG nova.compute.manager [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] No waiting events found dispatching network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.185 274321 WARNING nova.compute.manager [req-f416e53e-abb5-42e1-b255-6e1114a6396e req-bd7fedae-6685-4b41-a9d8-b2bfe1575530 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received unexpected event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f for instance with vm_state active and task_state migrating.
Feb 01 09:53:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:06.335 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:06 np0005604215.localdomain ceph-mon[298604]: osdmap e103: 6 total, 6 up, 6 in
Feb 01 09:53:06 np0005604215.localdomain ceph-mon[298604]: pgmap v128: 177 pgs: 177 active+clean; 273 MiB data, 933 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 43 KiB/s wr, 129 op/s
Feb 01 09:53:06 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e104 e104: 6 total, 6 up, 6 in
Feb 01 09:53:07 np0005604215.localdomain ceph-mon[298604]: osdmap e104: 6 total, 6 up, 6 in
Feb 01 09:53:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v130: 177 pgs: 177 active+clean; 375 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 11 MiB/s rd, 12 MiB/s wr, 403 op/s
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.221 274321 DEBUG nova.compute.manager [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.221 274321 DEBUG oslo_concurrency.lockutils [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.222 274321 DEBUG oslo_concurrency.lockutils [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.223 274321 DEBUG oslo_concurrency.lockutils [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.223 274321 DEBUG nova.compute.manager [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] No waiting events found dispatching network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.223 274321 WARNING nova.compute.manager [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received unexpected event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f for instance with vm_state active and task_state migrating.
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.224 274321 DEBUG nova.compute.manager [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.224 274321 DEBUG oslo_concurrency.lockutils [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Acquiring lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.225 274321 DEBUG oslo_concurrency.lockutils [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.225 274321 DEBUG oslo_concurrency.lockutils [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.225 274321 DEBUG nova.compute.manager [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] No waiting events found dispatching network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.225 274321 WARNING nova.compute.manager [req-4c59c29d-830c-40ee-b3c6-7698c99c415d req-e10655f7-fe7e-4d0d-ada1-5bbbda1749a5 366b10c1124b4cc182e6512cf437f582 8c7e182e9edd4a9496010d2b1c99e9ab - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Received unexpected event network-vif-plugged-3c861704-c594-42f8-a5b3-a274ec84650f for instance with vm_state active and task_state migrating.
Feb 01 09:53:08 np0005604215.localdomain ceph-mon[298604]: pgmap v130: 177 pgs: 177 active+clean; 375 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 11 MiB/s rd, 12 MiB/s wr, 403 op/s
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.905 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Acquiring lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.906 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.906 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Lock "aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.929 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.929 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.930 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.930 274321 DEBUG nova.compute.resource_tracker [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:53:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:08.931 274321 DEBUG oslo_concurrency.processutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:53:09 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:53:09 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/452746631' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.375 274321 DEBUG oslo_concurrency.processutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:53:09 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2511160075' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:09 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/452746631' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.585 274321 WARNING nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.586 274321 DEBUG nova.compute.resource_tracker [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11703MB free_disk=41.567874908447266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": 
"0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.587 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.587 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.626 274321 DEBUG nova.compute.resource_tracker [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Migration for instance aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469 refers to another host's instance! _pair_instances_to_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:903
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.648 274321 DEBUG nova.compute.resource_tracker [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Skipping migration as instance is neither resizing nor live-migrating. _update_usage_from_migrations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1491
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.672 274321 DEBUG nova.compute.resource_tracker [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Migration ea09c78d-8a1e-497d-978c-c737a6e34821 is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.726 274321 DEBUG nova.compute.resource_tracker [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Instance 4239e79f-2907-476f-baff-d30c06ed6f5f has been scheduled to this compute host, the scheduler has made an allocation against this compute node but the instance has yet to start. Skipping heal of allocation: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1692
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.727 274321 DEBUG nova.compute.resource_tracker [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.727 274321 DEBUG nova.compute.resource_tracker [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.760 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Acquiring lock "4239e79f-2907-476f-baff-d30c06ed6f5f" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.760 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lock "4239e79f-2907-476f-baff-d30c06ed6f5f" acquired by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.761 274321 INFO nova.compute.manager [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Unshelving
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.806 274321 DEBUG oslo_concurrency.processutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:53:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:09.849 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e104 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v131: 177 pgs: 177 active+clean; 375 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.1 MiB/s rd, 10 MiB/s wr, 206 op/s
Feb 01 09:53:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:53:10 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/681495947' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.246 274321 DEBUG oslo_concurrency.processutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.440s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.252 274321 DEBUG nova.compute.provider_tree [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.268 274321 DEBUG nova.scheduler.client.report [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.293 274321 DEBUG nova.compute.resource_tracker [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.293 274321 DEBUG oslo_concurrency.lockutils [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.706s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.298 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.449s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.301 274321 INFO nova.compute.manager [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Migrating instance to np0005604213.localdomain finished successfully.
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.303 274321 DEBUG nova.objects.instance [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lazy-loading 'pci_requests' on Instance uuid 4239e79f-2907-476f-baff-d30c06ed6f5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.323 274321 DEBUG nova.objects.instance [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lazy-loading 'numa_topology' on Instance uuid 4239e79f-2907-476f-baff-d30c06ed6f5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.337 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.338 274321 INFO nova.compute.claims [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Claim successful on node np0005604215.localdomain
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.412 274321 INFO nova.scheduler.client.report [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] Deleted allocation for migration ea09c78d-8a1e-497d-978c-c737a6e34821
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.412 274321 DEBUG nova.virt.libvirt.driver [None req-73b7e1e8-a1ec-429c-ad9f-34aa9f07a43a 7818b8c14c694d9c97606ff05af9b8e2 ef9394e0b21548a491d64bf76f5f6368 - - default default] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Live migration monitoring is all done _live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10662
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.450 274321 DEBUG oslo_concurrency.processutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.567 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:10 np0005604215.localdomain ceph-mon[298604]: pgmap v131: 177 pgs: 177 active+clean; 375 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.1 MiB/s rd, 10 MiB/s wr, 206 op/s
Feb 01 09:53:10 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/681495947' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:53:10 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/77892895' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.914 274321 DEBUG oslo_concurrency.processutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.921 274321 DEBUG nova.compute.provider_tree [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.947 274321 DEBUG nova.scheduler.client.report [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:53:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:10.981 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.023 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Acquiring lock "refresh_cache-4239e79f-2907-476f-baff-d30c06ed6f5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.024 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Acquired lock "refresh_cache-4239e79f-2907-476f-baff-d30c06ed6f5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.024 274321 DEBUG nova.network.neutron [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.111 274321 DEBUG nova.network.neutron [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.337 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:11 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e105 e105: 6 total, 6 up, 6 in
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.563 274321 DEBUG nova.network.neutron [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.580 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Releasing lock "refresh_cache-4239e79f-2907-476f-baff-d30c06ed6f5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.582 274321 DEBUG nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.582 274321 INFO nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Creating image(s)
Feb 01 09:53:11 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/77892895' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:11 np0005604215.localdomain ceph-mon[298604]: osdmap e105: 6 total, 6 up, 6 in
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.642 274321 DEBUG nova.storage.rbd_utils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] rbd image 4239e79f-2907-476f-baff-d30c06ed6f5f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.648 274321 DEBUG nova.objects.instance [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lazy-loading 'trusted_certs' on Instance uuid 4239e79f-2907-476f-baff-d30c06ed6f5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.715 274321 DEBUG nova.storage.rbd_utils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] rbd image 4239e79f-2907-476f-baff-d30c06ed6f5f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.753 274321 DEBUG nova.storage.rbd_utils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] rbd image 4239e79f-2907-476f-baff-d30c06ed6f5f_disk does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.758 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Acquiring lock "07cd30132c7ce8edc7b720bc0da60a930c4de600" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.759 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lock "07cd30132c7ce8edc7b720bc0da60a930c4de600" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:11.943 274321 DEBUG nova.virt.libvirt.imagebackend [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Image locations are: [{'url': 'rbd://33fac0b9-80c7-560f-918a-c92d3021ca1e/images/5de7fa57-3d53-423f-a108-b9d18fedfc3f/snap', 'metadata': {'store': 'default_backend'}}, {'url': 'rbd://33fac0b9-80c7-560f-918a-c92d3021ca1e/images/5de7fa57-3d53-423f-a108-b9d18fedfc3f/snap', 'metadata': {}}] clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1085
Feb 01 09:53:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:12.022 274321 DEBUG nova.virt.libvirt.imagebackend [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Selected location: {'url': 'rbd://33fac0b9-80c7-560f-918a-c92d3021ca1e/images/5de7fa57-3d53-423f-a108-b9d18fedfc3f/snap', 'metadata': {'store': 'default_backend'}} clone /usr/lib/python3.9/site-packages/nova/virt/libvirt/imagebackend.py:1094
Feb 01 09:53:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:12.023 274321 DEBUG nova.storage.rbd_utils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] cloning images/5de7fa57-3d53-423f-a108-b9d18fedfc3f@snap to None/4239e79f-2907-476f-baff-d30c06ed6f5f_disk clone /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:261
Feb 01 09:53:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v133: 177 pgs: 177 active+clean; 375 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.1 MiB/s rd, 11 MiB/s wr, 207 op/s
Feb 01 09:53:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:12.199 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lock "07cd30132c7ce8edc7b720bc0da60a930c4de600" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.440s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:12.352 274321 DEBUG nova.objects.instance [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lazy-loading 'migration_context' on Instance uuid 4239e79f-2907-476f-baff-d30c06ed6f5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:53:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:12.451 274321 DEBUG nova.storage.rbd_utils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] flattening vms/4239e79f-2907-476f-baff-d30c06ed6f5f_disk flatten /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:314
Feb 01 09:53:12 np0005604215.localdomain ceph-mon[298604]: pgmap v133: 177 pgs: 177 active+clean; 375 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 7.1 MiB/s rd, 11 MiB/s wr, 207 op/s
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.321 274321 DEBUG nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Image rbd:vms/4239e79f-2907-476f-baff-d30c06ed6f5f_disk:id=openstack:conf=/etc/ceph/ceph.conf flattened successfully while unshelving instance. _try_fetch_image_cache /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11007
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.322 274321 DEBUG nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.322 274321 DEBUG nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Ensure instance console log exists: /var/lib/nova/instances/4239e79f-2907-476f-baff-d30c06ed6f5f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.322 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.323 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.323 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.324 274321 DEBUG nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Start _get_guest_xml network_info=[] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='',container_format='bare',created_at=2026-02-01T09:52:49Z,direct_url=<?>,disk_format='raw',id=5de7fa57-3d53-423f-a108-b9d18fedfc3f,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1815488958-shelved',owner='049ec09f02c049edbfda9ad51af738d7',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-02-01T09:53:07Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'encryption_options': None, 'device_name': '/dev/vda', 'encryption_format': None, 'boot_index': 0, 'encryption_secret_uuid': None, 'disk_bus': 'virtio', 'size': 0, 'guest_format': None, 'device_type': 'disk', 'encrypted': False, 'image_id': 'a223c2d3-3df7-4d82-921c-31ace200d43c'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.328 274321 WARNING nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.329 274321 DEBUG nova.virt.libvirt.host [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Searching host: 'np0005604215.localdomain' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.330 274321 DEBUG nova.virt.libvirt.host [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.331 274321 DEBUG nova.virt.libvirt.host [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Searching host: 'np0005604215.localdomain' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.332 274321 DEBUG nova.virt.libvirt.host [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.332 274321 DEBUG nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] CPU mode 'host-model' models '' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.332 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Getting desirable topologies for flavor Flavor(created_at=2026-02-01T09:50:53Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='04b6d75f-0335-413a-b9d6-dfe49d77feaf',id=5,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='',container_format='bare',created_at=2026-02-01T09:52:49Z,direct_url=<?>,disk_format='raw',id=5de7fa57-3d53-423f-a108-b9d18fedfc3f,min_disk=1,min_ram=0,name='tempest-UnshelveToHostMultiNodesTest-server-1815488958-shelved',owner='049ec09f02c049edbfda9ad51af738d7',properties=ImageMetaProps,protected=<?>,size=1073741824,status='active',tags=<?>,updated_at=2026-02-01T09:53:07Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.333 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.333 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.333 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.333 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.334 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.334 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.334 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.334 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.334 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.335 274321 DEBUG nova.virt.hardware [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.335 274321 DEBUG nova.objects.instance [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lazy-loading 'vcpu_model' on Instance uuid 4239e79f-2907-476f-baff-d30c06ed6f5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.384 274321 DEBUG oslo_concurrency.processutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:53:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:53:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:53:13 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 01 09:53:13 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3696398923' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.807 274321 DEBUG oslo_concurrency.processutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.844 274321 DEBUG nova.storage.rbd_utils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] rbd image 4239e79f-2907-476f-baff-d30c06ed6f5f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:53:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:13.849 274321 DEBUG oslo_concurrency.processutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Running cmd (subprocess): ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:53:13 np0005604215.localdomain podman[306183]: 2026-02-01 09:53:13.874264367 +0000 UTC m=+0.079851831 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:53:13 np0005604215.localdomain systemd[1]: tmp-crun.S3iEuz.mount: Deactivated successfully.
Feb 01 09:53:13 np0005604215.localdomain podman[306185]: 2026-02-01 09:53:13.948130919 +0000 UTC m=+0.152650929 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:53:13 np0005604215.localdomain podman[306185]: 2026-02-01 09:53:13.962653571 +0000 UTC m=+0.167173571 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:53:13 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:53:14 np0005604215.localdomain podman[306183]: 2026-02-01 09:53:14.01298666 +0000 UTC m=+0.218574104 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 01 09:53:14 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:53:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v134: 177 pgs: 177 active+clean; 226 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 6.5 MiB/s rd, 9.0 MiB/s wr, 358 op/s
Feb 01 09:53:14 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 01 09:53:14 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4276625103' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.273 274321 DEBUG oslo_concurrency.processutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] CMD "ceph mon dump --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.276 274321 DEBUG nova.objects.instance [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4239e79f-2907-476f-baff-d30c06ed6f5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.316 274321 DEBUG nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] End _get_guest_xml xml=<domain type="kvm">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   <uuid>4239e79f-2907-476f-baff-d30c06ed6f5f</uuid>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   <name>instance-00000006</name>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   <memory>131072</memory>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   <vcpu>1</vcpu>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   <metadata>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <nova:package version="27.5.2-0.20260127144738.eaa65f0.el9"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <nova:name>tempest-UnshelveToHostMultiNodesTest-server-1815488958</nova:name>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <nova:creationTime>2026-02-01 09:53:13</nova:creationTime>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <nova:flavor name="m1.nano">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <nova:memory>128</nova:memory>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <nova:disk>1</nova:disk>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <nova:swap>0</nova:swap>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <nova:ephemeral>0</nova:ephemeral>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <nova:vcpus>1</nova:vcpus>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       </nova:flavor>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <nova:owner>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <nova:user uuid="2d1e212774fc48c5970abb8787ca767f">tempest-UnshelveToHostMultiNodesTest-51338059-project-member</nova:user>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <nova:project uuid="049ec09f02c049edbfda9ad51af738d7">tempest-UnshelveToHostMultiNodesTest-51338059</nova:project>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       </nova:owner>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <nova:root type="image" uuid="5de7fa57-3d53-423f-a108-b9d18fedfc3f"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <nova:ports/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     </nova:instance>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   </metadata>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   <sysinfo type="smbios">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <system>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <entry name="manufacturer">RDO</entry>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <entry name="product">OpenStack Compute</entry>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <entry name="version">27.5.2-0.20260127144738.eaa65f0.el9</entry>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <entry name="serial">4239e79f-2907-476f-baff-d30c06ed6f5f</entry>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <entry name="uuid">4239e79f-2907-476f-baff-d30c06ed6f5f</entry>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <entry name="family">Virtual Machine</entry>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     </system>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   </sysinfo>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   <os>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <type arch="x86_64" machine="q35">hvm</type>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <boot dev="hd"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <smbios mode="sysinfo"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   </os>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   <features>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <acpi/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <apic/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <vmcoreinfo/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   </features>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   <clock offset="utc">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <timer name="pit" tickpolicy="delay"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <timer name="rtc" tickpolicy="catchup"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <timer name="hpet" present="no"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   </clock>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   <cpu mode="host-model" match="exact">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <topology sockets="1" cores="1" threads="1"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   </cpu>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   <devices>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <disk type="network" device="disk">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <driver type="raw" cache="none"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <source protocol="rbd" name="vms/4239e79f-2907-476f-baff-d30c06ed6f5f_disk">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.103" port="6789"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.104" port="6789"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.105" port="6789"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       </source>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <auth username="openstack">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <secret type="ceph" uuid="33fac0b9-80c7-560f-918a-c92d3021ca1e"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       </auth>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <target dev="vda" bus="virtio"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     </disk>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <disk type="network" device="cdrom">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <driver type="raw" cache="none"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <source protocol="rbd" name="vms/4239e79f-2907-476f-baff-d30c06ed6f5f_disk.config">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.103" port="6789"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.104" port="6789"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <host name="172.18.0.105" port="6789"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       </source>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <auth username="openstack">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:         <secret type="ceph" uuid="33fac0b9-80c7-560f-918a-c92d3021ca1e"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       </auth>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <target dev="sda" bus="sata"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     </disk>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <serial type="pty">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <log file="/var/lib/nova/instances/4239e79f-2907-476f-baff-d30c06ed6f5f/console.log" append="off"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     </serial>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <graphics type="vnc" autoport="yes" listen="::0"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <video>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <model type="virtio"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     </video>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <input type="tablet" bus="usb"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <input type="keyboard" bus="usb"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <rng model="virtio">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <backend model="random">/dev/urandom</backend>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     </rng>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="pci" model="pcie-root-port"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <controller type="usb" index="0"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     <memballoon model="virtio">
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:       <stats period="10"/>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:     </memballoon>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:   </devices>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: </domain>
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]:  _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.363 274321 DEBUG nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.364 274321 DEBUG nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.365 274321 INFO nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Using config drive
Feb 01 09:53:14 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3696398923' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.398 274321 DEBUG nova.storage.rbd_utils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] rbd image 4239e79f-2907-476f-baff-d30c06ed6f5f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.447 274321 DEBUG nova.objects.instance [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lazy-loading 'ec2_ids' on Instance uuid 4239e79f-2907-476f-baff-d30c06ed6f5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.474 274321 DEBUG nova.objects.instance [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lazy-loading 'keypairs' on Instance uuid 4239e79f-2907-476f-baff-d30c06ed6f5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.531 274321 INFO nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Creating config drive at /var/lib/nova/instances/4239e79f-2907-476f-baff-d30c06ed6f5f/disk.config
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.536 274321 DEBUG oslo_concurrency.processutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4239e79f-2907-476f-baff-d30c06ed6f5f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpupdz8vr7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.665 274321 DEBUG oslo_concurrency.processutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4239e79f-2907-476f-baff-d30c06ed6f5f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20260127144738.eaa65f0.el9 -quiet -J -r -V config-2 /tmp/tmpupdz8vr7" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.708 274321 DEBUG nova.storage.rbd_utils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] rbd image 4239e79f-2907-476f-baff-d30c06ed6f5f_disk.config does not exist __init__ /usr/lib/python3.9/site-packages/nova/storage/rbd_utils.py:80
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.714 274321 DEBUG oslo_concurrency.processutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Running cmd (subprocess): rbd import --pool vms /var/lib/nova/instances/4239e79f-2907-476f-baff-d30c06ed6f5f/disk.config 4239e79f-2907-476f-baff-d30c06ed6f5f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.929 274321 DEBUG oslo_concurrency.processutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] CMD "rbd import --pool vms /var/lib/nova/instances/4239e79f-2907-476f-baff-d30c06ed6f5f/disk.config 4239e79f-2907-476f-baff-d30c06ed6f5f_disk.config --image-format=2 --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.216s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:53:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:14.931 274321 INFO nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Deleting local config drive /var/lib/nova/instances/4239e79f-2907-476f-baff-d30c06ed6f5f/disk.config because it was imported into RBD.
Feb 01 09:53:14 np0005604215.localdomain systemd-machined[202466]: New machine qemu-3-instance-00000006.
Feb 01 09:53:15 np0005604215.localdomain systemd[1]: Started Virtual Machine qemu-3-instance-00000006.
Feb 01 09:53:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e105 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.343 274321 DEBUG nova.virt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Emitting event <LifecycleEvent: 1769939595.3430214, 4239e79f-2907-476f-baff-d30c06ed6f5f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.344 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] VM Resumed (Lifecycle Event)
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.347 274321 DEBUG nova.compute.manager [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.348 274321 DEBUG nova.virt.libvirt.driver [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.352 274321 INFO nova.virt.libvirt.driver [-] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Instance spawned successfully.
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.370 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.374 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.398 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.398 274321 DEBUG nova.virt.driver [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] Emitting event <LifecycleEvent: 1769939595.3442507, 4239e79f-2907-476f-baff-d30c06ed6f5f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.399 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] VM Started (Lifecycle Event)
Feb 01 09:53:15 np0005604215.localdomain ceph-mon[298604]: pgmap v134: 177 pgs: 177 active+clean; 226 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 6.5 MiB/s rd, 9.0 MiB/s wr, 358 op/s
Feb 01 09:53:15 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/4276625103' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.421 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.425 274321 DEBUG nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Synchronizing instance power state after lifecycle event "Started"; current vm_state: shelved_offloaded, current task_state: spawning, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.449 274321 INFO nova.compute.manager [None req-c47a18c5-3008-48c5-bac9-714d6e200798 - - - - - -] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] During sync_power_state the instance has a pending task (spawning). Skip.
Feb 01 09:53:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:15.594 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:16 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:53:16.056 2 INFO neutron.agent.securitygroups_rpc [None req-fe72f4fd-5cc1-4afa-94a0-35085a503c7b 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Security group member updated ['3c3daae5-f0f3-42a8-b893-8c534dcb0055']
Feb 01 09:53:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v135: 177 pgs: 177 active+clean; 226 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 5.4 MiB/s rd, 7.5 MiB/s wr, 297 op/s
Feb 01 09:53:16 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:16.342 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:16 np0005604215.localdomain ceph-mon[298604]: pgmap v135: 177 pgs: 177 active+clean; 226 MiB data, 873 MiB used, 41 GiB / 42 GiB avail; 5.4 MiB/s rd, 7.5 MiB/s wr, 297 op/s
Feb 01 09:53:16 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e106 e106: 6 total, 6 up, 6 in
Feb 01 09:53:17 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:17.117 274321 DEBUG nova.compute.manager [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:53:17 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:53:17.170 2 INFO neutron.agent.securitygroups_rpc [None req-847588ff-1f30-46d7-9f2d-cc2e866fd5e9 0416f10a8d4f4da2a6dc6cbd271a3010 d8e4b0fb12f14fbaa248291aa43aacee - - default default] Security group member updated ['3c3daae5-f0f3-42a8-b893-8c534dcb0055']
Feb 01 09:53:17 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:17.191 274321 DEBUG oslo_concurrency.lockutils [None req-18be2f1f-a66a-4029-b12b-e9a556c8f79e ade63c676767402eb16f3f5df77b141e c2c1f738d9b04a26b94dfcbe1966af64 - - default default] Lock "4239e79f-2907-476f-baff-d30c06ed6f5f" "released" by "nova.compute.manager.ComputeManager.unshelve_instance.<locals>.do_unshelve_instance" :: held 7.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:17 np0005604215.localdomain ceph-mon[298604]: osdmap e106: 6 total, 6 up, 6 in
Feb 01 09:53:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v137: 177 pgs: 177 active+clean; 226 MiB data, 951 MiB used, 41 GiB / 42 GiB avail; 9.3 MiB/s rd, 6.1 MiB/s wr, 404 op/s
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.184 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Acquiring lock "4239e79f-2907-476f-baff-d30c06ed6f5f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.185 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Lock "4239e79f-2907-476f-baff-d30c06ed6f5f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.185 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Acquiring lock "4239e79f-2907-476f-baff-d30c06ed6f5f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.186 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Lock "4239e79f-2907-476f-baff-d30c06ed6f5f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.186 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Lock "4239e79f-2907-476f-baff-d30c06ed6f5f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.189 274321 INFO nova.compute.manager [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Terminating instance
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.190 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Acquiring lock "refresh_cache-4239e79f-2907-476f-baff-d30c06ed6f5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.190 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Acquired lock "refresh_cache-4239e79f-2907-476f-baff-d30c06ed6f5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.191 274321 DEBUG nova.network.neutron [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.396 274321 DEBUG nova.network.neutron [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 01 09:53:18 np0005604215.localdomain ceph-mon[298604]: pgmap v137: 177 pgs: 177 active+clean; 226 MiB data, 951 MiB used, 41 GiB / 42 GiB avail; 9.3 MiB/s rd, 6.1 MiB/s wr, 404 op/s
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.653 274321 DEBUG nova.network.neutron [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.723 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Releasing lock "refresh_cache-4239e79f-2907-476f-baff-d30c06ed6f5f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.724 274321 DEBUG nova.compute.manager [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Feb 01 09:53:18 np0005604215.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Deactivated successfully.
Feb 01 09:53:18 np0005604215.localdomain systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000006.scope: Consumed 3.903s CPU time.
Feb 01 09:53:18 np0005604215.localdomain systemd-machined[202466]: Machine qemu-3-instance-00000006 terminated.
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.952 274321 INFO nova.virt.libvirt.driver [-] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Instance destroyed successfully.
Feb 01 09:53:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:18.953 274321 DEBUG nova.objects.instance [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Lazy-loading 'resources' on Instance uuid 4239e79f-2907-476f-baff-d30c06ed6f5f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.666 274321 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769939584.665273, aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.666 274321 INFO nova.compute.manager [-] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] VM Stopped (Lifecycle Event)
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.685 274321 DEBUG nova.compute.manager [None req-1fe7d5a0-e11e-4a4f-92c2-5e487da5699f - - - - - -] [instance: aa6bae02-2ed3-49c7-9c3d-2e8d69c1b469] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.729 274321 INFO nova.virt.libvirt.driver [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Deleting instance files /var/lib/nova/instances/4239e79f-2907-476f-baff-d30c06ed6f5f_del
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.730 274321 INFO nova.virt.libvirt.driver [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Deletion of /var/lib/nova/instances/4239e79f-2907-476f-baff-d30c06ed6f5f_del complete
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.775 274321 INFO nova.compute.manager [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Took 1.05 seconds to destroy the instance on the hypervisor.
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.776 274321 DEBUG oslo.service.loopingcall [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.776 274321 DEBUG nova.compute.manager [-] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.777 274321 DEBUG nova.network.neutron [-] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Feb 01 09:53:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:53:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.867 274321 DEBUG nova.network.neutron [-] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Feb 01 09:53:19 np0005604215.localdomain podman[306410]: 2026-02-01 09:53:19.874190401 +0000 UTC m=+0.082379629 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, com.redhat.component=ubi9-minimal-container, version=9.7, architecture=x86_64, vcs-type=git, container_name=openstack_network_exporter, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z)
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.890 274321 DEBUG nova.network.neutron [-] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Feb 01 09:53:19 np0005604215.localdomain podman[306410]: 2026-02-01 09:53:19.891636665 +0000 UTC m=+0.099825883 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, release=1769056855, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, maintainer=Red Hat, Inc.)
Feb 01 09:53:19 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.911 274321 INFO nova.compute.manager [-] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Took 0.13 seconds to deallocate network for instance.
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.953 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:19.954 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:19 np0005604215.localdomain podman[306411]: 2026-02-01 09:53:19.997433963 +0000 UTC m=+0.200921294 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent)
Feb 01 09:53:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:20.016 274321 DEBUG oslo_concurrency.processutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:53:20 np0005604215.localdomain podman[306411]: 2026-02-01 09:53:20.032706302 +0000 UTC m=+0.236193703 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:53:20 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:53:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e106 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v138: 177 pgs: 177 active+clean; 226 MiB data, 951 MiB used, 41 GiB / 42 GiB avail; 8.5 MiB/s rd, 5.6 MiB/s wr, 371 op/s
Feb 01 09:53:20 np0005604215.localdomain ceph-mon[298604]: pgmap v138: 177 pgs: 177 active+clean; 226 MiB data, 951 MiB used, 41 GiB / 42 GiB avail; 8.5 MiB/s rd, 5.6 MiB/s wr, 371 op/s
Feb 01 09:53:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:53:20 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1092996455' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:20.506 274321 DEBUG oslo_concurrency.processutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.490s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:53:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:20.513 274321 DEBUG nova.compute.provider_tree [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:53:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:20.532 274321 DEBUG nova.scheduler.client.report [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:53:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:20.555 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.601s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:20.581 274321 INFO nova.scheduler.client.report [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Deleted allocations for instance 4239e79f-2907-476f-baff-d30c06ed6f5f
Feb 01 09:53:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:20.640 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:20.713 274321 DEBUG oslo_concurrency.lockutils [None req-9d8f1100-e1dd-4ab5-9e1c-f13fdf5e3656 2d1e212774fc48c5970abb8787ca767f 049ec09f02c049edbfda9ad51af738d7 - - default default] Lock "4239e79f-2907-476f-baff-d30c06ed6f5f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.529s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:21 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1092996455' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:53:21
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['manila_metadata', 'manila_data', '.mgr', 'backups', 'vms', 'images', 'volumes']
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 09:53:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:21.344 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:53:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e107 e107: 6 total, 6 up, 6 in
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.006578574295086544 of space, bias 1.0, pg target 1.315714859017309 quantized to 32 (current 32)
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8555772569444443 quantized to 32 (current 32)
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.7263051367950866e-06 of space, bias 4.0, pg target 0.0021628687418574354 quantized to 16 (current 16)
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:53:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:53:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v140: 177 pgs: 177 active+clean; 226 MiB data, 951 MiB used, 41 GiB / 42 GiB avail; 8.7 MiB/s rd, 5.8 MiB/s wr, 218 op/s
Feb 01 09:53:22 np0005604215.localdomain ceph-mon[298604]: osdmap e107: 6 total, 6 up, 6 in
Feb 01 09:53:22 np0005604215.localdomain ceph-mon[298604]: pgmap v140: 177 pgs: 177 active+clean; 226 MiB data, 951 MiB used, 41 GiB / 42 GiB avail; 8.7 MiB/s rd, 5.8 MiB/s wr, 218 op/s
Feb 01 09:53:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v141: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 8.8 MiB/s rd, 5.8 MiB/s wr, 259 op/s
Feb 01 09:53:24 np0005604215.localdomain ceph-mon[298604]: pgmap v141: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 8.8 MiB/s rd, 5.8 MiB/s wr, 259 op/s
Feb 01 09:53:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:25.669 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v142: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 7.2 MiB/s rd, 4.8 MiB/s wr, 213 op/s
Feb 01 09:53:26 np0005604215.localdomain ceph-mon[298604]: pgmap v142: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 7.2 MiB/s rd, 4.8 MiB/s wr, 213 op/s
Feb 01 09:53:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:26.347 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:53:26 np0005604215.localdomain systemd[1]: tmp-crun.okSPaM.mount: Deactivated successfully.
Feb 01 09:53:26 np0005604215.localdomain podman[306470]: 2026-02-01 09:53:26.877609604 +0000 UTC m=+0.089185361 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true)
Feb 01 09:53:26 np0005604215.localdomain podman[306470]: 2026-02-01 09:53:26.916711973 +0000 UTC m=+0.128287680 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 01 09:53:26 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:53:27 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:27.817 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:53:27 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:27.818 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:53:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:27.819 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v143: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 1.4 KiB/s wr, 32 op/s
Feb 01 09:53:28 np0005604215.localdomain ceph-mon[298604]: pgmap v143: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 1.4 KiB/s wr, 32 op/s
Feb 01 09:53:28 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:28.821 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:53:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:53:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:53:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:53:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 09:53:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:53:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18295 "" "Go-http-client/1.1"
Feb 01 09:53:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e107 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v144: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 1.4 KiB/s wr, 32 op/s
Feb 01 09:53:30 np0005604215.localdomain ceph-mon[298604]: pgmap v144: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 1.4 KiB/s wr, 32 op/s
Feb 01 09:53:30 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:30.702 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:31.350 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e108 e108: 6 total, 6 up, 6 in
Feb 01 09:53:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:53:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:53:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:53:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:53:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:53:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:53:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:53:31 np0005604215.localdomain podman[306489]: 2026-02-01 09:53:31.666325683 +0000 UTC m=+0.078585690 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:53:31 np0005604215.localdomain podman[306489]: 2026-02-01 09:53:31.67842243 +0000 UTC m=+0.090682447 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:53:31 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:53:32 np0005604215.localdomain sudo[306511]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:53:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v146: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 1.4 KiB/s wr, 32 op/s
Feb 01 09:53:32 np0005604215.localdomain sudo[306511]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:53:32 np0005604215.localdomain sudo[306511]: pam_unix(sudo:session): session closed for user root
Feb 01 09:53:32 np0005604215.localdomain sudo[306529]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:53:32 np0005604215.localdomain sudo[306529]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:53:32 np0005604215.localdomain ceph-mon[298604]: osdmap e108: 6 total, 6 up, 6 in
Feb 01 09:53:32 np0005604215.localdomain ceph-mon[298604]: pgmap v146: 177 pgs: 177 active+clean; 145 MiB data, 742 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 1.4 KiB/s wr, 32 op/s
Feb 01 09:53:32 np0005604215.localdomain sudo[306529]: pam_unix(sudo:session): session closed for user root
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:53:33 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 7815552d-7e18-42d6-bd21-6acc14c1a76e (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:53:33 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 7815552d-7e18-42d6-bd21-6acc14c1a76e (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:53:33 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 7815552d-7e18-42d6-bd21-6acc14c1a76e (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:53:33 np0005604215.localdomain sudo[306578]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:53:33 np0005604215.localdomain sudo[306578]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:53:33 np0005604215.localdomain sudo[306578]: pam_unix(sudo:session): session closed for user root
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:53:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e109 e109: 6 total, 6 up, 6 in
Feb 01 09:53:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:33.950 274321 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1769939598.9493322, 4239e79f-2907-476f-baff-d30c06ed6f5f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Feb 01 09:53:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:33.951 274321 INFO nova.compute.manager [-] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] VM Stopped (Lifecycle Event)
Feb 01 09:53:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:33.976 274321 DEBUG nova.compute.manager [None req-7e9157d0-d47c-4b39-aa9d-529866017e1a - - - - - -] [instance: 4239e79f-2907-476f-baff-d30c06ed6f5f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Feb 01 09:53:34 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:53:34.098 2 INFO neutron.agent.securitygroups_rpc [req-75bc0aa1-37ef-492b-a96a-ae9080ee75e0 req-edda2748-5bcc-42c3-8a62-8fe3b52553b6 dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['f0c61cda-1998-487f-b5b2-ae9c4848f56a']
Feb 01 09:53:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v148: 177 pgs: 177 active+clean; 145 MiB data, 746 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.1 KiB/s wr, 18 op/s
Feb 01 09:53:34 np0005604215.localdomain snmpd[67757]: empty variable list in _query
Feb 01 09:53:34 np0005604215.localdomain snmpd[67757]: empty variable list in _query
Feb 01 09:53:34 np0005604215.localdomain snmpd[67757]: empty variable list in _query
Feb 01 09:53:34 np0005604215.localdomain snmpd[67757]: empty variable list in _query
Feb 01 09:53:34 np0005604215.localdomain snmpd[67757]: empty variable list in _query
Feb 01 09:53:34 np0005604215.localdomain snmpd[67757]: empty variable list in _query
Feb 01 09:53:34 np0005604215.localdomain ceph-mon[298604]: osdmap e109: 6 total, 6 up, 6 in
Feb 01 09:53:34 np0005604215.localdomain ceph-mon[298604]: pgmap v148: 177 pgs: 177 active+clean; 145 MiB data, 746 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.1 KiB/s wr, 18 op/s
Feb 01 09:53:34 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:53:34.836 2 INFO neutron.agent.securitygroups_rpc [req-82289f6f-42d9-438b-b261-05589eed2efe req-98f019aa-b49d-4aef-94dc-ee96ed3719e9 dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['ada4c3f2-cdfe-4dd3-85f7-4e743664f11d']
Feb 01 09:53:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2427119556' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:53:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2427119556' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:53:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:35.733 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:35 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:53:35.788 2 INFO neutron.agent.securitygroups_rpc [req-91571c20-84f3-4df1-9546-4b115d3d0f93 req-0f489c88-5b53-4bd7-842f-0fffa7ebc222 dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['879f68ae-8832-4697-b764-9db0f8c3108c']
Feb 01 09:53:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v149: 177 pgs: 177 active+clean; 145 MiB data, 746 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.1 KiB/s wr, 18 op/s
Feb 01 09:53:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:36.352 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:36 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:53:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:53:36 np0005604215.localdomain ceph-mon[298604]: pgmap v149: 177 pgs: 177 active+clean; 145 MiB data, 746 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.1 KiB/s wr, 18 op/s
Feb 01 09:53:36 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:53:36.623 2 INFO neutron.agent.securitygroups_rpc [req-876e83f6-6b17-43eb-b040-065589623e5f req-6a5cb928-de96-4098-a70d-23e5abe4d6ce dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['98cb19e2-acc2-4297-8b83-10025f09d04b']
Feb 01 09:53:37 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:53:37 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:53:37.598 2 INFO neutron.agent.securitygroups_rpc [req-92fefcaa-7c6d-4e1a-a20a-67097b188e7e req-0e7bfb69-2aec-4c7e-b452-953e89b3814f dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['374381c7-702b-4257-92ff-7af171862681']
Feb 01 09:53:37 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:53:37.768 2 INFO neutron.agent.securitygroups_rpc [req-2c86665d-931d-4ec0-bcba-f7d3083dc82f req-7b6541b5-97df-453e-9e47-7bd01fb85ab0 dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['374381c7-702b-4257-92ff-7af171862681']
Feb 01 09:53:38 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:53:38.109 2 INFO neutron.agent.securitygroups_rpc [req-eb564804-d725-4326-adb0-58732ff445a4 req-615b10cf-445f-485b-a611-28146c91f72c dca8da9c475e44f19383733eded7ebf5 e44a50a3d96541748629cacff5ef78b0 - - default default] Security group rule updated ['374381c7-702b-4257-92ff-7af171862681']
Feb 01 09:53:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v150: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 3.9 KiB/s wr, 49 op/s
Feb 01 09:53:38 np0005604215.localdomain ceph-mon[298604]: pgmap v150: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 3.9 KiB/s wr, 49 op/s
Feb 01 09:53:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e109 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v151: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 3.5 KiB/s wr, 45 op/s
Feb 01 09:53:40 np0005604215.localdomain ceph-mon[298604]: pgmap v151: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 32 KiB/s rd, 3.5 KiB/s wr, 45 op/s
Feb 01 09:53:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:40.735 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:41.354 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e110 e110: 6 total, 6 up, 6 in
Feb 01 09:53:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:41.771 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:41.772 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:41.772 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v153: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.6 KiB/s wr, 28 op/s
Feb 01 09:53:42 np0005604215.localdomain ceph-mon[298604]: osdmap e110: 6 total, 6 up, 6 in
Feb 01 09:53:42 np0005604215.localdomain ceph-mon[298604]: pgmap v153: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 21 KiB/s rd, 1.6 KiB/s wr, 28 op/s
Feb 01 09:53:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v154: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Feb 01 09:53:44 np0005604215.localdomain ceph-mon[298604]: pgmap v154: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Feb 01 09:53:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:53:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:53:44 np0005604215.localdomain podman[306596]: 2026-02-01 09:53:44.87314658 +0000 UTC m=+0.084199276 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:53:44 np0005604215.localdomain podman[306597]: 2026-02-01 09:53:44.915698477 +0000 UTC m=+0.127064382 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:53:44 np0005604215.localdomain podman[306597]: 2026-02-01 09:53:44.928457394 +0000 UTC m=+0.139823309 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:53:44 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:53:44 np0005604215.localdomain podman[306596]: 2026-02-01 09:53:44.943737651 +0000 UTC m=+0.154790397 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Feb 01 09:53:44 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:53:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:45.737 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:46.102 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:53:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:46.103 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:53:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:46.103 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:53:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:46.119 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:53:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v155: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Feb 01 09:53:46 np0005604215.localdomain ceph-mon[298604]: pgmap v155: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Feb 01 09:53:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:46.356 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v156: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:53:48 np0005604215.localdomain ceph-mon[298604]: pgmap v156: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:53:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:49.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:53:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:49.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:53:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:50.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:53:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e110 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v157: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:53:50 np0005604215.localdomain ceph-mon[298604]: pgmap v157: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:53:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:53:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:53:50 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:53:50.367 259225 INFO neutron.agent.linux.ip_lib [None req-d99bc822-9c44-49c0-bb45-91a23f333e23 - - - - - -] Device tapf87036fa-d5 cannot be used as it has no MAC address
Feb 01 09:53:50 np0005604215.localdomain systemd[1]: tmp-crun.Xe1ULa.mount: Deactivated successfully.
Feb 01 09:53:50 np0005604215.localdomain podman[306644]: 2026-02-01 09:53:50.396781238 +0000 UTC m=+0.094109004 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, architecture=x86_64, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.7, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_id=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal)
Feb 01 09:53:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:50.403 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:50 np0005604215.localdomain kernel: device tapf87036fa-d5 entered promiscuous mode
Feb 01 09:53:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:50.413 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:50 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:50Z|00108|binding|INFO|Claiming lport f87036fa-d537-4b85-b37c-c486487fff03 for this chassis.
Feb 01 09:53:50 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:50Z|00109|binding|INFO|f87036fa-d537-4b85-b37c-c486487fff03: Claiming unknown
Feb 01 09:53:50 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939630.4162] manager: (tapf87036fa-d5): new Generic device (/org/freedesktop/NetworkManager/Devices/24)
Feb 01 09:53:50 np0005604215.localdomain systemd-udevd[306682]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:53:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:50.423 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-64c4abd2-68ab-4da2-b883-4056dccfe81b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-64c4abd2-68ab-4da2-b883-4056dccfe81b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1713821b0f794e3b830e51e1263a38e8', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a20ebc6e-8957-43a9-8b71-59702d481dc9, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=f87036fa-d537-4b85-b37c-c486487fff03) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:53:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:50.425 158655 INFO neutron.agent.ovn.metadata.agent [-] Port f87036fa-d537-4b85-b37c-c486487fff03 in datapath 64c4abd2-68ab-4da2-b883-4056dccfe81b bound to our chassis
Feb 01 09:53:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:50.426 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 64c4abd2-68ab-4da2-b883-4056dccfe81b or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:53:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:50.428 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[07484f54-37ff-4631-b351-2d0a841ee2cc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapf87036fa-d5: No such device
Feb 01 09:53:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapf87036fa-d5: No such device
Feb 01 09:53:50 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:50Z|00110|binding|INFO|Setting lport f87036fa-d537-4b85-b37c-c486487fff03 ovn-installed in OVS
Feb 01 09:53:50 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:50Z|00111|binding|INFO|Setting lport f87036fa-d537-4b85-b37c-c486487fff03 up in Southbound
Feb 01 09:53:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:50.457 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapf87036fa-d5: No such device
Feb 01 09:53:50 np0005604215.localdomain podman[306646]: 2026-02-01 09:53:50.460480063 +0000 UTC m=+0.154130615 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 01 09:53:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapf87036fa-d5: No such device
Feb 01 09:53:50 np0005604215.localdomain podman[306644]: 2026-02-01 09:53:50.465428178 +0000 UTC m=+0.162755944 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.7, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible)
Feb 01 09:53:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapf87036fa-d5: No such device
Feb 01 09:53:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapf87036fa-d5: No such device
Feb 01 09:53:50 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:53:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapf87036fa-d5: No such device
Feb 01 09:53:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapf87036fa-d5: No such device
Feb 01 09:53:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:50.499 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:50 np0005604215.localdomain podman[306646]: 2026-02-01 09:53:50.520890086 +0000 UTC m=+0.214540638 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 01 09:53:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:50.530 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:50 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:53:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:50.738 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:53:50.804 2 INFO neutron.agent.securitygroups_rpc [None req-200cf6df-4bba-4fb6-b3b8-7b487bc0871d 3ef0026b934441b28e0635d7a99bc592 d1284af7476748758a037c2a7d34b7a2 - - default default] Security group member updated ['02728618-05ed-4a37-93a2-59fcc09c3239']
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.121 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.121 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.122 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.122 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.122 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:53:51 np0005604215.localdomain podman[306778]: 2026-02-01 09:53:51.331223545 +0000 UTC m=+0.080505690 container create 2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64c4abd2-68ab-4da2-b883-4056dccfe81b, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.359 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:51 np0005604215.localdomain podman[306778]: 2026-02-01 09:53:51.287327787 +0000 UTC m=+0.036610002 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:53:51 np0005604215.localdomain systemd[1]: Started libpod-conmon-2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b.scope.
Feb 01 09:53:51 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:53:51 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a21cdbba4626f33fa36df1df6f6f66a3be4030297ce337d866e792ea89e48cc9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:53:51 np0005604215.localdomain podman[306778]: 2026-02-01 09:53:51.434827735 +0000 UTC m=+0.184109880 container init 2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64c4abd2-68ab-4da2-b883-4056dccfe81b, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:53:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:53:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:53:51 np0005604215.localdomain podman[306778]: 2026-02-01 09:53:51.45616838 +0000 UTC m=+0.205450525 container start 2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64c4abd2-68ab-4da2-b883-4056dccfe81b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 01 09:53:51 np0005604215.localdomain dnsmasq[306796]: started, version 2.85 cachesize 150
Feb 01 09:53:51 np0005604215.localdomain dnsmasq[306796]: DNS service limited to local subnets
Feb 01 09:53:51 np0005604215.localdomain dnsmasq[306796]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:53:51 np0005604215.localdomain dnsmasq[306796]: warning: no upstream servers configured
Feb 01 09:53:51 np0005604215.localdomain dnsmasq-dhcp[306796]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:53:51 np0005604215.localdomain dnsmasq[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/addn_hosts - 0 addresses
Feb 01 09:53:51 np0005604215.localdomain dnsmasq-dhcp[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/host
Feb 01 09:53:51 np0005604215.localdomain dnsmasq-dhcp[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/opts
Feb 01 09:53:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:53:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:53:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:53:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:53:51 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:53:51.499 2 INFO neutron.agent.securitygroups_rpc [None req-07f1805c-f1e5-49eb-9bf2-554d43f01479 3ef0026b934441b28e0635d7a99bc592 d1284af7476748758a037c2a7d34b7a2 - - default default] Security group member updated ['02728618-05ed-4a37-93a2-59fcc09c3239']
Feb 01 09:53:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:53:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/987860528' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.533 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.410s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:53:51 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:53:51.626 259225 INFO neutron.agent.dhcp.agent [None req-c0440d8d-6f1b-4813-bbfd-fb17e4f3bc44 - - - - - -] DHCP configuration for ports {'4042c1e9-20ae-449b-8b73-91339d0f2377'} is completed
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.741 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.742 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11671MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.742 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.743 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.791 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.791 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:53:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:51.807 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:53:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v158: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:53:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:53:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3703738378' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:52.223 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.416s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:53:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:52.230 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:53:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:52.251 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:53:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:52.282 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:53:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:52.283 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.540s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:53:52 np0005604215.localdomain systemd[1]: tmp-crun.O9XdB6.mount: Deactivated successfully.
Feb 01 09:53:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/987860528' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:52 np0005604215.localdomain ceph-mon[298604]: pgmap v158: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:53:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3703738378' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:53.281 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:53:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:53.302 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:53:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:53.303 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:53:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:53.303 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:53:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e111 e111: 6 total, 6 up, 6 in
Feb 01 09:53:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2673048468' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/242006960' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v160: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:53:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:53:54.361 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:53:54Z, description=, device_id=cd47e43f-fc78-414f-aa7f-74876586e763, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003236fe50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003236f0d0>], id=a7dd6bf3-338e-4748-ab7d-96e7f31fe4ba, ip_allocation=immediate, mac_address=fa:16:3e:92:32:fc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:53:48Z, description=, dns_domain=, id=64c4abd2-68ab-4da2-b883-4056dccfe81b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-960610517-network, port_security_enabled=True, project_id=1713821b0f794e3b830e51e1263a38e8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34016, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=906, status=ACTIVE, subnets=['d305c769-28bb-47c7-92e1-5f2d5081f6eb'], tags=[], tenant_id=1713821b0f794e3b830e51e1263a38e8, updated_at=2026-02-01T09:53:49Z, vlan_transparent=None, network_id=64c4abd2-68ab-4da2-b883-4056dccfe81b, port_security_enabled=False, project_id=1713821b0f794e3b830e51e1263a38e8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=945, status=DOWN, tags=[], tenant_id=1713821b0f794e3b830e51e1263a38e8, updated_at=2026-02-01T09:53:54Z on network 64c4abd2-68ab-4da2-b883-4056dccfe81b
Feb 01 09:53:54 np0005604215.localdomain dnsmasq[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/addn_hosts - 1 addresses
Feb 01 09:53:54 np0005604215.localdomain dnsmasq-dhcp[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/host
Feb 01 09:53:54 np0005604215.localdomain dnsmasq-dhcp[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/opts
Feb 01 09:53:54 np0005604215.localdomain podman[306837]: 2026-02-01 09:53:54.563480697 +0000 UTC m=+0.059200116 container kill 2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64c4abd2-68ab-4da2-b883-4056dccfe81b, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 01 09:53:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:53:54.787 259225 INFO neutron.agent.dhcp.agent [None req-a5280b46-cccf-45c4-ad12-83144e7c5d92 - - - - - -] DHCP configuration for ports {'a7dd6bf3-338e-4748-ab7d-96e7f31fe4ba'} is completed
Feb 01 09:53:54 np0005604215.localdomain ceph-mon[298604]: osdmap e111: 6 total, 6 up, 6 in
Feb 01 09:53:54 np0005604215.localdomain ceph-mon[298604]: pgmap v160: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:53:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e111 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:53:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:55.740 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3002027950' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3808040281' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:53:55 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:53:55.902 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:53:54Z, description=, device_id=cd47e43f-fc78-414f-aa7f-74876586e763, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032335250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032335b80>], id=a7dd6bf3-338e-4748-ab7d-96e7f31fe4ba, ip_allocation=immediate, mac_address=fa:16:3e:92:32:fc, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:53:48Z, description=, dns_domain=, id=64c4abd2-68ab-4da2-b883-4056dccfe81b, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ServersTestFqdnHostnames-960610517-network, port_security_enabled=True, project_id=1713821b0f794e3b830e51e1263a38e8, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=34016, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=906, status=ACTIVE, subnets=['d305c769-28bb-47c7-92e1-5f2d5081f6eb'], tags=[], tenant_id=1713821b0f794e3b830e51e1263a38e8, updated_at=2026-02-01T09:53:49Z, vlan_transparent=None, network_id=64c4abd2-68ab-4da2-b883-4056dccfe81b, port_security_enabled=False, project_id=1713821b0f794e3b830e51e1263a38e8, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=945, status=DOWN, tags=[], tenant_id=1713821b0f794e3b830e51e1263a38e8, updated_at=2026-02-01T09:53:54Z on network 64c4abd2-68ab-4da2-b883-4056dccfe81b
Feb 01 09:53:56 np0005604215.localdomain systemd[1]: tmp-crun.yWenqp.mount: Deactivated successfully.
Feb 01 09:53:56 np0005604215.localdomain dnsmasq[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/addn_hosts - 1 addresses
Feb 01 09:53:56 np0005604215.localdomain dnsmasq-dhcp[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/host
Feb 01 09:53:56 np0005604215.localdomain dnsmasq-dhcp[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/opts
Feb 01 09:53:56 np0005604215.localdomain podman[306876]: 2026-02-01 09:53:56.145089978 +0000 UTC m=+0.075113053 container kill 2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64c4abd2-68ab-4da2-b883-4056dccfe81b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 01 09:53:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v161: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:53:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:56.362 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:56 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:53:56.458 259225 INFO neutron.agent.dhcp.agent [None req-5315bbd6-1862-429e-951c-d2c0ebb5080f - - - - - -] DHCP configuration for ports {'a7dd6bf3-338e-4748-ab7d-96e7f31fe4ba'} is completed
Feb 01 09:53:56 np0005604215.localdomain ceph-mon[298604]: pgmap v161: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:53:57 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:57Z|00112|ovn_bfd|INFO|Enabled BFD on interface ovn-2186fb-0
Feb 01 09:53:57 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:57Z|00113|ovn_bfd|INFO|Enabled BFD on interface ovn-e1cc33-0
Feb 01 09:53:57 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:57Z|00114|ovn_bfd|INFO|Enabled BFD on interface ovn-45aa31-0
Feb 01 09:53:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:57.313 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:57.317 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:57.337 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:57.343 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:57.377 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:53:57 np0005604215.localdomain systemd[1]: tmp-crun.qbi4ky.mount: Deactivated successfully.
Feb 01 09:53:57 np0005604215.localdomain podman[306899]: 2026-02-01 09:53:57.892520757 +0000 UTC m=+0.103296531 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, config_id=ceilometer_agent_compute)
Feb 01 09:53:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e112 e112: 6 total, 6 up, 6 in
Feb 01 09:53:57 np0005604215.localdomain podman[306899]: 2026-02-01 09:53:57.904636344 +0000 UTC m=+0.115412088 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:53:57 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:53:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v163: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 KiB/s wr, 18 op/s
Feb 01 09:53:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:58.286 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:58 np0005604215.localdomain dnsmasq[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/addn_hosts - 0 addresses
Feb 01 09:53:58 np0005604215.localdomain dnsmasq-dhcp[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/host
Feb 01 09:53:58 np0005604215.localdomain podman[306934]: 2026-02-01 09:53:58.503346577 +0000 UTC m=+0.062337554 container kill 2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64c4abd2-68ab-4da2-b883-4056dccfe81b, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3)
Feb 01 09:53:58 np0005604215.localdomain dnsmasq-dhcp[306796]: read /var/lib/neutron/dhcp/64c4abd2-68ab-4da2-b883-4056dccfe81b/opts
Feb 01 09:53:58 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:58Z|00115|ovn_bfd|INFO|Disabled BFD on interface ovn-2186fb-0
Feb 01 09:53:58 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:58Z|00116|ovn_bfd|INFO|Disabled BFD on interface ovn-e1cc33-0
Feb 01 09:53:58 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:58Z|00117|ovn_bfd|INFO|Disabled BFD on interface ovn-45aa31-0
Feb 01 09:53:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:58.566 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:58.569 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:58.588 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:58.677 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:58 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:58Z|00118|binding|INFO|Releasing lport f87036fa-d537-4b85-b37c-c486487fff03 from this chassis (sb_readonly=0)
Feb 01 09:53:58 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:53:58Z|00119|binding|INFO|Setting lport f87036fa-d537-4b85-b37c-c486487fff03 down in Southbound
Feb 01 09:53:58 np0005604215.localdomain kernel: device tapf87036fa-d5 left promiscuous mode
Feb 01 09:53:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:58.691 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-64c4abd2-68ab-4da2-b883-4056dccfe81b', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-64c4abd2-68ab-4da2-b883-4056dccfe81b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1713821b0f794e3b830e51e1263a38e8', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a20ebc6e-8957-43a9-8b71-59702d481dc9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=f87036fa-d537-4b85-b37c-c486487fff03) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:53:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:58.692 158655 INFO neutron.agent.ovn.metadata.agent [-] Port f87036fa-d537-4b85-b37c-c486487fff03 in datapath 64c4abd2-68ab-4da2-b883-4056dccfe81b unbound from our chassis
Feb 01 09:53:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:58.695 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 64c4abd2-68ab-4da2-b883-4056dccfe81b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:53:58 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:53:58.696 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[6fe2c731-9fcb-44b7-b736-2e27922f341e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:53:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:53:58.711 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:53:58 np0005604215.localdomain ceph-mon[298604]: osdmap e112: 6 total, 6 up, 6 in
Feb 01 09:53:58 np0005604215.localdomain ceph-mon[298604]: pgmap v163: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 KiB/s wr, 18 op/s
Feb 01 09:54:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:54:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:54:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:54:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157180 "" "Go-http-client/1.1"
Feb 01 09:54:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:54:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18771 "" "Go-http-client/1.1"
Feb 01 09:54:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v164: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 KiB/s wr, 18 op/s
Feb 01 09:54:00 np0005604215.localdomain ceph-mon[298604]: pgmap v164: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 2.0 KiB/s wr, 18 op/s
Feb 01 09:54:00 np0005604215.localdomain dnsmasq[306796]: exiting on receipt of SIGTERM
Feb 01 09:54:00 np0005604215.localdomain podman[306974]: 2026-02-01 09:54:00.736475665 +0000 UTC m=+0.059579918 container kill 2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64c4abd2-68ab-4da2-b883-4056dccfe81b, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 01 09:54:00 np0005604215.localdomain systemd[1]: libpod-2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b.scope: Deactivated successfully.
Feb 01 09:54:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:00.742 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:00 np0005604215.localdomain podman[306987]: 2026-02-01 09:54:00.818036378 +0000 UTC m=+0.062692385 container died 2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64c4abd2-68ab-4da2-b883-4056dccfe81b, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 01 09:54:00 np0005604215.localdomain systemd[1]: tmp-crun.awsOCc.mount: Deactivated successfully.
Feb 01 09:54:00 np0005604215.localdomain podman[306987]: 2026-02-01 09:54:00.854100602 +0000 UTC m=+0.098756569 container cleanup 2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64c4abd2-68ab-4da2-b883-4056dccfe81b, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:54:00 np0005604215.localdomain systemd[1]: libpod-conmon-2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b.scope: Deactivated successfully.
Feb 01 09:54:00 np0005604215.localdomain podman[306988]: 2026-02-01 09:54:00.900625072 +0000 UTC m=+0.140945435 container remove 2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-64c4abd2-68ab-4da2-b883-4056dccfe81b, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 01 09:54:00 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:00.930 259225 INFO neutron.agent.dhcp.agent [None req-400aff62-7135-4736-b3a6-720eef750e81 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:54:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:01.340 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:54:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:01.366 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:54:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:54:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:54:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:54:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:54:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:54:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-a21cdbba4626f33fa36df1df6f6f66a3be4030297ce337d866e792ea89e48cc9-merged.mount: Deactivated successfully.
Feb 01 09:54:01 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2b527cbbb8d1866e999c8ae7231dfe8eb8b5171c4fa767cfc10b56b33fab020b-userdata-shm.mount: Deactivated successfully.
Feb 01 09:54:01 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d64c4abd2\x2d68ab\x2d4da2\x2db883\x2d4056dccfe81b.mount: Deactivated successfully.
Feb 01 09:54:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:54:01 np0005604215.localdomain systemd[1]: tmp-crun.FJJRLy.mount: Deactivated successfully.
Feb 01 09:54:01 np0005604215.localdomain podman[307015]: 2026-02-01 09:54:01.860366378 +0000 UTC m=+0.095290911 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:54:01 np0005604215.localdomain podman[307015]: 2026-02-01 09:54:01.870999629 +0000 UTC m=+0.105924192 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:54:01 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:54:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:01.944 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v165: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 1.8 KiB/s wr, 17 op/s
Feb 01 09:54:02 np0005604215.localdomain ceph-mon[298604]: pgmap v165: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 12 KiB/s rd, 1.8 KiB/s wr, 17 op/s
Feb 01 09:54:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v166: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.0 KiB/s wr, 39 op/s
Feb 01 09:54:04 np0005604215.localdomain ceph-mon[298604]: pgmap v166: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.0 KiB/s wr, 39 op/s
Feb 01 09:54:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e112 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:05.787 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v167: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.0 KiB/s wr, 39 op/s
Feb 01 09:54:06 np0005604215.localdomain ceph-mon[298604]: pgmap v167: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 3.0 KiB/s wr, 39 op/s
Feb 01 09:54:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:06.368 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:06 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e113 e113: 6 total, 6 up, 6 in
Feb 01 09:54:07 np0005604215.localdomain ceph-mon[298604]: osdmap e113: 6 total, 6 up, 6 in
Feb 01 09:54:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v169: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Feb 01 09:54:08 np0005604215.localdomain ceph-mon[298604]: pgmap v169: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Feb 01 09:54:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v170: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Feb 01 09:54:10 np0005604215.localdomain ceph-mon[298604]: pgmap v170: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Feb 01 09:54:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:10.789 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:11.372 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v171: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Feb 01 09:54:12 np0005604215.localdomain ceph-mon[298604]: pgmap v171: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 18 KiB/s rd, 1.4 KiB/s wr, 24 op/s
Feb 01 09:54:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v172: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:54:14 np0005604215.localdomain ceph-mon[298604]: pgmap v172: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:54:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e113 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:15 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:15.130 2 INFO neutron.agent.securitygroups_rpc [req-a317a60f-1d94-4e94-8ad3-4c22c8825b6a req-40e6bba5-b2d9-4d66-aeb2-e562a81ad61e aacab7e8f6444706a62ff16c6574833f d0194caf1b6343f4859fdcc75c872cf3 - - default default] Security group rule updated ['639fab50-7eda-41c7-96b9-ca352e9a9f06']
Feb 01 09:54:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e114 e114: 6 total, 6 up, 6 in
Feb 01 09:54:15 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:15.723 2 INFO neutron.agent.securitygroups_rpc [req-4b9050d5-0e1f-4517-a597-752dbe7a20e4 req-3b9daee1-df3b-4626-857b-13f8996518fb aacab7e8f6444706a62ff16c6574833f d0194caf1b6343f4859fdcc75c872cf3 - - default default] Security group rule updated ['639fab50-7eda-41c7-96b9-ca352e9a9f06']
Feb 01 09:54:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:54:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:54:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:15.791 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:15 np0005604215.localdomain podman[307038]: 2026-02-01 09:54:15.885461717 +0000 UTC m=+0.085821176 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:54:15 np0005604215.localdomain podman[307038]: 2026-02-01 09:54:15.924641298 +0000 UTC m=+0.125000737 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true)
Feb 01 09:54:15 np0005604215.localdomain podman[307039]: 2026-02-01 09:54:15.936394585 +0000 UTC m=+0.134531355 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:54:15 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:54:15 np0005604215.localdomain podman[307039]: 2026-02-01 09:54:15.946581432 +0000 UTC m=+0.144718142 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:54:15 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:54:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v174: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: osdmap e114: 6 total, 6 up, 6 in
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: pgmap v174: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:54:16 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:16.373 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #22. Immutable memtables: 0.
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.508308) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 9] Flushing memtable with next log file: 22
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656508348, "job": 9, "event": "flush_started", "num_memtables": 1, "num_entries": 1987, "num_deletes": 261, "total_data_size": 2767283, "memory_usage": 2812848, "flush_reason": "Manual Compaction"}
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 9] Level-0 flush table #23: started
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656520186, "cf_name": "default", "job": 9, "event": "table_file_creation", "file_number": 23, "file_size": 1775229, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 16690, "largest_seqno": 18672, "table_properties": {"data_size": 1767717, "index_size": 4405, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2053, "raw_key_size": 16439, "raw_average_key_size": 20, "raw_value_size": 1752217, "raw_average_value_size": 2187, "num_data_blocks": 193, "num_entries": 801, "num_filter_entries": 801, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939534, "oldest_key_time": 1769939534, "file_creation_time": 1769939656, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 23, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 9] Flush lasted 11921 microseconds, and 5219 cpu microseconds.
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.520229) [db/flush_job.cc:967] [default] [JOB 9] Level-0 flush table #23: 1775229 bytes OK
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.520253) [db/memtable_list.cc:519] [default] Level-0 commit table #23 started
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.522957) [db/memtable_list.cc:722] [default] Level-0 commit table #23: memtable #1 done
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.522981) EVENT_LOG_v1 {"time_micros": 1769939656522975, "job": 9, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.523002) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 9] Try to delete WAL files size 2758165, prev total WAL file size 2758489, number of live WAL files 2.
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000019.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.523792) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0033373633' seq:72057594037927935, type:22 .. '6C6F676D0034303134' seq:0, type:0; will stop at (end)
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 10] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 9 Base level 0, inputs: [23(1733KB)], [21(20MB)]
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656523833, "job": 10, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [23], "files_L6": [21], "score": -1, "input_data_size": 23504588, "oldest_snapshot_seqno": -1}
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 10] Generated table #24: 12521 keys, 23327628 bytes, temperature: kUnknown
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656683729, "cf_name": "default", "job": 10, "event": "table_file_creation", "file_number": 24, "file_size": 23327628, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 23253044, "index_size": 42163, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31365, "raw_key_size": 336652, "raw_average_key_size": 26, "raw_value_size": 23036583, "raw_average_value_size": 1839, "num_data_blocks": 1607, "num_entries": 12521, "num_filter_entries": 12521, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769939656, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 24, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.684191) [db/compaction/compaction_job.cc:1663] [default] [JOB 10] Compacted 1@0 + 1@6 files to L6 => 23327628 bytes
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.686806) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 146.8 rd, 145.7 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 20.7 +0.0 blob) out(22.2 +0.0 blob), read-write-amplify(26.4) write-amplify(13.1) OK, records in: 13059, records dropped: 538 output_compression: NoCompression
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.686827) EVENT_LOG_v1 {"time_micros": 1769939656686818, "job": 10, "event": "compaction_finished", "compaction_time_micros": 160151, "compaction_time_cpu_micros": 57526, "output_level": 6, "num_output_files": 1, "total_output_size": 23327628, "num_input_records": 13059, "num_output_records": 12521, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000023.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656687139, "job": 10, "event": "table_file_deletion", "file_number": 23}
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000021.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939656689142, "job": 10, "event": "table_file_deletion", "file_number": 21}
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.523717) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.689206) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.689213) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.689217) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.689220) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:16 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:16.689224) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:17 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:17.027 2 INFO neutron.agent.securitygroups_rpc [None req-f508e6c2-9093-4f47-a287-c55eb4d8e7d1 ff35eaef616c4f428644a9a881f035d4 9bbefd3c06294b7fa7720ba6ca48fa4b - - default default] Security group rule updated ['d6a2366a-be19-483b-bd9c-86227fb6f0c8']
Feb 01 09:54:17 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:17.268 2 INFO neutron.agent.securitygroups_rpc [None req-f0037227-79b1-4433-9269-e9d8a6c269aa ff35eaef616c4f428644a9a881f035d4 9bbefd3c06294b7fa7720ba6ca48fa4b - - default default] Security group rule updated ['d6a2366a-be19-483b-bd9c-86227fb6f0c8']
Feb 01 09:54:17 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e115 e115: 6 total, 6 up, 6 in
Feb 01 09:54:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v176: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 3.4 KiB/s wr, 41 op/s
Feb 01 09:54:18 np0005604215.localdomain ceph-mon[298604]: osdmap e115: 6 total, 6 up, 6 in
Feb 01 09:54:18 np0005604215.localdomain ceph-mon[298604]: pgmap v176: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 3.4 KiB/s wr, 41 op/s
Feb 01 09:54:18 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e116 e116: 6 total, 6 up, 6 in
Feb 01 09:54:19 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e117 e117: 6 total, 6 up, 6 in
Feb 01 09:54:19 np0005604215.localdomain ceph-mon[298604]: osdmap e116: 6 total, 6 up, 6 in
Feb 01 09:54:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v179: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 5.5 KiB/s wr, 68 op/s
Feb 01 09:54:20 np0005604215.localdomain ceph-mon[298604]: osdmap e117: 6 total, 6 up, 6 in
Feb 01 09:54:20 np0005604215.localdomain ceph-mon[298604]: pgmap v179: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 49 KiB/s rd, 5.5 KiB/s wr, 68 op/s
Feb 01 09:54:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:54:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:54:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:20.792 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:20 np0005604215.localdomain podman[307086]: 2026-02-01 09:54:20.871568433 +0000 UTC m=+0.082987223 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Feb 01 09:54:20 np0005604215.localdomain podman[307086]: 2026-02-01 09:54:20.879743405 +0000 UTC m=+0.091162165 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 01 09:54:20 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:54:20 np0005604215.localdomain podman[307085]: 2026-02-01 09:54:20.927468804 +0000 UTC m=+0.142513566 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, architecture=x86_64, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, release=1769056855, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., managed_by=edpm_ansible, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, version=9.7, io.openshift.expose-services=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9)
Feb 01 09:54:20 np0005604215.localdomain podman[307085]: 2026-02-01 09:54:20.939682643 +0000 UTC m=+0.154727405 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., managed_by=edpm_ansible, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, 
version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container)
Feb 01 09:54:20 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:54:21
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['images', 'manila_metadata', '.mgr', 'manila_data', 'volumes', 'vms', 'backups']
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 09:54:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:21.376 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004301291614321608 of space, bias 1.0, pg target 0.8588245589928811 quantized to 32 (current 32)
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.7263051367950866e-06 of space, bias 4.0, pg target 0.002170138888888889 quantized to 16 (current 16)
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:54:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:54:21 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/602535804' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:54:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v180: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 4.5 KiB/s wr, 55 op/s
Feb 01 09:54:22 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:22.462 2 INFO neutron.agent.securitygroups_rpc [req-83e0e8cb-3429-4ade-bafe-f7d6f9e3d311 req-5bc5dc3b-802b-43fa-a784-b37e50cbe40a ff35eaef616c4f428644a9a881f035d4 9bbefd3c06294b7fa7720ba6ca48fa4b - - default default] Security group member updated ['d6a2366a-be19-483b-bd9c-86227fb6f0c8']
Feb 01 09:54:22 np0005604215.localdomain ceph-mon[298604]: pgmap v180: 177 pgs: 177 active+clean; 145 MiB data, 750 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 4.5 KiB/s wr, 55 op/s
Feb 01 09:54:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v181: 177 pgs: 177 active+clean; 192 MiB data, 770 MiB used, 41 GiB / 42 GiB avail; 92 KiB/s rd, 3.2 MiB/s wr, 132 op/s
Feb 01 09:54:24 np0005604215.localdomain ceph-mon[298604]: pgmap v181: 177 pgs: 177 active+clean; 192 MiB data, 770 MiB used, 41 GiB / 42 GiB avail; 92 KiB/s rd, 3.2 MiB/s wr, 132 op/s
Feb 01 09:54:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 01 09:54:24 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/2769998936' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:54:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e117 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:25 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2769998936' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:54:25 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2263668386' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:54:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e118 e118: 6 total, 6 up, 6 in
Feb 01 09:54:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:25.794 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v183: 177 pgs: 177 active+clean; 192 MiB data, 770 MiB used, 41 GiB / 42 GiB avail; 80 KiB/s rd, 2.8 MiB/s wr, 115 op/s
Feb 01 09:54:26 np0005604215.localdomain ceph-mon[298604]: osdmap e118: 6 total, 6 up, 6 in
Feb 01 09:54:26 np0005604215.localdomain ceph-mon[298604]: pgmap v183: 177 pgs: 177 active+clean; 192 MiB data, 770 MiB used, 41 GiB / 42 GiB avail; 80 KiB/s rd, 2.8 MiB/s wr, 115 op/s
Feb 01 09:54:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:26.378 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:26 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e119 e119: 6 total, 6 up, 6 in
Feb 01 09:54:27 np0005604215.localdomain ceph-mon[298604]: osdmap e119: 6 total, 6 up, 6 in
Feb 01 09:54:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v185: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 107 KiB/s rd, 2.7 MiB/s wr, 152 op/s
Feb 01 09:54:28 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:28.522 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:54:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:28.523 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:28 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:28.524 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:54:28 np0005604215.localdomain ceph-mon[298604]: pgmap v185: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 107 KiB/s rd, 2.7 MiB/s wr, 152 op/s
Feb 01 09:54:28 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:28.735 2 INFO neutron.agent.securitygroups_rpc [None req-ff9f5f38-da01-49cd-ad4a-92231356a657 ff147cab913d4d439b1d697fdf7e96ba dd3a0e574d0f493cafe8d66c78341de5 - - default default] Security group member updated ['39ab8694-6bb0-4b5a-b2c8-cff6705213f5']
Feb 01 09:54:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:54:28 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:28.859 259225 INFO neutron.agent.linux.ip_lib [None req-50add5a8-54a9-4f30-886b-965a6a102f12 - - - - - -] Device tap189326ee-2f cannot be used as it has no MAC address
Feb 01 09:54:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:28.881 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:28 np0005604215.localdomain systemd[1]: tmp-crun.J4EQDp.mount: Deactivated successfully.
Feb 01 09:54:28 np0005604215.localdomain kernel: device tap189326ee-2f entered promiscuous mode
Feb 01 09:54:28 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939668.9030] manager: (tap189326ee-2f): new Generic device (/org/freedesktop/NetworkManager/Devices/25)
Feb 01 09:54:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:28.904 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:28 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:28Z|00120|binding|INFO|Claiming lport 189326ee-2f74-4f24-9cd3-a164e6fb714b for this chassis.
Feb 01 09:54:28 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:28Z|00121|binding|INFO|189326ee-2f74-4f24-9cd3-a164e6fb714b: Claiming unknown
Feb 01 09:54:28 np0005604215.localdomain podman[307125]: 2026-02-01 09:54:28.906444633 +0000 UTC m=+0.099860376 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, managed_by=edpm_ansible)
Feb 01 09:54:28 np0005604215.localdomain systemd-udevd[307149]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:54:28 np0005604215.localdomain podman[307125]: 2026-02-01 09:54:28.917914118 +0000 UTC m=+0.111329881 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:54:28 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:28.920 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-c02f9419-6799-4a45-bf83-c316a3817c7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c02f9419-6799-4a45-bf83-c316a3817c7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aa5c461f9764c8e9c6f7f88a3f3fe97', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08cda268-c34a-48a4-b851-ac14c0cb1641, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=189326ee-2f74-4f24-9cd3-a164e6fb714b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:54:28 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:28.922 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 189326ee-2f74-4f24-9cd3-a164e6fb714b in datapath c02f9419-6799-4a45-bf83-c316a3817c7c bound to our chassis
Feb 01 09:54:28 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:28.924 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c02f9419-6799-4a45-bf83-c316a3817c7c or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:54:28 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:28.925 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[513fd65b-9f65-42ed-a190-e397e77c9d88]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:54:28 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:54:28 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:28Z|00122|binding|INFO|Setting lport 189326ee-2f74-4f24-9cd3-a164e6fb714b ovn-installed in OVS
Feb 01 09:54:28 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:28Z|00123|binding|INFO|Setting lport 189326ee-2f74-4f24-9cd3-a164e6fb714b up in Southbound
Feb 01 09:54:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:28.952 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:28.985 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:29.013 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:29 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:29.526 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:54:29 np0005604215.localdomain podman[307206]: 
Feb 01 09:54:29 np0005604215.localdomain podman[307206]: 2026-02-01 09:54:29.848864353 +0000 UTC m=+0.087156751 container create 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:54:29 np0005604215.localdomain systemd[1]: Started libpod-conmon-4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604.scope.
Feb 01 09:54:29 np0005604215.localdomain podman[307206]: 2026-02-01 09:54:29.807326176 +0000 UTC m=+0.045618624 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:54:29 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:54:29 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/113189f63466168ef0eabf3272676643536a7b94540e37806059557d37db92bb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:54:29 np0005604215.localdomain podman[307206]: 2026-02-01 09:54:29.942893017 +0000 UTC m=+0.181185425 container init 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 01 09:54:29 np0005604215.localdomain dnsmasq[307224]: started, version 2.85 cachesize 150
Feb 01 09:54:29 np0005604215.localdomain dnsmasq[307224]: DNS service limited to local subnets
Feb 01 09:54:29 np0005604215.localdomain dnsmasq[307224]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:54:29 np0005604215.localdomain dnsmasq[307224]: warning: no upstream servers configured
Feb 01 09:54:29 np0005604215.localdomain dnsmasq-dhcp[307224]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:54:29 np0005604215.localdomain dnsmasq[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/addn_hosts - 0 addresses
Feb 01 09:54:29 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/host
Feb 01 09:54:29 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/opts
Feb 01 09:54:29 np0005604215.localdomain podman[307206]: 2026-02-01 09:54:29.969226863 +0000 UTC m=+0.207519261 container start 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 01 09:54:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:54:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:54:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:54:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157180 "" "Go-http-client/1.1"
Feb 01 09:54:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:54:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18767 "" "Go-http-client/1.1"
Feb 01 09:54:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e119 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:30 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:30.132 259225 INFO neutron.agent.dhcp.agent [None req-8c5e2a40-9261-4f0f-b0c6-6bc97b29844d - - - - - -] DHCP configuration for ports {'26db4edd-796f-4cee-a122-9e82374993e6'} is completed
Feb 01 09:54:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v186: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 107 KiB/s rd, 2.7 MiB/s wr, 152 op/s
Feb 01 09:54:30 np0005604215.localdomain ceph-mon[298604]: pgmap v186: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 107 KiB/s rd, 2.7 MiB/s wr, 152 op/s
Feb 01 09:54:30 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:30.796 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:30 np0005604215.localdomain systemd[1]: tmp-crun.Mr6ssH.mount: Deactivated successfully.
Feb 01 09:54:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:31.402 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:54:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:54:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:54:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:54:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:54:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:54:32 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:32.152 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:31Z, description=, device_id=2d5747c6-cbdf-4151-8b77-e62f81a5dd69, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032b49190>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032b49370>], id=d960006d-012f-4999-af0e-537b8af1210c, ip_allocation=immediate, mac_address=fa:16:3e:42:59:f8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:26Z, description=, dns_domain=, id=c02f9419-6799-4a45-bf83-c316a3817c7c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-656358925, port_security_enabled=True, project_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4020, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1188, status=ACTIVE, subnets=['b85ccb18-7d4a-4256-96ba-e762f4efe60c'], tags=[], tenant_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, updated_at=2026-02-01T09:54:27Z, vlan_transparent=None, network_id=c02f9419-6799-4a45-bf83-c316a3817c7c, port_security_enabled=False, project_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1219, status=DOWN, tags=[], tenant_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, updated_at=2026-02-01T09:54:31Z on network c02f9419-6799-4a45-bf83-c316a3817c7c
Feb 01 09:54:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v187: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 24 KiB/s wr, 42 op/s
Feb 01 09:54:32 np0005604215.localdomain dnsmasq[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/addn_hosts - 1 addresses
Feb 01 09:54:32 np0005604215.localdomain podman[307242]: 2026-02-01 09:54:32.356512623 +0000 UTC m=+0.056801321 container kill 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 01 09:54:32 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/host
Feb 01 09:54:32 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/opts
Feb 01 09:54:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:54:32 np0005604215.localdomain podman[307257]: 2026-02-01 09:54:32.480853746 +0000 UTC m=+0.091207368 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:54:32 np0005604215.localdomain podman[307257]: 2026-02-01 09:54:32.494611152 +0000 UTC m=+0.104964784 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:54:32 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:54:32 np0005604215.localdomain ceph-mon[298604]: pgmap v187: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 24 KiB/s wr, 42 op/s
Feb 01 09:54:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e120 e120: 6 total, 6 up, 6 in
Feb 01 09:54:32 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:32.590 259225 INFO neutron.agent.dhcp.agent [None req-14d13840-ac7b-4ed9-8f95-e381216dacfd - - - - - -] DHCP configuration for ports {'d960006d-012f-4999-af0e-537b8af1210c'} is completed
Feb 01 09:54:33 np0005604215.localdomain sudo[307287]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:54:33 np0005604215.localdomain sudo[307287]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:54:33 np0005604215.localdomain sudo[307287]: pam_unix(sudo:session): session closed for user root
Feb 01 09:54:33 np0005604215.localdomain sudo[307305]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:54:33 np0005604215.localdomain sudo[307305]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:54:33 np0005604215.localdomain ceph-mon[298604]: osdmap e120: 6 total, 6 up, 6 in
Feb 01 09:54:33 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:33.576 2 INFO neutron.agent.securitygroups_rpc [None req-9125c9fe-67c0-46c6-98f6-2771b3ce7427 9a33ad723bea40f8bb6325e752986a5b 7b25cdb96bed441fa12160a57bca4d9c - - default default] Security group member updated ['e61e0f68-6135-4301-ab8c-68625c4e91d7']
Feb 01 09:54:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v189: 177 pgs: 177 active+clean; 304 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 14 MiB/s wr, 174 op/s
Feb 01 09:54:34 np0005604215.localdomain sudo[307305]: pam_unix(sudo:session): session closed for user root
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:54:34 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 57610793-d3df-41d6-8fdb-39c5f443f0b7 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:54:34 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 57610793-d3df-41d6-8fdb-39c5f443f0b7 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:54:34 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 57610793-d3df-41d6-8fdb-39c5f443f0b7 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:54:34 np0005604215.localdomain sudo[307354]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:54:34 np0005604215.localdomain sudo[307354]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:54:34 np0005604215.localdomain sudo[307354]: pam_unix(sudo:session): session closed for user root
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: pgmap v189: 177 pgs: 177 active+clean; 304 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.9 MiB/s rd, 14 MiB/s wr, 174 op/s
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:54:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e121 e121: 6 total, 6 up, 6 in
Feb 01 09:54:34 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:34.705 2 INFO neutron.agent.securitygroups_rpc [None req-1d734dba-1bcf-45d6-b4fb-cb8bacf3e60d 9a33ad723bea40f8bb6325e752986a5b 7b25cdb96bed441fa12160a57bca4d9c - - default default] Security group member updated ['e61e0f68-6135-4301-ab8c-68625c4e91d7']
Feb 01 09:54:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e121 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:35 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:35.144 2 INFO neutron.agent.securitygroups_rpc [None req-546fc9ec-f61e-4cb1-ba06-2baf55334087 ff147cab913d4d439b1d697fdf7e96ba dd3a0e574d0f493cafe8d66c78341de5 - - default default] Security group member updated ['39ab8694-6bb0-4b5a-b2c8-cff6705213f5']
Feb 01 09:54:35 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:35.218 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:31Z, description=, device_id=2d5747c6-cbdf-4151-8b77-e62f81a5dd69, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323356a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032335280>], id=d960006d-012f-4999-af0e-537b8af1210c, ip_allocation=immediate, mac_address=fa:16:3e:42:59:f8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:26Z, description=, dns_domain=, id=c02f9419-6799-4a45-bf83-c316a3817c7c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-656358925, port_security_enabled=True, project_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4020, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1188, status=ACTIVE, subnets=['b85ccb18-7d4a-4256-96ba-e762f4efe60c'], tags=[], tenant_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, updated_at=2026-02-01T09:54:27Z, vlan_transparent=None, network_id=c02f9419-6799-4a45-bf83-c316a3817c7c, port_security_enabled=False, project_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1219, status=DOWN, tags=[], tenant_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, updated_at=2026-02-01T09:54:31Z on network c02f9419-6799-4a45-bf83-c316a3817c7c
Feb 01 09:54:35 np0005604215.localdomain dnsmasq[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/addn_hosts - 1 addresses
Feb 01 09:54:35 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/host
Feb 01 09:54:35 np0005604215.localdomain podman[307389]: 2026-02-01 09:54:35.452380628 +0000 UTC m=+0.063544479 container kill 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:54:35 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/opts
Feb 01 09:54:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e122 e122: 6 total, 6 up, 6 in
Feb 01 09:54:35 np0005604215.localdomain ceph-mon[298604]: osdmap e121: 6 total, 6 up, 6 in
Feb 01 09:54:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2444974192' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:54:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2444974192' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:54:35 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:35.646 259225 INFO neutron.agent.dhcp.agent [None req-4b5745f8-2e45-4fe8-925d-bc8f590991bb - - - - - -] DHCP configuration for ports {'d960006d-012f-4999-af0e-537b8af1210c'} is completed
Feb 01 09:54:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:35.798 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:35 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:35.954 2 INFO neutron.agent.securitygroups_rpc [None req-9bcabe14-1107-4613-a202-1866c6f3ee13 9a33ad723bea40f8bb6325e752986a5b 7b25cdb96bed441fa12160a57bca4d9c - - default default] Security group member updated ['e61e0f68-6135-4301-ab8c-68625c4e91d7']
Feb 01 09:54:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v192: 177 pgs: 177 active+clean; 304 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.8 MiB/s rd, 19 MiB/s wr, 175 op/s
Feb 01 09:54:36 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:36.300 2 INFO neutron.agent.securitygroups_rpc [None req-7830ec90-7d5e-4488-a4fe-cfe1f6b35ae5 21d02ef23bf34fe3ad07a151844e8a84 7aa5c461f9764c8e9c6f7f88a3f3fe97 - - default default] Security group member updated ['a27a2b34-3872-4d18-89d2-71a867c33b37']
Feb 01 09:54:36 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:36.340 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:35Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00334c29d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032bdef40>], id=772a1edd-8dae-414e-a3f1-bfe14c7a0938, ip_allocation=immediate, mac_address=fa:16:3e:d6:e7:2b, name=tempest-FloatingIPNegativeTestJSON-1037090788, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:26Z, description=, dns_domain=, id=c02f9419-6799-4a45-bf83-c316a3817c7c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPNegativeTestJSON-test-network-656358925, port_security_enabled=True, project_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4020, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1188, status=ACTIVE, subnets=['b85ccb18-7d4a-4256-96ba-e762f4efe60c'], tags=[], tenant_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, updated_at=2026-02-01T09:54:27Z, vlan_transparent=None, network_id=c02f9419-6799-4a45-bf83-c316a3817c7c, port_security_enabled=True, project_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['a27a2b34-3872-4d18-89d2-71a867c33b37'], standard_attr_id=1250, status=DOWN, tags=[], tenant_id=7aa5c461f9764c8e9c6f7f88a3f3fe97, updated_at=2026-02-01T09:54:36Z on network c02f9419-6799-4a45-bf83-c316a3817c7c
Feb 01 09:54:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:36.436 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:36 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:54:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:54:36 np0005604215.localdomain podman[307428]: 2026-02-01 09:54:36.588549842 +0000 UTC m=+0.059971638 container kill 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 01 09:54:36 np0005604215.localdomain dnsmasq[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/addn_hosts - 2 addresses
Feb 01 09:54:36 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/host
Feb 01 09:54:36 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/opts
Feb 01 09:54:36 np0005604215.localdomain ceph-mon[298604]: osdmap e122: 6 total, 6 up, 6 in
Feb 01 09:54:36 np0005604215.localdomain ceph-mon[298604]: pgmap v192: 177 pgs: 177 active+clean; 304 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 3.8 MiB/s rd, 19 MiB/s wr, 175 op/s
Feb 01 09:54:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:54:36 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:36.785 259225 INFO neutron.agent.dhcp.agent [None req-b6af9f1a-7cb7-4173-b71f-a0a40a4d3f3a - - - - - -] DHCP configuration for ports {'772a1edd-8dae-414e-a3f1-bfe14c7a0938'} is completed
Feb 01 09:54:37 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:37.324 2 INFO neutron.agent.securitygroups_rpc [None req-2518325d-e6ff-412d-931e-351c87841bd0 9a33ad723bea40f8bb6325e752986a5b 7b25cdb96bed441fa12160a57bca4d9c - - default default] Security group member updated ['e61e0f68-6135-4301-ab8c-68625c4e91d7']
Feb 01 09:54:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v193: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 19 MiB/s wr, 268 op/s
Feb 01 09:54:38 np0005604215.localdomain ceph-mon[298604]: pgmap v193: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 3.9 MiB/s rd, 19 MiB/s wr, 268 op/s
Feb 01 09:54:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e122 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v194: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 4.3 KiB/s wr, 73 op/s
Feb 01 09:54:40 np0005604215.localdomain ceph-mon[298604]: pgmap v194: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 4.3 KiB/s wr, 73 op/s
Feb 01 09:54:40 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:40.541 2 INFO neutron.agent.securitygroups_rpc [None req-515ecb9d-14c8-49e9-8847-f48fb7c12a8c 21d02ef23bf34fe3ad07a151844e8a84 7aa5c461f9764c8e9c6f7f88a3f3fe97 - - default default] Security group member updated ['a27a2b34-3872-4d18-89d2-71a867c33b37']
Feb 01 09:54:40 np0005604215.localdomain podman[307467]: 2026-02-01 09:54:40.793925226 +0000 UTC m=+0.057837703 container kill 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:54:40 np0005604215.localdomain dnsmasq[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/addn_hosts - 1 addresses
Feb 01 09:54:40 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/host
Feb 01 09:54:40 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/opts
Feb 01 09:54:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:40.800 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:41.470 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e123 e123: 6 total, 6 up, 6 in
Feb 01 09:54:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:41.772 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:54:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:41.773 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:54:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:41.773 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:54:41 np0005604215.localdomain podman[307507]: 2026-02-01 09:54:41.821730193 +0000 UTC m=+0.068053799 container kill 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:54:41 np0005604215.localdomain dnsmasq[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/addn_hosts - 0 addresses
Feb 01 09:54:41 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/host
Feb 01 09:54:41 np0005604215.localdomain dnsmasq-dhcp[307224]: read /var/lib/neutron/dhcp/c02f9419-6799-4a45-bf83-c316a3817c7c/opts
Feb 01 09:54:41 np0005604215.localdomain systemd[1]: tmp-crun.LxDESU.mount: Deactivated successfully.
Feb 01 09:54:41 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:41Z|00124|binding|INFO|Releasing lport 189326ee-2f74-4f24-9cd3-a164e6fb714b from this chassis (sb_readonly=0)
Feb 01 09:54:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:41.951 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:41 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:41Z|00125|binding|INFO|Setting lport 189326ee-2f74-4f24-9cd3-a164e6fb714b down in Southbound
Feb 01 09:54:41 np0005604215.localdomain kernel: device tap189326ee-2f left promiscuous mode
Feb 01 09:54:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:41.961 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-c02f9419-6799-4a45-bf83-c316a3817c7c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c02f9419-6799-4a45-bf83-c316a3817c7c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7aa5c461f9764c8e9c6f7f88a3f3fe97', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08cda268-c34a-48a4-b851-ac14c0cb1641, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=189326ee-2f74-4f24-9cd3-a164e6fb714b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:54:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:41.964 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 189326ee-2f74-4f24-9cd3-a164e6fb714b in datapath c02f9419-6799-4a45-bf83-c316a3817c7c unbound from our chassis
Feb 01 09:54:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:41.967 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c02f9419-6799-4a45-bf83-c316a3817c7c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:54:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:41.968 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf8dc0f-21a4-406e-8e3d-a40bf6449db4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:54:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:41.972 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v196: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 4.3 KiB/s wr, 73 op/s
Feb 01 09:54:42 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:42.246 2 INFO neutron.agent.securitygroups_rpc [None req-85d5be93-4f1a-4c18-9ce2-6a112b54530f 84f3db440e5d42c59396aab4e1ffcfd9 2a205e14a65e4950b2897f78a7089f09 - - default default] Security group member updated ['9edef165-badf-4d99-97d5-46869e0947c8']
Feb 01 09:54:42 np0005604215.localdomain ceph-mon[298604]: osdmap e123: 6 total, 6 up, 6 in
Feb 01 09:54:42 np0005604215.localdomain ceph-mon[298604]: pgmap v196: 177 pgs: 177 active+clean; 192 MiB data, 815 MiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 4.3 KiB/s wr, 73 op/s
Feb 01 09:54:42 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:42.565 2 INFO neutron.agent.securitygroups_rpc [None req-86fc0e8a-dc01-4804-b333-df33401eb55c ba01912592664d639fa7a27174068a0f a8a2395fa8604962aa6888633ff95bee - - default default] Security group member updated ['adcc453c-f15e-407c-b903-8df7ba9f8ef6']
Feb 01 09:54:43 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:43.114 2 INFO neutron.agent.securitygroups_rpc [None req-ea460609-a7b7-4b88-8971-9c496984f41d ba01912592664d639fa7a27174068a0f a8a2395fa8604962aa6888633ff95bee - - default default] Security group member updated ['adcc453c-f15e-407c-b903-8df7ba9f8ef6']
Feb 01 09:54:43 np0005604215.localdomain dnsmasq[307224]: exiting on receipt of SIGTERM
Feb 01 09:54:43 np0005604215.localdomain podman[307549]: 2026-02-01 09:54:43.131253728 +0000 UTC m=+0.064521420 container kill 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:54:43 np0005604215.localdomain systemd[1]: libpod-4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604.scope: Deactivated successfully.
Feb 01 09:54:43 np0005604215.localdomain podman[307563]: 2026-02-01 09:54:43.210188375 +0000 UTC m=+0.061696563 container died 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:54:43 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604-userdata-shm.mount: Deactivated successfully.
Feb 01 09:54:43 np0005604215.localdomain podman[307563]: 2026-02-01 09:54:43.245903711 +0000 UTC m=+0.097411859 container cleanup 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:54:43 np0005604215.localdomain systemd[1]: libpod-conmon-4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604.scope: Deactivated successfully.
Feb 01 09:54:43 np0005604215.localdomain podman[307565]: 2026-02-01 09:54:43.29492363 +0000 UTC m=+0.136687346 container remove 4d79b84367e5eeadca304688702fd21953ae916c1a53dd25201c5a123ebed604 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c02f9419-6799-4a45-bf83-c316a3817c7c, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:54:43 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:43.653 259225 INFO neutron.agent.dhcp.agent [None req-c7ad89cb-3f40-49bb-b07d-34594349c61b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:54:43 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:43.653 259225 INFO neutron.agent.dhcp.agent [None req-c7ad89cb-3f40-49bb-b07d-34594349c61b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:54:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-113189f63466168ef0eabf3272676643536a7b94540e37806059557d37db92bb-merged.mount: Deactivated successfully.
Feb 01 09:54:44 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2dc02f9419\x2d6799\x2d4a45\x2dbf83\x2dc316a3817c7c.mount: Deactivated successfully.
Feb 01 09:54:44 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:44.193 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:54:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v197: 177 pgs: 177 active+clean; 217 MiB data, 892 MiB used, 41 GiB / 42 GiB avail; 341 KiB/s rd, 2.9 MiB/s wr, 129 op/s
Feb 01 09:54:44 np0005604215.localdomain ceph-mon[298604]: pgmap v197: 177 pgs: 177 active+clean; 217 MiB data, 892 MiB used, 41 GiB / 42 GiB avail; 341 KiB/s rd, 2.9 MiB/s wr, 129 op/s
Feb 01 09:54:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:44.609 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:44 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:44.662 2 INFO neutron.agent.securitygroups_rpc [None req-9845e2e7-e54c-48b9-9b8f-f8c7a4c52742 84f3db440e5d42c59396aab4e1ffcfd9 2a205e14a65e4950b2897f78a7089f09 - - default default] Security group member updated ['9edef165-badf-4d99-97d5-46869e0947c8']
Feb 01 09:54:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:45.839 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:46.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:54:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:46.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:54:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:46.102 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:54:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:46.114 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:54:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v198: 177 pgs: 177 active+clean; 217 MiB data, 892 MiB used, 41 GiB / 42 GiB avail; 293 KiB/s rd, 2.5 MiB/s wr, 111 op/s
Feb 01 09:54:46 np0005604215.localdomain ceph-mon[298604]: pgmap v198: 177 pgs: 177 active+clean; 217 MiB data, 892 MiB used, 41 GiB / 42 GiB avail; 293 KiB/s rd, 2.5 MiB/s wr, 111 op/s
Feb 01 09:54:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:46.471 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:54:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:54:46 np0005604215.localdomain podman[307591]: 2026-02-01 09:54:46.862760189 +0000 UTC m=+0.067844114 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:54:46 np0005604215.localdomain podman[307591]: 2026-02-01 09:54:46.900732646 +0000 UTC m=+0.105816521 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:54:46 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:54:46 np0005604215.localdomain podman[307590]: 2026-02-01 09:54:46.995624777 +0000 UTC m=+0.201439174 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 01 09:54:47 np0005604215.localdomain podman[307590]: 2026-02-01 09:54:47.033741458 +0000 UTC m=+0.239555895 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:54:47 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:54:47 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:47.344 259225 INFO neutron.agent.linux.ip_lib [None req-4a49e685-4102-47ab-9f1a-28ef199c1e58 - - - - - -] Device tap9f362718-c5 cannot be used as it has no MAC address
Feb 01 09:54:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:47.365 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:47 np0005604215.localdomain kernel: device tap9f362718-c5 entered promiscuous mode
Feb 01 09:54:47 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939687.3742] manager: (tap9f362718-c5): new Generic device (/org/freedesktop/NetworkManager/Devices/26)
Feb 01 09:54:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:47.373 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:47 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:47Z|00126|binding|INFO|Claiming lport 9f362718-c529-402d-be4c-23264e6d4d0a for this chassis.
Feb 01 09:54:47 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:47Z|00127|binding|INFO|9f362718-c529-402d-be4c-23264e6d4d0a: Claiming unknown
Feb 01 09:54:47 np0005604215.localdomain systemd-udevd[307649]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:54:47 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:47.386 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-6c3db03b-523e-4bc1-b393-9ebce2d989a9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c3db03b-523e-4bc1-b393-9ebce2d989a9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3e5e9f4ac99471688f0279d307f2650', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19c2267c-00a5-46e3-9993-22d0e5d1c93f, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=9f362718-c529-402d-be4c-23264e6d4d0a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:54:47 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:47.389 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 9f362718-c529-402d-be4c-23264e6d4d0a in datapath 6c3db03b-523e-4bc1-b393-9ebce2d989a9 bound to our chassis
Feb 01 09:54:47 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:47.391 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6c3db03b-523e-4bc1-b393-9ebce2d989a9 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:54:47 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:47.393 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[92bcad3c-44c6-4bac-8fe3-5e40c1992622]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:54:47 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9f362718-c5: No such device
Feb 01 09:54:47 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9f362718-c5: No such device
Feb 01 09:54:47 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:47Z|00128|binding|INFO|Setting lport 9f362718-c529-402d-be4c-23264e6d4d0a ovn-installed in OVS
Feb 01 09:54:47 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:47Z|00129|binding|INFO|Setting lport 9f362718-c529-402d-be4c-23264e6d4d0a up in Southbound
Feb 01 09:54:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:47.405 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:47 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9f362718-c5: No such device
Feb 01 09:54:47 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9f362718-c5: No such device
Feb 01 09:54:47 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9f362718-c5: No such device
Feb 01 09:54:47 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9f362718-c5: No such device
Feb 01 09:54:47 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9f362718-c5: No such device
Feb 01 09:54:47 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9f362718-c5: No such device
Feb 01 09:54:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:47.439 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:47.467 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v199: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 324 KiB/s rd, 2.6 MiB/s wr, 72 op/s
Feb 01 09:54:48 np0005604215.localdomain podman[307722]: 
Feb 01 09:54:48 np0005604215.localdomain podman[307722]: 2026-02-01 09:54:48.243659947 +0000 UTC m=+0.089122692 container create 413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6c3db03b-523e-4bc1-b393-9ebce2d989a9, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 01 09:54:48 np0005604215.localdomain systemd[1]: Started libpod-conmon-413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf.scope.
Feb 01 09:54:48 np0005604215.localdomain systemd[1]: tmp-crun.1seY1C.mount: Deactivated successfully.
Feb 01 09:54:48 np0005604215.localdomain podman[307722]: 2026-02-01 09:54:48.200564542 +0000 UTC m=+0.046027287 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:54:48 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:54:48 np0005604215.localdomain ceph-mon[298604]: pgmap v199: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 324 KiB/s rd, 2.6 MiB/s wr, 72 op/s
Feb 01 09:54:48 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2be8a4cf6f9b91e437cac744565b8665b9984dca2b679012aff6c3eac617c63b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:54:48 np0005604215.localdomain podman[307722]: 2026-02-01 09:54:48.323055087 +0000 UTC m=+0.168517822 container init 413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6c3db03b-523e-4bc1-b393-9ebce2d989a9, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:54:48 np0005604215.localdomain podman[307722]: 2026-02-01 09:54:48.332246062 +0000 UTC m=+0.177708797 container start 413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6c3db03b-523e-4bc1-b393-9ebce2d989a9, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:54:48 np0005604215.localdomain dnsmasq[307740]: started, version 2.85 cachesize 150
Feb 01 09:54:48 np0005604215.localdomain dnsmasq[307740]: DNS service limited to local subnets
Feb 01 09:54:48 np0005604215.localdomain dnsmasq[307740]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:54:48 np0005604215.localdomain dnsmasq[307740]: warning: no upstream servers configured
Feb 01 09:54:48 np0005604215.localdomain dnsmasq-dhcp[307740]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 01 09:54:48 np0005604215.localdomain dnsmasq[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/addn_hosts - 0 addresses
Feb 01 09:54:48 np0005604215.localdomain dnsmasq-dhcp[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/host
Feb 01 09:54:48 np0005604215.localdomain dnsmasq-dhcp[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/opts
Feb 01 09:54:48 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:48.514 259225 INFO neutron.agent.dhcp.agent [None req-1421531c-62a4-4503-9478-5b8d3a03642e - - - - - -] DHCP configuration for ports {'e1ae0704-eeaa-4346-991d-fe06dc0ead13'} is completed
Feb 01 09:54:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:50.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:54:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:50.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:54:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v200: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 324 KiB/s rd, 2.6 MiB/s wr, 72 op/s
Feb 01 09:54:50 np0005604215.localdomain ceph-mon[298604]: pgmap v200: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 324 KiB/s rd, 2.6 MiB/s wr, 72 op/s
Feb 01 09:54:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:50.669 2 INFO neutron.agent.securitygroups_rpc [None req-57c956cb-89d5-4885-9663-ca5823a12d21 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['6ebf4d70-9c5f-40a7-b43f-38d30ca97739']
Feb 01 09:54:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:50.840 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:50.967 2 INFO neutron.agent.securitygroups_rpc [None req-ca712223-c062-400f-8ed8-8ff5e5903afc 306e307654cf41949f0bb118796a4bc7 8f87cde7f6eb4ef0beb13dc0679c10cb - - default default] Security group member updated ['a498609f-8637-4692-9d11-be96cabae719']
Feb 01 09:54:51 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:51.015 2 INFO neutron.agent.securitygroups_rpc [None req-c3b6808d-2668-4b98-8bd8-53b9c4ac7a7c 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['6ebf4d70-9c5f-40a7-b43f-38d30ca97739']
Feb 01 09:54:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:51.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:54:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:51.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:54:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:51.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:54:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:54:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:54:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:54:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:54:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:54:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:54:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:51.494 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:54:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:54:51 np0005604215.localdomain podman[307741]: 2026-02-01 09:54:51.870108143 +0000 UTC m=+0.084413927 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, vcs-type=git, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, version=9.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, architecture=x86_64, release=1769056855)
Feb 01 09:54:51 np0005604215.localdomain podman[307741]: 2026-02-01 09:54:51.881484635 +0000 UTC m=+0.095790419 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, config_id=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 01 09:54:51 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:54:51 np0005604215.localdomain systemd[1]: tmp-crun.2aH7Xt.mount: Deactivated successfully.
Feb 01 09:54:51 np0005604215.localdomain podman[307742]: 2026-02-01 09:54:51.977722067 +0000 UTC m=+0.189140071 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:54:52 np0005604215.localdomain podman[307742]: 2026-02-01 09:54:52.011837685 +0000 UTC m=+0.223255679 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:54:52 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.099 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.116 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.117 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.117 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.117 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.117 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:54:52 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:52.137 2 INFO neutron.agent.securitygroups_rpc [None req-898aca50-e443-4f04-8633-193e8d5d70fe 306e307654cf41949f0bb118796a4bc7 8f87cde7f6eb4ef0beb13dc0679c10cb - - default default] Security group member updated ['a498609f-8637-4692-9d11-be96cabae719']
Feb 01 09:54:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v201: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 304 KiB/s rd, 2.4 MiB/s wr, 67 op/s
Feb 01 09:54:52 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:52.519 2 INFO neutron.agent.securitygroups_rpc [None req-194b0ee6-ff55-463b-b49c-e9e305c5f2ea 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 01 09:54:52 np0005604215.localdomain ceph-mon[298604]: pgmap v201: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 304 KiB/s rd, 2.4 MiB/s wr, 67 op/s
Feb 01 09:54:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:54:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2584361436' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.704 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.587s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.930 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.931 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11647MB free_disk=41.70072555541992GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.931 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.931 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.981 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:54:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:52.982 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:54:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:53.008 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:54:53 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:53.272 2 INFO neutron.agent.securitygroups_rpc [None req-40a7cf1e-2b3d-4cda-aef2-d6b58ad042f7 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 01 09:54:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:54:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/678286761' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:54:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:53.451 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.444s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:54:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:53.458 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:54:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:53.471 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:54:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:53.495 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:54:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:53.495 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.564s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:54:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2584361436' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:54:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/678286761' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:54:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2232338093' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:54:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:53.707 259225 INFO neutron.agent.linux.ip_lib [None req-87eae992-28d6-46b3-b307-ebf4256c1112 - - - - - -] Device tap663aeef3-4f cannot be used as it has no MAC address
Feb 01 09:54:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:53.729 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:53 np0005604215.localdomain kernel: device tap663aeef3-4f entered promiscuous mode
Feb 01 09:54:53 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939693.7359] manager: (tap663aeef3-4f): new Generic device (/org/freedesktop/NetworkManager/Devices/27)
Feb 01 09:54:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:53.735 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:53Z|00130|binding|INFO|Claiming lport 663aeef3-4f9a-4e46-92e6-29e331b8f905 for this chassis.
Feb 01 09:54:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:53Z|00131|binding|INFO|663aeef3-4f9a-4e46-92e6-29e331b8f905: Claiming unknown
Feb 01 09:54:53 np0005604215.localdomain systemd-udevd[307831]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:54:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:53.745 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c19dda83-2ee3-4143-9992-3940695b7883, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=663aeef3-4f9a-4e46-92e6-29e331b8f905) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:54:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:53.747 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 663aeef3-4f9a-4e46-92e6-29e331b8f905 in datapath 3a0bb9e2-95cc-4b20-87c6-1e5c55901a39 bound to our chassis
Feb 01 09:54:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:53.748 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3a0bb9e2-95cc-4b20-87c6-1e5c55901a39 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:54:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:54:53.749 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[704435fe-dfb6-4d1c-a4ac-592ec6b066f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:54:53 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:53.751 2 INFO neutron.agent.securitygroups_rpc [None req-e7666148-591c-4a9b-983e-c90f12ec30cc 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 01 09:54:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap663aeef3-4f: No such device
Feb 01 09:54:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:53Z|00132|binding|INFO|Setting lport 663aeef3-4f9a-4e46-92e6-29e331b8f905 ovn-installed in OVS
Feb 01 09:54:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:54:53Z|00133|binding|INFO|Setting lport 663aeef3-4f9a-4e46-92e6-29e331b8f905 up in Southbound
Feb 01 09:54:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:53.771 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap663aeef3-4f: No such device
Feb 01 09:54:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap663aeef3-4f: No such device
Feb 01 09:54:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap663aeef3-4f: No such device
Feb 01 09:54:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap663aeef3-4f: No such device
Feb 01 09:54:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap663aeef3-4f: No such device
Feb 01 09:54:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap663aeef3-4f: No such device
Feb 01 09:54:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap663aeef3-4f: No such device
Feb 01 09:54:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:53.806 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:53.832 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v202: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 276 KiB/s rd, 2.1 MiB/s wr, 68 op/s
Feb 01 09:54:54 np0005604215.localdomain ceph-mon[298604]: pgmap v202: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 276 KiB/s rd, 2.1 MiB/s wr, 68 op/s
Feb 01 09:54:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2454073390' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:54:54 np0005604215.localdomain podman[307902]: 
Feb 01 09:54:54 np0005604215.localdomain podman[307902]: 2026-02-01 09:54:54.599167303 +0000 UTC m=+0.092148037 container create 5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:54:54 np0005604215.localdomain systemd[1]: Started libpod-conmon-5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf.scope.
Feb 01 09:54:54 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:54:54 np0005604215.localdomain podman[307902]: 2026-02-01 09:54:54.553271561 +0000 UTC m=+0.046252315 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:54:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c010c25732730f6cf24926dbc34c994a3de25b1f3464184ac39f6a9aeb5aeb15/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:54:54 np0005604215.localdomain podman[307902]: 2026-02-01 09:54:54.666265942 +0000 UTC m=+0.159246666 container init 5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:54:54 np0005604215.localdomain podman[307902]: 2026-02-01 09:54:54.678412688 +0000 UTC m=+0.171393412 container start 5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 01 09:54:54 np0005604215.localdomain dnsmasq[307921]: started, version 2.85 cachesize 150
Feb 01 09:54:54 np0005604215.localdomain dnsmasq[307921]: DNS service limited to local subnets
Feb 01 09:54:54 np0005604215.localdomain dnsmasq[307921]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:54:54 np0005604215.localdomain dnsmasq[307921]: warning: no upstream servers configured
Feb 01 09:54:54 np0005604215.localdomain dnsmasq-dhcp[307921]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 01 09:54:54 np0005604215.localdomain dnsmasq[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/addn_hosts - 0 addresses
Feb 01 09:54:54 np0005604215.localdomain dnsmasq-dhcp[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/host
Feb 01 09:54:54 np0005604215.localdomain dnsmasq-dhcp[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/opts
Feb 01 09:54:54 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:54.710 2 INFO neutron.agent.securitygroups_rpc [req-a8ee260e-a71a-4341-959a-47320de8959d req-f1ea5bc8-d304-408a-99f9-104affd65e7e ff35eaef616c4f428644a9a881f035d4 9bbefd3c06294b7fa7720ba6ca48fa4b - - default default] Security group member updated ['d6a2366a-be19-483b-bd9c-86227fb6f0c8']
Feb 01 09:54:54 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:54.736 2 INFO neutron.agent.securitygroups_rpc [None req-7f906aa9-468b-48e8-aab4-90305509c943 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 01 09:54:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:54.821 259225 INFO neutron.agent.dhcp.agent [None req-25ece3ce-b5eb-46bf-980f-f32b3ca69e1b - - - - - -] DHCP configuration for ports {'c56dc049-37f4-476f-8f50-1932d09f33f5'} is completed
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:54:55 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:55.541 2 INFO neutron.agent.securitygroups_rpc [None req-bc75b288-28fa-41e6-8b23-683fd10099a8 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2253095091' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #25. Immutable memtables: 0.
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.617917) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 11] Flushing memtable with next log file: 25
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695617953, "job": 11, "event": "flush_started", "num_memtables": 1, "num_entries": 843, "num_deletes": 255, "total_data_size": 981828, "memory_usage": 996824, "flush_reason": "Manual Compaction"}
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 11] Level-0 flush table #26: started
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695623768, "cf_name": "default", "job": 11, "event": "table_file_creation", "file_number": 26, "file_size": 634977, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 18677, "largest_seqno": 19515, "table_properties": {"data_size": 631278, "index_size": 1490, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1157, "raw_key_size": 9337, "raw_average_key_size": 20, "raw_value_size": 623480, "raw_average_value_size": 1379, "num_data_blocks": 65, "num_entries": 452, "num_filter_entries": 452, "num_deletions": 255, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939656, "oldest_key_time": 1769939656, "file_creation_time": 1769939695, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 26, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 11] Flush lasted 5897 microseconds, and 2698 cpu microseconds.
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.623811) [db/flush_job.cc:967] [default] [JOB 11] Level-0 flush table #26: 634977 bytes OK
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.623831) [db/memtable_list.cc:519] [default] Level-0 commit table #26 started
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.626185) [db/memtable_list.cc:722] [default] Level-0 commit table #26: memtable #1 done
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.626204) EVENT_LOG_v1 {"time_micros": 1769939695626198, "job": 11, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.626221) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 11] Try to delete WAL files size 977388, prev total WAL file size 977388, number of live WAL files 2.
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000022.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.626968) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131353436' seq:72057594037927935, type:22 .. '7061786F73003131373938' seq:0, type:0; will stop at (end)
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 12] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 11 Base level 0, inputs: [26(620KB)], [24(22MB)]
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695627012, "job": 12, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [26], "files_L6": [24], "score": -1, "input_data_size": 23962605, "oldest_snapshot_seqno": -1}
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 12] Generated table #27: 12445 keys, 21349444 bytes, temperature: kUnknown
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695757765, "cf_name": "default", "job": 12, "event": "table_file_creation", "file_number": 27, "file_size": 21349444, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21276706, "index_size": 40509, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31173, "raw_key_size": 335633, "raw_average_key_size": 26, "raw_value_size": 21062837, "raw_average_value_size": 1692, "num_data_blocks": 1533, "num_entries": 12445, "num_filter_entries": 12445, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769939695, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 27, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.757987) [db/compaction/compaction_job.cc:1663] [default] [JOB 12] Compacted 1@0 + 1@6 files to L6 => 21349444 bytes
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.763541) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 183.2 rd, 163.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.6, 22.2 +0.0 blob) out(20.4 +0.0 blob), read-write-amplify(71.4) write-amplify(33.6) OK, records in: 12973, records dropped: 528 output_compression: NoCompression
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.763581) EVENT_LOG_v1 {"time_micros": 1769939695763565, "job": 12, "event": "compaction_finished", "compaction_time_micros": 130814, "compaction_time_cpu_micros": 55897, "output_level": 6, "num_output_files": 1, "total_output_size": 21349444, "num_input_records": 12973, "num_output_records": 12445, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000026.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695763844, "job": 12, "event": "table_file_deletion", "file_number": 26}
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000024.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939695766017, "job": 12, "event": "table_file_deletion", "file_number": 24}
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.626862) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.766042) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.766047) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.766049) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.766051) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:55 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:55.766053) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:55.843 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v203: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 104 KiB/s wr, 22 op/s
Feb 01 09:54:56 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:56.699 2 INFO neutron.agent.securitygroups_rpc [None req-56483b67-01a0-4213-8f99-ef04c4ba0846 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 01 09:54:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:56.695 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:54:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:54:56.697 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:54:56 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:56.705 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:55Z, description=, device_id=42b688b0-4c84-4fa7-8d5b-06392b34bb1a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032310af0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032310370>], id=991bdc44-02fe-46ae-a7ac-3c925253bc9a, ip_allocation=immediate, mac_address=fa:16:3e:29:40:3b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:49Z, description=, dns_domain=, id=3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1709197007, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54827, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1357, status=ACTIVE, subnets=['21530e73-d947-4c03-bf9d-7cb1658ac535'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:54:52Z, vlan_transparent=None, network_id=3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1406, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:54:55Z on network 3a0bb9e2-95cc-4b20-87c6-1e5c55901a39
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #28. Immutable memtables: 0.
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.711090) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 13] Flushing memtable with next log file: 28
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696711126, "job": 13, "event": "flush_started", "num_memtables": 1, "num_entries": 258, "num_deletes": 251, "total_data_size": 23386, "memory_usage": 29536, "flush_reason": "Manual Compaction"}
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 13] Level-0 flush table #29: started
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696713560, "cf_name": "default", "job": 13, "event": "table_file_creation", "file_number": 29, "file_size": 14223, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19517, "largest_seqno": 19773, "table_properties": {"data_size": 12418, "index_size": 49, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 709, "raw_key_size": 5180, "raw_average_key_size": 20, "raw_value_size": 8969, "raw_average_value_size": 34, "num_data_blocks": 2, "num_entries": 257, "num_filter_entries": 257, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939696, "oldest_key_time": 1769939696, "file_creation_time": 1769939696, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 29, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 13] Flush lasted 2519 microseconds, and 890 cpu microseconds.
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.713607) [db/flush_job.cc:967] [default] [JOB 13] Level-0 flush table #29: 14223 bytes OK
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.713628) [db/memtable_list.cc:519] [default] Level-0 commit table #29 started
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.715317) [db/memtable_list.cc:722] [default] Level-0 commit table #29: memtable #1 done
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.715343) EVENT_LOG_v1 {"time_micros": 1769939696715337, "job": 13, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.715363) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 13] Try to delete WAL files size 21365, prev total WAL file size 21365, number of live WAL files 2.
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000025.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.715996) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740033373539' seq:72057594037927935, type:22 .. '6D6772737461740034303131' seq:0, type:0; will stop at (end)
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 14] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 13 Base level 0, inputs: [29(13KB)], [27(20MB)]
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696716029, "job": 14, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [29], "files_L6": [27], "score": -1, "input_data_size": 21363667, "oldest_snapshot_seqno": -1}
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1141559275' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/552575308' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: pgmap v203: 177 pgs: 177 active+clean; 225 MiB data, 895 MiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 104 KiB/s wr, 22 op/s
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 14] Generated table #30: 12195 keys, 19116671 bytes, temperature: kUnknown
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696825515, "cf_name": "default", "job": 14, "event": "table_file_creation", "file_number": 30, "file_size": 19116671, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19050547, "index_size": 34535, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 30533, "raw_key_size": 330632, "raw_average_key_size": 27, "raw_value_size": 18845929, "raw_average_value_size": 1545, "num_data_blocks": 1283, "num_entries": 12195, "num_filter_entries": 12195, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769939696, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 30, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.825825) [db/compaction/compaction_job.cc:1663] [default] [JOB 14] Compacted 1@0 + 1@6 files to L6 => 19116671 bytes
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.827626) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 194.9 rd, 174.4 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.0, 20.4 +0.0 blob) out(18.2 +0.0 blob), read-write-amplify(2846.1) write-amplify(1344.1) OK, records in: 12702, records dropped: 507 output_compression: NoCompression
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.827658) EVENT_LOG_v1 {"time_micros": 1769939696827645, "job": 14, "event": "compaction_finished", "compaction_time_micros": 109596, "compaction_time_cpu_micros": 43653, "output_level": 6, "num_output_files": 1, "total_output_size": 19116671, "num_input_records": 12702, "num_output_records": 12195, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000029.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696827803, "job": 14, "event": "table_file_deletion", "file_number": 29}
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000027.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939696830797, "job": 14, "event": "table_file_deletion", "file_number": 27}
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.715955) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.830851) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.830858) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.830861) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.830864) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:56 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:54:56.830867) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:54:56 np0005604215.localdomain dnsmasq[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/addn_hosts - 1 addresses
Feb 01 09:54:56 np0005604215.localdomain podman[307940]: 2026-02-01 09:54:56.895870276 +0000 UTC m=+0.054448877 container kill 5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:54:56 np0005604215.localdomain dnsmasq-dhcp[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/host
Feb 01 09:54:56 np0005604215.localdomain dnsmasq-dhcp[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/opts
Feb 01 09:54:57 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:57.073 259225 INFO neutron.agent.dhcp.agent [None req-a81291ef-6918-46b1-a3bc-1d8acebfb4aa - - - - - -] DHCP configuration for ports {'991bdc44-02fe-46ae-a7ac-3c925253bc9a'} is completed
Feb 01 09:54:57 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:57.723 2 INFO neutron.agent.securitygroups_rpc [None req-fccfae55-19b6-4730-a30c-150ca6a4b95d 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 01 09:54:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v204: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 78 KiB/s rd, 105 KiB/s wr, 42 op/s
Feb 01 09:54:58 np0005604215.localdomain ceph-mon[298604]: pgmap v204: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 78 KiB/s rd, 105 KiB/s wr, 42 op/s
Feb 01 09:54:58 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:58.464 2 INFO neutron.agent.securitygroups_rpc [None req-91b7ca55-4f70-425c-a0e2-0dabc032162c 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 01 09:54:59 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:59.072 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:54:55Z, description=, device_id=42b688b0-4c84-4fa7-8d5b-06392b34bb1a, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032369040>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032369a60>], id=991bdc44-02fe-46ae-a7ac-3c925253bc9a, ip_allocation=immediate, mac_address=fa:16:3e:29:40:3b, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:49Z, description=, dns_domain=, id=3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-1709197007, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=54827, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1357, status=ACTIVE, subnets=['21530e73-d947-4c03-bf9d-7cb1658ac535'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:54:52Z, vlan_transparent=None, network_id=3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1406, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:54:55Z on network 3a0bb9e2-95cc-4b20-87c6-1e5c55901a39
Feb 01 09:54:59 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:59.116 2 INFO neutron.agent.securitygroups_rpc [None req-e31219af-a122-457f-883b-2593c5b9c745 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 01 09:54:59 np0005604215.localdomain dnsmasq[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/addn_hosts - 1 addresses
Feb 01 09:54:59 np0005604215.localdomain dnsmasq-dhcp[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/host
Feb 01 09:54:59 np0005604215.localdomain dnsmasq-dhcp[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/opts
Feb 01 09:54:59 np0005604215.localdomain systemd[1]: tmp-crun.Rchxgr.mount: Deactivated successfully.
Feb 01 09:54:59 np0005604215.localdomain podman[307980]: 2026-02-01 09:54:59.264990894 +0000 UTC m=+0.063570501 container kill 5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 01 09:54:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:54:59 np0005604215.localdomain systemd[1]: tmp-crun.hKRZUQ.mount: Deactivated successfully.
Feb 01 09:54:59 np0005604215.localdomain podman[307995]: 2026-02-01 09:54:59.385788146 +0000 UTC m=+0.095294053 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 01 09:54:59 np0005604215.localdomain podman[307995]: 2026-02-01 09:54:59.424915009 +0000 UTC m=+0.134420906 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:54:59 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:54:59 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:54:59.537 259225 INFO neutron.agent.dhcp.agent [None req-abfe6113-2f9d-490d-b2c3-197feb8f6b12 - - - - - -] DHCP configuration for ports {'991bdc44-02fe-46ae-a7ac-3c925253bc9a'} is completed
Feb 01 09:54:59 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:54:59.635 2 INFO neutron.agent.securitygroups_rpc [None req-72904d6f-3e0e-4e9a-a3b1-a7457a722d24 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['0efcca4c-fcac-48b8-ae72-0742b6eb0b6c']
Feb 01 09:55:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:55:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:55:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:55:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 158990 "" "Go-http-client/1.1"
Feb 01 09:55:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:55:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19243 "" "Go-http-client/1.1"
Feb 01 09:55:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v205: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Feb 01 09:55:00 np0005604215.localdomain ceph-mon[298604]: pgmap v205: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Feb 01 09:55:00 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:00.498 259225 INFO neutron.agent.linux.ip_lib [None req-9f3711b0-0522-4e05-b7ad-f0342ca0796b - - - - - -] Device tapaebf5a93-f1 cannot be used as it has no MAC address
Feb 01 09:55:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:00.521 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:00 np0005604215.localdomain kernel: device tapaebf5a93-f1 entered promiscuous mode
Feb 01 09:55:00 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939700.5292] manager: (tapaebf5a93-f1): new Generic device (/org/freedesktop/NetworkManager/Devices/28)
Feb 01 09:55:00 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:00Z|00134|binding|INFO|Claiming lport aebf5a93-f1df-421a-8bc6-9d245205815f for this chassis.
Feb 01 09:55:00 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:00Z|00135|binding|INFO|aebf5a93-f1df-421a-8bc6-9d245205815f: Claiming unknown
Feb 01 09:55:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:00.529 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:00 np0005604215.localdomain systemd-udevd[308028]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:55:00 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:00.544 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-9ecb4282-8104-4878-8e0d-966d3ce505f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ecb4282-8104-4878-8e0d-966d3ce505f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e00f2ed54c74d70847b97f9f434e5e6', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64afe251-cbee-41d8-8098-a70c383c96db, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=aebf5a93-f1df-421a-8bc6-9d245205815f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:55:00 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:00.551 158655 INFO neutron.agent.ovn.metadata.agent [-] Port aebf5a93-f1df-421a-8bc6-9d245205815f in datapath 9ecb4282-8104-4878-8e0d-966d3ce505f1 bound to our chassis
Feb 01 09:55:00 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:00.553 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 9ecb4282-8104-4878-8e0d-966d3ce505f1 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:55:00 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:00.555 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[bdcc9dc5-0872-4013-9068-0a0284a6d490]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:55:00 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapaebf5a93-f1: No such device
Feb 01 09:55:00 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:00Z|00136|binding|INFO|Setting lport aebf5a93-f1df-421a-8bc6-9d245205815f ovn-installed in OVS
Feb 01 09:55:00 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:00Z|00137|binding|INFO|Setting lport aebf5a93-f1df-421a-8bc6-9d245205815f up in Southbound
Feb 01 09:55:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:00.574 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:00 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapaebf5a93-f1: No such device
Feb 01 09:55:00 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapaebf5a93-f1: No such device
Feb 01 09:55:00 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapaebf5a93-f1: No such device
Feb 01 09:55:00 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapaebf5a93-f1: No such device
Feb 01 09:55:00 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapaebf5a93-f1: No such device
Feb 01 09:55:00 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapaebf5a93-f1: No such device
Feb 01 09:55:00 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapaebf5a93-f1: No such device
Feb 01 09:55:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:00.613 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:00.636 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:00.844 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:01 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:01.120 2 INFO neutron.agent.securitygroups_rpc [None req-866dbbe7-cd9a-459b-9d5c-70660b94e103 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['e4f60f26-54df-4f21-8c82-cc76833023ab']
Feb 01 09:55:01 np0005604215.localdomain podman[308100]: 
Feb 01 09:55:01 np0005604215.localdomain podman[308100]: 2026-02-01 09:55:01.512725639 +0000 UTC m=+0.088246634 container create 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:55:01 np0005604215.localdomain systemd[1]: Started libpod-conmon-9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386.scope.
Feb 01 09:55:01 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:55:01 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4ca2471c03211d064700b7109f6deb39ac7204f972477d84418c84baebc05a05/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:55:01 np0005604215.localdomain podman[308100]: 2026-02-01 09:55:01.470616135 +0000 UTC m=+0.046137500 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:55:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:55:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:55:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:55:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:55:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:55:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:55:01 np0005604215.localdomain podman[308100]: 2026-02-01 09:55:01.577187807 +0000 UTC m=+0.152708782 container init 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 01 09:55:01 np0005604215.localdomain podman[308100]: 2026-02-01 09:55:01.589148157 +0000 UTC m=+0.164669152 container start 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Feb 01 09:55:01 np0005604215.localdomain dnsmasq[308119]: started, version 2.85 cachesize 150
Feb 01 09:55:01 np0005604215.localdomain dnsmasq[308119]: DNS service limited to local subnets
Feb 01 09:55:01 np0005604215.localdomain dnsmasq[308119]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:55:01 np0005604215.localdomain dnsmasq[308119]: warning: no upstream servers configured
Feb 01 09:55:01 np0005604215.localdomain dnsmasq-dhcp[308119]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:55:01 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 0 addresses
Feb 01 09:55:01 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:01 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:01.632 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:01Z, description=, device_id=68d684c2-2d8e-49d4-b723-69387fb1e9d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003236ffd0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003236ff10>], id=3bfd0ccd-300d-4a86-a663-a937cbd871e8, ip_allocation=immediate, mac_address=fa:16:3e:a1:4a:21, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:44Z, description=, dns_domain=, id=6c3db03b-523e-4bc1-b393-9ebce2d989a9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1734500773, port_security_enabled=True, project_id=b3e5e9f4ac99471688f0279d307f2650, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52069, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1306, status=ACTIVE, subnets=['68b5b999-50d8-4107-94e6-5f7c15e05d58'], tags=[], tenant_id=b3e5e9f4ac99471688f0279d307f2650, updated_at=2026-02-01T09:54:46Z, vlan_transparent=None, network_id=6c3db03b-523e-4bc1-b393-9ebce2d989a9, port_security_enabled=False, project_id=b3e5e9f4ac99471688f0279d307f2650, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1439, status=DOWN, tags=[], tenant_id=b3e5e9f4ac99471688f0279d307f2650, updated_at=2026-02-01T09:55:01Z on network 6c3db03b-523e-4bc1-b393-9ebce2d989a9
Feb 01 09:55:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:01.699 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:01 np0005604215.localdomain dnsmasq[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/addn_hosts - 1 addresses
Feb 01 09:55:01 np0005604215.localdomain dnsmasq-dhcp[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/host
Feb 01 09:55:01 np0005604215.localdomain dnsmasq-dhcp[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/opts
Feb 01 09:55:01 np0005604215.localdomain podman[308137]: 2026-02-01 09:55:01.804456479 +0000 UTC m=+0.057714139 container kill 413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6c3db03b-523e-4bc1-b393-9ebce2d989a9, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:55:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:01.807 259225 INFO neutron.agent.dhcp.agent [None req-cc352a97-5595-4bda-951c-24f03655f028 - - - - - -] DHCP configuration for ports {'46d348b0-12c7-4993-ab4e-2bb80e58ed68'} is completed
Feb 01 09:55:02 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:02.055 259225 INFO neutron.agent.dhcp.agent [None req-a9c866ca-e125-4d1a-81f5-2b8bf247b5e7 - - - - - -] DHCP configuration for ports {'3bfd0ccd-300d-4a86-a663-a937cbd871e8'} is completed
Feb 01 09:55:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v206: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Feb 01 09:55:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:55:02 np0005604215.localdomain ceph-mon[298604]: pgmap v206: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Feb 01 09:55:02 np0005604215.localdomain podman[308157]: 2026-02-01 09:55:02.592992872 +0000 UTC m=+0.058741141 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:55:02 np0005604215.localdomain podman[308157]: 2026-02-01 09:55:02.599491663 +0000 UTC m=+0.065239942 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:55:02 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:55:03 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:03.363 2 INFO neutron.agent.securitygroups_rpc [None req-d4cc719f-d42b-446b-935d-536d497f9b87 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:55:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:55:03 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:03.457 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:02Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003404e1f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032b68700>], id=33d00b58-8fe0-49bf-9ca6-6ab8e48b27a8, ip_allocation=immediate, mac_address=fa:16:3e:f6:75:9f, name=tempest-AllowedAddressPairTestJSON-226722493, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:58Z, description=, dns_domain=, id=9ecb4282-8104-4878-8e0d-966d3ce505f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1256779838, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46894, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1423, status=ACTIVE, subnets=['07a6bf4e-6f88-44a1-bfbd-3faf418e14ec'], tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:54:59Z, vlan_transparent=None, network_id=9ecb4282-8104-4878-8e0d-966d3ce505f1, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b'], standard_attr_id=1453, status=DOWN, tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:55:02Z on network 9ecb4282-8104-4878-8e0d-966d3ce505f1
Feb 01 09:55:03 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 1 addresses
Feb 01 09:55:03 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:03 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:03 np0005604215.localdomain podman[308199]: 2026-02-01 09:55:03.659513877 +0000 UTC m=+0.059054501 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:55:03 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:03.873 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:01Z, description=, device_id=68d684c2-2d8e-49d4-b723-69387fb1e9d1, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003236fd90>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032b4f8b0>], id=3bfd0ccd-300d-4a86-a663-a937cbd871e8, ip_allocation=immediate, mac_address=fa:16:3e:a1:4a:21, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:44Z, description=, dns_domain=, id=6c3db03b-523e-4bc1-b393-9ebce2d989a9, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersNegativeIpV6Test-test-network-1734500773, port_security_enabled=True, project_id=b3e5e9f4ac99471688f0279d307f2650, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=52069, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1306, status=ACTIVE, subnets=['68b5b999-50d8-4107-94e6-5f7c15e05d58'], tags=[], tenant_id=b3e5e9f4ac99471688f0279d307f2650, updated_at=2026-02-01T09:54:46Z, vlan_transparent=None, network_id=6c3db03b-523e-4bc1-b393-9ebce2d989a9, port_security_enabled=False, project_id=b3e5e9f4ac99471688f0279d307f2650, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1439, status=DOWN, tags=[], tenant_id=b3e5e9f4ac99471688f0279d307f2650, updated_at=2026-02-01T09:55:01Z on network 6c3db03b-523e-4bc1-b393-9ebce2d989a9
Feb 01 09:55:03 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:03.922 259225 INFO neutron.agent.dhcp.agent [None req-87c006d8-f43c-4bcd-8a52-3c4da0b64e23 - - - - - -] DHCP configuration for ports {'33d00b58-8fe0-49bf-9ca6-6ab8e48b27a8'} is completed
Feb 01 09:55:04 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:04.035 2 INFO neutron.agent.securitygroups_rpc [None req-a6c0da30-1e7d-4fdc-b34f-c8211b005180 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['dcd86290-3678-4dc4-8595-e876b5745966']
Feb 01 09:55:04 np0005604215.localdomain dnsmasq[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/addn_hosts - 1 addresses
Feb 01 09:55:04 np0005604215.localdomain podman[308237]: 2026-02-01 09:55:04.062222885 +0000 UTC m=+0.064997854 container kill 413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6c3db03b-523e-4bc1-b393-9ebce2d989a9, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 01 09:55:04 np0005604215.localdomain dnsmasq-dhcp[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/host
Feb 01 09:55:04 np0005604215.localdomain dnsmasq-dhcp[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/opts
Feb 01 09:55:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v207: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Feb 01 09:55:04 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:04.265 259225 INFO neutron.agent.dhcp.agent [None req-778cd3bb-243a-42bd-9307-72402127ed5d - - - - - -] DHCP configuration for ports {'3bfd0ccd-300d-4a86-a663-a937cbd871e8'} is completed
Feb 01 09:55:04 np0005604215.localdomain ceph-mon[298604]: pgmap v207: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 19 KiB/s rd, 14 KiB/s wr, 28 op/s
Feb 01 09:55:04 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:04.590 2 INFO neutron.agent.securitygroups_rpc [None req-14af839d-45a9-4f98-b03e-7a019e6f639f 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['dcd86290-3678-4dc4-8595-e876b5745966']
Feb 01 09:55:04 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:04.970 2 INFO neutron.agent.securitygroups_rpc [None req-e93a2aaf-9f80-41c1-a11d-131d09e57386 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:05 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:05.088 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032b499d0>], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:04Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00324ae040>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322c0070>], id=8f90dd66-37b5-4ed3-9dfb-df4f3d5ff644, ip_allocation=immediate, mac_address=fa:16:3e:96:a3:7e, name=tempest-AllowedAddressPairTestJSON-2125362275, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:58Z, description=, dns_domain=, id=9ecb4282-8104-4878-8e0d-966d3ce505f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1256779838, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46894, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1423, status=ACTIVE, subnets=['07a6bf4e-6f88-44a1-bfbd-3faf418e14ec'], tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:54:59Z, vlan_transparent=None, network_id=9ecb4282-8104-4878-8e0d-966d3ce505f1, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b'], standard_attr_id=1459, status=DOWN, tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:55:04Z on network 9ecb4282-8104-4878-8e0d-966d3ce505f1
Feb 01 09:55:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:05 np0005604215.localdomain dnsmasq[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/addn_hosts - 0 addresses
Feb 01 09:55:05 np0005604215.localdomain dnsmasq-dhcp[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/host
Feb 01 09:55:05 np0005604215.localdomain podman[308272]: 2026-02-01 09:55:05.172387603 +0000 UTC m=+0.058399119 container kill 5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:55:05 np0005604215.localdomain dnsmasq-dhcp[307921]: read /var/lib/neutron/dhcp/3a0bb9e2-95cc-4b20-87c6-1e5c55901a39/opts
Feb 01 09:55:05 np0005604215.localdomain systemd[1]: tmp-crun.Em7KP1.mount: Deactivated successfully.
Feb 01 09:55:05 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 2 addresses
Feb 01 09:55:05 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:05 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:05 np0005604215.localdomain podman[308305]: 2026-02-01 09:55:05.295386355 +0000 UTC m=+0.063772267 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:55:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:05.336 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:05 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:05Z|00138|binding|INFO|Releasing lport 663aeef3-4f9a-4e46-92e6-29e331b8f905 from this chassis (sb_readonly=0)
Feb 01 09:55:05 np0005604215.localdomain kernel: device tap663aeef3-4f left promiscuous mode
Feb 01 09:55:05 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:05Z|00139|binding|INFO|Setting lport 663aeef3-4f9a-4e46-92e6-29e331b8f905 down in Southbound
Feb 01 09:55:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:05.346 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c19dda83-2ee3-4143-9992-3940695b7883, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=663aeef3-4f9a-4e46-92e6-29e331b8f905) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:55:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:05.348 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 663aeef3-4f9a-4e46-92e6-29e331b8f905 in datapath 3a0bb9e2-95cc-4b20-87c6-1e5c55901a39 unbound from our chassis
Feb 01 09:55:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:05.350 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 3a0bb9e2-95cc-4b20-87c6-1e5c55901a39 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:55:05 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:05.351 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd33c81-a445-4f26-9204-765404456673]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:55:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:05.358 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:05 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:05.526 259225 INFO neutron.agent.dhcp.agent [None req-4fb1290a-041c-4efb-bccd-405fa6edef15 - - - - - -] DHCP configuration for ports {'8f90dd66-37b5-4ed3-9dfb-df4f3d5ff644'} is completed
Feb 01 09:55:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:05.883 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v208: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 938 B/s wr, 19 op/s
Feb 01 09:55:06 np0005604215.localdomain ceph-mon[298604]: pgmap v208: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 938 B/s wr, 19 op/s
Feb 01 09:55:06 np0005604215.localdomain podman[308347]: 2026-02-01 09:55:06.672173645 +0000 UTC m=+0.060482295 container kill 413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6c3db03b-523e-4bc1-b393-9ebce2d989a9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:55:06 np0005604215.localdomain dnsmasq[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/addn_hosts - 0 addresses
Feb 01 09:55:06 np0005604215.localdomain dnsmasq-dhcp[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/host
Feb 01 09:55:06 np0005604215.localdomain dnsmasq-dhcp[307740]: read /var/lib/neutron/dhcp/6c3db03b-523e-4bc1-b393-9ebce2d989a9/opts
Feb 01 09:55:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:06.701 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:07.044 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:07 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:07Z|00140|binding|INFO|Releasing lport 9f362718-c529-402d-be4c-23264e6d4d0a from this chassis (sb_readonly=0)
Feb 01 09:55:07 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:07Z|00141|binding|INFO|Setting lport 9f362718-c529-402d-be4c-23264e6d4d0a down in Southbound
Feb 01 09:55:07 np0005604215.localdomain kernel: device tap9f362718-c5 left promiscuous mode
Feb 01 09:55:07 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:07.060 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-6c3db03b-523e-4bc1-b393-9ebce2d989a9', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6c3db03b-523e-4bc1-b393-9ebce2d989a9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b3e5e9f4ac99471688f0279d307f2650', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=19c2267c-00a5-46e3-9993-22d0e5d1c93f, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=9f362718-c529-402d-be4c-23264e6d4d0a) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:55:07 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:07.062 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 9f362718-c529-402d-be4c-23264e6d4d0a in datapath 6c3db03b-523e-4bc1-b393-9ebce2d989a9 unbound from our chassis
Feb 01 09:55:07 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:07.064 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6c3db03b-523e-4bc1-b393-9ebce2d989a9 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:55:07 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:07.065 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[67171feb-01bb-47ff-8f38-ad261752e1a5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:55:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:07.070 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:07 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:07.158 2 INFO neutron.agent.securitygroups_rpc [None req-943127f9-174a-469a-a7d0-e33db638b827 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:07 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:07.306 2 INFO neutron.agent.securitygroups_rpc [None req-eb9a1f3c-34ee-4016-ae11-84944f9bb005 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['a5467c7c-cb9b-4aeb-bb09-b5bf7707aed9']
Feb 01 09:55:07 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 1 addresses
Feb 01 09:55:07 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:07 np0005604215.localdomain podman[308388]: 2026-02-01 09:55:07.569255811 +0000 UTC m=+0.060809095 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:55:07 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:07 np0005604215.localdomain dnsmasq[307921]: exiting on receipt of SIGTERM
Feb 01 09:55:07 np0005604215.localdomain podman[308423]: 2026-02-01 09:55:07.737637868 +0000 UTC m=+0.062204168 container kill 5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:55:07 np0005604215.localdomain systemd[1]: libpod-5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf.scope: Deactivated successfully.
Feb 01 09:55:07 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:07.780 2 INFO neutron.agent.securitygroups_rpc [None req-7cd63175-19a6-47c4-a54b-1047c35ebff0 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['a5467c7c-cb9b-4aeb-bb09-b5bf7707aed9']
Feb 01 09:55:07 np0005604215.localdomain podman[308436]: 2026-02-01 09:55:07.807970397 +0000 UTC m=+0.059395931 container died 5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:55:07 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf-userdata-shm.mount: Deactivated successfully.
Feb 01 09:55:07 np0005604215.localdomain podman[308436]: 2026-02-01 09:55:07.843751086 +0000 UTC m=+0.095176580 container cleanup 5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:55:07 np0005604215.localdomain systemd[1]: libpod-conmon-5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf.scope: Deactivated successfully.
Feb 01 09:55:07 np0005604215.localdomain podman[308443]: 2026-02-01 09:55:07.886680526 +0000 UTC m=+0.127478730 container remove 5687c71923a452eddfa3c30666f0b986c793de747f66a501e953d12eec06d6bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-3a0bb9e2-95cc-4b20-87c6-1e5c55901a39, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 01 09:55:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v209: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 938 B/s wr, 19 op/s
Feb 01 09:55:08 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:08.209 259225 INFO neutron.agent.dhcp.agent [None req-ce40eaba-906b-4323-8472-c4affb5dcffe - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:08 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:08.292 2 INFO neutron.agent.securitygroups_rpc [None req-d9ca590a-9c4b-410e-b353-3b648a203b3e cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:08 np0005604215.localdomain ceph-mon[298604]: pgmap v209: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 13 KiB/s rd, 938 B/s wr, 19 op/s
Feb 01 09:55:08 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:08.331 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:07Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032318250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032318070>], id=53908e93-cb8b-4f2a-84a4-322ab382b07b, ip_allocation=immediate, mac_address=fa:16:3e:ff:2b:2b, name=tempest-AllowedAddressPairTestJSON-332394374, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:58Z, description=, dns_domain=, id=9ecb4282-8104-4878-8e0d-966d3ce505f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1256779838, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46894, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1423, status=ACTIVE, subnets=['07a6bf4e-6f88-44a1-bfbd-3faf418e14ec'], tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:54:59Z, vlan_transparent=None, network_id=9ecb4282-8104-4878-8e0d-966d3ce505f1, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b'], standard_attr_id=1466, status=DOWN, tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:55:08Z on network 9ecb4282-8104-4878-8e0d-966d3ce505f1
Feb 01 09:55:08 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 2 addresses
Feb 01 09:55:08 np0005604215.localdomain podman[308482]: 2026-02-01 09:55:08.552659132 +0000 UTC m=+0.063726406 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 01 09:55:08 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:08 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c010c25732730f6cf24926dbc34c994a3de25b1f3464184ac39f6a9aeb5aeb15-merged.mount: Deactivated successfully.
Feb 01 09:55:08 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d3a0bb9e2\x2d95cc\x2d4b20\x2d87c6\x2d1e5c55901a39.mount: Deactivated successfully.
Feb 01 09:55:08 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:08.785 259225 INFO neutron.agent.dhcp.agent [None req-22b3d707-c8ee-415d-a48c-f813a6026f76 - - - - - -] DHCP configuration for ports {'53908e93-cb8b-4f2a-84a4-322ab382b07b'} is completed
Feb 01 09:55:09 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:09.357 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:09 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:09.586 2 INFO neutron.agent.securitygroups_rpc [None req-efaae1c0-3e5f-4a79-9705-2ee5fc658831 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']
Feb 01 09:55:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:09.816 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v210: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:10 np0005604215.localdomain ceph-mon[298604]: pgmap v210: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:10 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:10.360 2 INFO neutron.agent.securitygroups_rpc [None req-c53a02fb-f287-4ee9-90b1-fd2dea7e171b 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']
Feb 01 09:55:10 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:10.373 2 INFO neutron.agent.securitygroups_rpc [None req-8c9b29e3-84f5-4ecc-a69f-55a0bba3249d cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:10 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 1 addresses
Feb 01 09:55:10 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:10 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:10 np0005604215.localdomain podman[308520]: 2026-02-01 09:55:10.675036644 +0000 UTC m=+0.057841823 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:55:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:10.939 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:11 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:11.310 2 INFO neutron.agent.securitygroups_rpc [None req-f752f535-7480-4edb-a5ff-a77295c4683e 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']
Feb 01 09:55:11 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:11.631 2 INFO neutron.agent.securitygroups_rpc [None req-61a06c10-15de-40c3-8b7d-3148d6b4f873 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:11 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:11.703 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:11Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323ab400>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323ab8b0>], id=8b27933e-cee1-4d4a-b715-b56f1202d238, ip_allocation=immediate, mac_address=fa:16:3e:70:3a:22, name=tempest-AllowedAddressPairTestJSON-1293059564, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:58Z, description=, dns_domain=, id=9ecb4282-8104-4878-8e0d-966d3ce505f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1256779838, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46894, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1423, status=ACTIVE, subnets=['07a6bf4e-6f88-44a1-bfbd-3faf418e14ec'], tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:54:59Z, vlan_transparent=None, network_id=9ecb4282-8104-4878-8e0d-966d3ce505f1, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b'], standard_attr_id=1501, status=DOWN, tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:55:11Z on network 9ecb4282-8104-4878-8e0d-966d3ce505f1
Feb 01 09:55:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:11.703 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:11 np0005604215.localdomain systemd[1]: tmp-crun.Mrz2iy.mount: Deactivated successfully.
Feb 01 09:55:11 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 2 addresses
Feb 01 09:55:11 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:11 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:11 np0005604215.localdomain podman[308558]: 2026-02-01 09:55:11.92744936 +0000 UTC m=+0.071567468 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:55:12 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:12.010 2 INFO neutron.agent.securitygroups_rpc [None req-c4d0e02a-d41e-4d38-a096-515e00cc05ca 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']
Feb 01 09:55:12 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:12.204 259225 INFO neutron.agent.dhcp.agent [None req-e7be9a8e-5765-4b78-9e79-53b88e8f7202 - - - - - -] DHCP configuration for ports {'8b27933e-cee1-4d4a-b715-b56f1202d238'} is completed
Feb 01 09:55:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v211: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:12 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:12.301 2 INFO neutron.agent.securitygroups_rpc [None req-7c1928f8-098c-4c19-bf11-b8c429ffbdda 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']
Feb 01 09:55:12 np0005604215.localdomain ceph-mon[298604]: pgmap v211: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:12 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:12.948 2 INFO neutron.agent.securitygroups_rpc [None req-28204c12-7df5-4c98-a342-64d5ab507d83 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['438bf7d2-5c8f-4a0b-9c04-1bbb91f9d2e5']
Feb 01 09:55:13 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:13.817 2 INFO neutron.agent.securitygroups_rpc [None req-5b09233a-6601-4e83-a057-9181035eb7ab cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:13 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:13.895 2 INFO neutron.agent.securitygroups_rpc [None req-0b649a65-4805-4006-9abf-770c26af78b1 930a89cab3af43239942c71cee47dc19 904cc8942364443bb4c4a4017bb1e647 - - default default] Security group member updated ['4db01845-8230-4c8d-a3f4-5e942e576ef7']
Feb 01 09:55:14 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 1 addresses
Feb 01 09:55:14 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:14 np0005604215.localdomain podman[308596]: 2026-02-01 09:55:14.080369768 +0000 UTC m=+0.059147554 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 01 09:55:14 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v212: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:14 np0005604215.localdomain ceph-mon[298604]: pgmap v212: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:14 np0005604215.localdomain podman[308634]: 2026-02-01 09:55:14.636115668 +0000 UTC m=+0.062772886 container kill 413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6c3db03b-523e-4bc1-b393-9ebce2d989a9, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:55:14 np0005604215.localdomain dnsmasq[307740]: exiting on receipt of SIGTERM
Feb 01 09:55:14 np0005604215.localdomain systemd[1]: libpod-413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf.scope: Deactivated successfully.
Feb 01 09:55:14 np0005604215.localdomain podman[308648]: 2026-02-01 09:55:14.705987334 +0000 UTC m=+0.054246093 container died 413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6c3db03b-523e-4bc1-b393-9ebce2d989a9, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:55:14 np0005604215.localdomain podman[308648]: 2026-02-01 09:55:14.734747434 +0000 UTC m=+0.083006143 container cleanup 413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6c3db03b-523e-4bc1-b393-9ebce2d989a9, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 01 09:55:14 np0005604215.localdomain systemd[1]: libpod-conmon-413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf.scope: Deactivated successfully.
Feb 01 09:55:14 np0005604215.localdomain podman[308650]: 2026-02-01 09:55:14.785392483 +0000 UTC m=+0.127980806 container remove 413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6c3db03b-523e-4bc1-b393-9ebce2d989a9, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:55:14 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:14.804 2 INFO neutron.agent.securitygroups_rpc [None req-8cf1a02e-dfaa-4f95-a3f5-4d4a9a4c833f 3874381a42e5464e990880c51dfe02ee b7c21c7c3be54a6ca3c24d0fe0d75778 - - default default] Security group rule updated ['be540add-f8ad-43d9-9aea-3a58bb289e01']
Feb 01 09:55:14 np0005604215.localdomain systemd[1]: virtsecretd.service: Deactivated successfully.
Feb 01 09:55:15 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-2be8a4cf6f9b91e437cac744565b8665b9984dca2b679012aff6c3eac617c63b-merged.mount: Deactivated successfully.
Feb 01 09:55:15 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-413a89d18eb25a0ca4c567ed78323c32827f90fc85cb2f08860837c4e3da79bf-userdata-shm.mount: Deactivated successfully.
Feb 01 09:55:15 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:15.087 259225 INFO neutron.agent.dhcp.agent [None req-0c4610f4-844a-488b-b858-a36ad28e1eec - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:15 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d6c3db03b\x2d523e\x2d4bc1\x2db393\x2d9ebce2d989a9.mount: Deactivated successfully.
Feb 01 09:55:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:15 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:15.462 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:15 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:15.492 2 INFO neutron.agent.securitygroups_rpc [None req-018e5631-8c87-44c1-9046-0c9a678cac95 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:15 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:15.606 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:14Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032bb7d30>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032b52130>], id=a341db71-f700-4b35-9ac8-bb001c1c4e94, ip_allocation=immediate, mac_address=fa:16:3e:9e:68:6e, name=tempest-AllowedAddressPairTestJSON-1821801941, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:58Z, description=, dns_domain=, id=9ecb4282-8104-4878-8e0d-966d3ce505f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1256779838, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46894, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1423, status=ACTIVE, subnets=['07a6bf4e-6f88-44a1-bfbd-3faf418e14ec'], tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:54:59Z, vlan_transparent=None, network_id=9ecb4282-8104-4878-8e0d-966d3ce505f1, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b'], standard_attr_id=1514, status=DOWN, tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:55:14Z on network 9ecb4282-8104-4878-8e0d-966d3ce505f1
Feb 01 09:55:15 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 2 addresses
Feb 01 09:55:15 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:15 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:15 np0005604215.localdomain podman[308695]: 2026-02-01 09:55:15.819240018 +0000 UTC m=+0.058717411 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 01 09:55:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:15.942 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:16 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:16.092 259225 INFO neutron.agent.dhcp.agent [None req-f476aad5-2c09-4c3f-8a68-5289e94faf03 - - - - - -] DHCP configuration for ports {'a341db71-f700-4b35-9ac8-bb001c1c4e94'} is completed
Feb 01 09:55:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v213: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:16 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:16.300 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:16 np0005604215.localdomain ceph-mon[298604]: pgmap v213: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:16 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:16.706 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:55:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:55:17 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:17.787 2 INFO neutron.agent.securitygroups_rpc [None req-ab764583-db62-4331-9e90-bc94b6b4e26a cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:17 np0005604215.localdomain systemd[1]: tmp-crun.3zNBIk.mount: Deactivated successfully.
Feb 01 09:55:17 np0005604215.localdomain podman[308717]: 2026-02-01 09:55:17.856528944 +0000 UTC m=+0.072407765 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:55:17 np0005604215.localdomain podman[308718]: 2026-02-01 09:55:17.874637615 +0000 UTC m=+0.085666996 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:55:17 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:17.883 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:16Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322c5c40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322c5130>], id=0ca28df6-cb43-4342-8724-5b036d328dce, ip_allocation=immediate, mac_address=fa:16:3e:6a:39:a6, name=tempest-AllowedAddressPairTestJSON-943410257, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:54:58Z, description=, dns_domain=, id=9ecb4282-8104-4878-8e0d-966d3ce505f1, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-AllowedAddressPairTestJSON-test-network-1256779838, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=46894, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1423, status=ACTIVE, subnets=['07a6bf4e-6f88-44a1-bfbd-3faf418e14ec'], tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:54:59Z, vlan_transparent=None, network_id=9ecb4282-8104-4878-8e0d-966d3ce505f1, port_security_enabled=True, project_id=7e00f2ed54c74d70847b97f9f434e5e6, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b'], standard_attr_id=1527, status=DOWN, tags=[], tenant_id=7e00f2ed54c74d70847b97f9f434e5e6, updated_at=2026-02-01T09:55:17Z on network 9ecb4282-8104-4878-8e0d-966d3ce505f1
Feb 01 09:55:17 np0005604215.localdomain podman[308718]: 2026-02-01 09:55:17.955350085 +0000 UTC m=+0.166379476 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:55:17 np0005604215.localdomain podman[308717]: 2026-02-01 09:55:17.963033574 +0000 UTC m=+0.178912355 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true)
Feb 01 09:55:17 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:55:17 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:55:18 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 3 addresses
Feb 01 09:55:18 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:18 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:18 np0005604215.localdomain podman[308782]: 2026-02-01 09:55:18.085714575 +0000 UTC m=+0.068678009 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 01 09:55:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v214: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:18 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:18.267 259225 INFO neutron.agent.dhcp.agent [None req-e0261892-fad4-4779-acc9-60ee82c0ef91 - - - - - -] DHCP configuration for ports {'0ca28df6-cb43-4342-8724-5b036d328dce'} is completed
Feb 01 09:55:18 np0005604215.localdomain ceph-mon[298604]: pgmap v214: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:18 np0005604215.localdomain systemd[1]: tmp-crun.DjYnRJ.mount: Deactivated successfully.
Feb 01 09:55:19 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:19.335 2 INFO neutron.agent.securitygroups_rpc [None req-fa6c5d49-c4dc-46a9-9459-bf711075cc0d 930a89cab3af43239942c71cee47dc19 904cc8942364443bb4c4a4017bb1e647 - - default default] Security group member updated ['4db01845-8230-4c8d-a3f4-5e942e576ef7']
Feb 01 09:55:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v215: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:20 np0005604215.localdomain ceph-mon[298604]: pgmap v215: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:20 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:20.549 2 INFO neutron.agent.securitygroups_rpc [None req-6dfed5db-0b52-4f95-900a-39e3e0691fbf cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:20 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 2 addresses
Feb 01 09:55:20 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:20 np0005604215.localdomain podman[308820]: 2026-02-01 09:55:20.800010598 +0000 UTC m=+0.058121182 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 01 09:55:20 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:20.968 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:21 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:21.280 2 INFO neutron.agent.securitygroups_rpc [None req-cbc03874-de2d-4db1-8aeb-b67049c1615b cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:55:21
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['vms', 'backups', '.mgr', 'manila_data', 'manila_metadata', 'volumes', 'images']
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:55:21 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 1 addresses
Feb 01 09:55:21 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:21 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:21 np0005604215.localdomain podman[308856]: 2026-02-01 09:55:21.509469681 +0000 UTC m=+0.059815964 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.7263051367950866e-06 of space, bias 4.0, pg target 0.002170138888888889 quantized to 16 (current 16)
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:55:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:55:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:21.708 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:22 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:22.096 2 INFO neutron.agent.securitygroups_rpc [None req-22955056-d501-42ef-9a2c-bf3d181d8fe4 cc5a6d3f99ef4279ae1c5508734703e2 7e00f2ed54c74d70847b97f9f434e5e6 - - default default] Security group member updated ['d3c7388a-ba3b-4ea2-bcb7-369aeac4af1b']
Feb 01 09:55:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v216: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:22 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:22.220 2 INFO neutron.agent.securitygroups_rpc [None req-68b28681-9762-4f24-92af-1fa7309650a4 edcc55a03c02426f897467232a84b22e eeec82e52999475da0fa4e4a4a8effbd - - default default] Security group rule updated ['150b315a-79ca-493c-98be-8b45107659c4']
Feb 01 09:55:22 np0005604215.localdomain dnsmasq[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/addn_hosts - 0 addresses
Feb 01 09:55:22 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/host
Feb 01 09:55:22 np0005604215.localdomain dnsmasq-dhcp[308119]: read /var/lib/neutron/dhcp/9ecb4282-8104-4878-8e0d-966d3ce505f1/opts
Feb 01 09:55:22 np0005604215.localdomain podman[308893]: 2026-02-01 09:55:22.333165602 +0000 UTC m=+0.062721394 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 01 09:55:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:55:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:55:22 np0005604215.localdomain podman[308907]: 2026-02-01 09:55:22.449069124 +0000 UTC m=+0.085037606 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, maintainer=Red Hat, Inc., name=ubi9/ubi-minimal, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, config_id=openstack_network_exporter, architecture=x86_64, version=9.7, io.buildah.version=1.33.7, vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z)
Feb 01 09:55:22 np0005604215.localdomain podman[308907]: 2026-02-01 09:55:22.463708938 +0000 UTC m=+0.099677400 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, release=1769056855, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, name=ubi9/ubi-minimal, managed_by=edpm_ansible, container_name=openstack_network_exporter, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 01 09:55:22 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:55:22 np0005604215.localdomain systemd[1]: tmp-crun.VzOZIy.mount: Deactivated successfully.
Feb 01 09:55:22 np0005604215.localdomain podman[308908]: 2026-02-01 09:55:22.556539834 +0000 UTC m=+0.186182160 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 01 09:55:22 np0005604215.localdomain ceph-mon[298604]: pgmap v216: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:22 np0005604215.localdomain podman[308908]: 2026-02-01 09:55:22.590804975 +0000 UTC m=+0.220447301 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:55:22 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:55:23 np0005604215.localdomain dnsmasq[308119]: exiting on receipt of SIGTERM
Feb 01 09:55:23 np0005604215.localdomain podman[308967]: 2026-02-01 09:55:23.061043316 +0000 UTC m=+0.054619053 container kill 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:55:23 np0005604215.localdomain systemd[1]: libpod-9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386.scope: Deactivated successfully.
Feb 01 09:55:23 np0005604215.localdomain podman[308979]: 2026-02-01 09:55:23.134954606 +0000 UTC m=+0.058481402 container died 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:55:23 np0005604215.localdomain podman[308979]: 2026-02-01 09:55:23.166461963 +0000 UTC m=+0.089988719 container cleanup 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 01 09:55:23 np0005604215.localdomain systemd[1]: libpod-conmon-9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386.scope: Deactivated successfully.
Feb 01 09:55:23 np0005604215.localdomain podman[308981]: 2026-02-01 09:55:23.214889993 +0000 UTC m=+0.131765934 container remove 9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-9ecb4282-8104-4878-8e0d-966d3ce505f1, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 01 09:55:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:23.262 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:23 np0005604215.localdomain kernel: device tapaebf5a93-f1 left promiscuous mode
Feb 01 09:55:23 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:23Z|00142|binding|INFO|Releasing lport aebf5a93-f1df-421a-8bc6-9d245205815f from this chassis (sb_readonly=0)
Feb 01 09:55:23 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:23Z|00143|binding|INFO|Setting lport aebf5a93-f1df-421a-8bc6-9d245205815f down in Southbound
Feb 01 09:55:23 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:23.277 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-9ecb4282-8104-4878-8e0d-966d3ce505f1', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-9ecb4282-8104-4878-8e0d-966d3ce505f1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '7e00f2ed54c74d70847b97f9f434e5e6', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=64afe251-cbee-41d8-8098-a70c383c96db, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=aebf5a93-f1df-421a-8bc6-9d245205815f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:55:23 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:23.279 158655 INFO neutron.agent.ovn.metadata.agent [-] Port aebf5a93-f1df-421a-8bc6-9d245205815f in datapath 9ecb4282-8104-4878-8e0d-966d3ce505f1 unbound from our chassis
Feb 01 09:55:23 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:23.282 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 9ecb4282-8104-4878-8e0d-966d3ce505f1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:55:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:23.283 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:23 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:23.285 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[7c20b04b-5332-4749-a622-1b2ccfb9220c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:55:23 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4ca2471c03211d064700b7109f6deb39ac7204f972477d84418c84baebc05a05-merged.mount: Deactivated successfully.
Feb 01 09:55:23 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d217b20f4d879dfce4c9830258190d034479b66007eecdbbb50fff980de1386-userdata-shm.mount: Deactivated successfully.
Feb 01 09:55:23 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:23.553 259225 INFO neutron.agent.dhcp.agent [None req-26241b25-ba26-48db-bff7-7bc075a8eea3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:23 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:23.554 259225 INFO neutron.agent.dhcp.agent [None req-26241b25-ba26-48db-bff7-7bc075a8eea3 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:23 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d9ecb4282\x2d8104\x2d4878\x2d8e0d\x2d966d3ce505f1.mount: Deactivated successfully.
Feb 01 09:55:24 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:24.057 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v217: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:24 np0005604215.localdomain ceph-mon[298604]: pgmap v217: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:24.446 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e123 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e124 e124: 6 total, 6 up, 6 in
Feb 01 09:55:25 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:25.353 259225 INFO neutron.agent.linux.ip_lib [None req-eeefd9da-3351-403c-aeb1-adf9d385a92d - - - - - -] Device tapeaa11732-01 cannot be used as it has no MAC address
Feb 01 09:55:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:25.379 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:25 np0005604215.localdomain kernel: device tapeaa11732-01 entered promiscuous mode
Feb 01 09:55:25 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939725.3854] manager: (tapeaa11732-01): new Generic device (/org/freedesktop/NetworkManager/Devices/29)
Feb 01 09:55:25 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:25Z|00144|binding|INFO|Claiming lport eaa11732-01f9-489d-9b7f-3f8e2175bbb2 for this chassis.
Feb 01 09:55:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:25.387 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:25 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:25Z|00145|binding|INFO|eaa11732-01f9-489d-9b7f-3f8e2175bbb2: Claiming unknown
Feb 01 09:55:25 np0005604215.localdomain systemd-udevd[309020]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:55:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:25.392 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:25 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:25.397 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-0f958be9-2a71-46b8-be29-bf69d602dea7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f958be9-2a71-46b8-be29-bf69d602dea7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27418984-2ed5-4c42-a2ac-15243821a950, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=eaa11732-01f9-489d-9b7f-3f8e2175bbb2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:55:25 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:25.399 158655 INFO neutron.agent.ovn.metadata.agent [-] Port eaa11732-01f9-489d-9b7f-3f8e2175bbb2 in datapath 0f958be9-2a71-46b8-be29-bf69d602dea7 bound to our chassis
Feb 01 09:55:25 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:25.401 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0f958be9-2a71-46b8-be29-bf69d602dea7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:55:25 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:25.402 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[dd770504-095f-401d-a7bb-3c70844baa0b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:55:25 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapeaa11732-01: No such device
Feb 01 09:55:25 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:25Z|00146|binding|INFO|Setting lport eaa11732-01f9-489d-9b7f-3f8e2175bbb2 ovn-installed in OVS
Feb 01 09:55:25 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:25Z|00147|binding|INFO|Setting lport eaa11732-01f9-489d-9b7f-3f8e2175bbb2 up in Southbound
Feb 01 09:55:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:25.420 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:25 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapeaa11732-01: No such device
Feb 01 09:55:25 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapeaa11732-01: No such device
Feb 01 09:55:25 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapeaa11732-01: No such device
Feb 01 09:55:25 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapeaa11732-01: No such device
Feb 01 09:55:25 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapeaa11732-01: No such device
Feb 01 09:55:25 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapeaa11732-01: No such device
Feb 01 09:55:25 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapeaa11732-01: No such device
Feb 01 09:55:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:25.458 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:25.527 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:25.970 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v219: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:26 np0005604215.localdomain podman[309091]: 
Feb 01 09:55:26 np0005604215.localdomain podman[309091]: 2026-02-01 09:55:26.277726995 +0000 UTC m=+0.086795641 container create 1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f958be9-2a71-46b8-be29-bf69d602dea7, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:55:26 np0005604215.localdomain systemd[1]: Started libpod-conmon-1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4.scope.
Feb 01 09:55:26 np0005604215.localdomain systemd[1]: tmp-crun.jV521i.mount: Deactivated successfully.
Feb 01 09:55:26 np0005604215.localdomain podman[309091]: 2026-02-01 09:55:26.233626349 +0000 UTC m=+0.042695025 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:55:26 np0005604215.localdomain ceph-mon[298604]: osdmap e124: 6 total, 6 up, 6 in
Feb 01 09:55:26 np0005604215.localdomain ceph-mon[298604]: pgmap v219: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:26 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:55:26 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/368071a3dbed367c88ceb2ce8433e3b17be46f6454c6658be28f451008d9eca3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:55:26 np0005604215.localdomain podman[309091]: 2026-02-01 09:55:26.36115942 +0000 UTC m=+0.170228066 container init 1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f958be9-2a71-46b8-be29-bf69d602dea7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 01 09:55:26 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e125 e125: 6 total, 6 up, 6 in
Feb 01 09:55:26 np0005604215.localdomain podman[309091]: 2026-02-01 09:55:26.36987972 +0000 UTC m=+0.178948376 container start 1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f958be9-2a71-46b8-be29-bf69d602dea7, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 01 09:55:26 np0005604215.localdomain dnsmasq[309109]: started, version 2.85 cachesize 150
Feb 01 09:55:26 np0005604215.localdomain dnsmasq[309109]: DNS service limited to local subnets
Feb 01 09:55:26 np0005604215.localdomain dnsmasq[309109]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:55:26 np0005604215.localdomain dnsmasq[309109]: warning: no upstream servers configured
Feb 01 09:55:26 np0005604215.localdomain dnsmasq-dhcp[309109]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 01 09:55:26 np0005604215.localdomain dnsmasq[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/addn_hosts - 0 addresses
Feb 01 09:55:26 np0005604215.localdomain dnsmasq-dhcp[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/host
Feb 01 09:55:26 np0005604215.localdomain dnsmasq-dhcp[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/opts
Feb 01 09:55:26 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:26.444 259225 INFO neutron.agent.dhcp.agent [None req-eeefd9da-3351-403c-aeb1-adf9d385a92d - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:25Z, description=, device_id=faab0309-c85c-4332-a9f4-449a0ffeae16, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003232f250>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003232fe20>], id=355f615d-2122-4356-9769-53760b28d43d, ip_allocation=immediate, mac_address=fa:16:3e:ea:eb:ca, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:22Z, description=, dns_domain=, id=0f958be9-2a71-46b8-be29-bf69d602dea7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-160748546, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40133, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1549, status=ACTIVE, subnets=['93832961-de74-40ac-80ba-b5c061596bf4'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:24Z, vlan_transparent=None, network_id=0f958be9-2a71-46b8-be29-bf69d602dea7, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1571, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:25Z on network 0f958be9-2a71-46b8-be29-bf69d602dea7
Feb 01 09:55:26 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:26.555 259225 INFO neutron.agent.dhcp.agent [None req-b19bd063-8896-4fe7-bd72-4c53d5aa80ed - - - - - -] DHCP configuration for ports {'862ccf8d-641e-477e-8e4f-8de14626e350'} is completed
Feb 01 09:55:26 np0005604215.localdomain dnsmasq[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/addn_hosts - 1 addresses
Feb 01 09:55:26 np0005604215.localdomain dnsmasq-dhcp[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/host
Feb 01 09:55:26 np0005604215.localdomain dnsmasq-dhcp[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/opts
Feb 01 09:55:26 np0005604215.localdomain podman[309128]: 2026-02-01 09:55:26.629930628 +0000 UTC m=+0.060055151 container kill 1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f958be9-2a71-46b8-be29-bf69d602dea7, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 01 09:55:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:26.709 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:26 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:26.901 259225 INFO neutron.agent.dhcp.agent [None req-677b17b3-5850-47cf-89df-6737ac4422e6 - - - - - -] DHCP configuration for ports {'355f615d-2122-4356-9769-53760b28d43d'} is completed
Feb 01 09:55:27 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:27.130 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:25Z, description=, device_id=faab0309-c85c-4332-a9f4-449a0ffeae16, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032310c70>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032310df0>], id=355f615d-2122-4356-9769-53760b28d43d, ip_allocation=immediate, mac_address=fa:16:3e:ea:eb:ca, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:22Z, description=, dns_domain=, id=0f958be9-2a71-46b8-be29-bf69d602dea7, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-160748546, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40133, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1549, status=ACTIVE, subnets=['93832961-de74-40ac-80ba-b5c061596bf4'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:24Z, vlan_transparent=None, network_id=0f958be9-2a71-46b8-be29-bf69d602dea7, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1571, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:55:25Z on network 0f958be9-2a71-46b8-be29-bf69d602dea7
Feb 01 09:55:27 np0005604215.localdomain podman[309168]: 2026-02-01 09:55:27.317708188 +0000 UTC m=+0.057848662 container kill 1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f958be9-2a71-46b8-be29-bf69d602dea7, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:55:27 np0005604215.localdomain dnsmasq[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/addn_hosts - 1 addresses
Feb 01 09:55:27 np0005604215.localdomain dnsmasq-dhcp[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/host
Feb 01 09:55:27 np0005604215.localdomain dnsmasq-dhcp[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/opts
Feb 01 09:55:27 np0005604215.localdomain ceph-mon[298604]: osdmap e125: 6 total, 6 up, 6 in
Feb 01 09:55:27 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:27.864 259225 INFO neutron.agent.dhcp.agent [None req-45d44043-08f4-444f-aaf8-3a05b2f1566a - - - - - -] DHCP configuration for ports {'355f615d-2122-4356-9769-53760b28d43d'} is completed
Feb 01 09:55:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v221: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 767 B/s wr, 1 op/s
Feb 01 09:55:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e126 e126: 6 total, 6 up, 6 in
Feb 01 09:55:28 np0005604215.localdomain ceph-mon[298604]: pgmap v221: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 767 B/s wr, 1 op/s
Feb 01 09:55:28 np0005604215.localdomain dnsmasq[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/addn_hosts - 0 addresses
Feb 01 09:55:28 np0005604215.localdomain dnsmasq-dhcp[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/host
Feb 01 09:55:28 np0005604215.localdomain dnsmasq-dhcp[309109]: read /var/lib/neutron/dhcp/0f958be9-2a71-46b8-be29-bf69d602dea7/opts
Feb 01 09:55:28 np0005604215.localdomain podman[309205]: 2026-02-01 09:55:28.833637181 +0000 UTC m=+0.057832144 container kill 1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f958be9-2a71-46b8-be29-bf69d602dea7, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 01 09:55:29 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:29Z|00148|binding|INFO|Releasing lport eaa11732-01f9-489d-9b7f-3f8e2175bbb2 from this chassis (sb_readonly=0)
Feb 01 09:55:29 np0005604215.localdomain kernel: device tapeaa11732-01 left promiscuous mode
Feb 01 09:55:29 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:29Z|00149|binding|INFO|Setting lport eaa11732-01f9-489d-9b7f-3f8e2175bbb2 down in Southbound
Feb 01 09:55:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:29.053 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:29 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:29.065 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-0f958be9-2a71-46b8-be29-bf69d602dea7', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f958be9-2a71-46b8-be29-bf69d602dea7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=27418984-2ed5-4c42-a2ac-15243821a950, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=eaa11732-01f9-489d-9b7f-3f8e2175bbb2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:55:29 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:29.067 158655 INFO neutron.agent.ovn.metadata.agent [-] Port eaa11732-01f9-489d-9b7f-3f8e2175bbb2 in datapath 0f958be9-2a71-46b8-be29-bf69d602dea7 unbound from our chassis
Feb 01 09:55:29 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:29.069 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0f958be9-2a71-46b8-be29-bf69d602dea7 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:55:29 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:29.070 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[7242f089-db43-4182-99ac-3092a5ea36bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:55:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:29.071 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:29 np0005604215.localdomain ceph-mon[298604]: osdmap e126: 6 total, 6 up, 6 in
Feb 01 09:55:29 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:29.625 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:55:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:29.626 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:29 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:29.627 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:55:29 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:55:29 np0005604215.localdomain systemd[1]: tmp-crun.fzz6db.mount: Deactivated successfully.
Feb 01 09:55:29 np0005604215.localdomain podman[309227]: 2026-02-01 09:55:29.874748979 +0000 UTC m=+0.084254392 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 01 09:55:29 np0005604215.localdomain podman[309227]: 2026-02-01 09:55:29.885182142 +0000 UTC m=+0.094687595 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Feb 01 09:55:29 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:55:30 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:30.001 2 INFO neutron.agent.securitygroups_rpc [None req-83329cbf-bffb-48da-a04c-50fb44fb93a0 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:55:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:55:30 np0005604215.localdomain dnsmasq[309109]: exiting on receipt of SIGTERM
Feb 01 09:55:30 np0005604215.localdomain systemd[1]: tmp-crun.Z4xu1g.mount: Deactivated successfully.
Feb 01 09:55:30 np0005604215.localdomain systemd[1]: libpod-1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4.scope: Deactivated successfully.
Feb 01 09:55:30 np0005604215.localdomain podman[309262]: 2026-02-01 09:55:30.035451779 +0000 UTC m=+0.063885741 container kill 1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f958be9-2a71-46b8-be29-bf69d602dea7, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 01 09:55:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:55:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157173 "" "Go-http-client/1.1"
Feb 01 09:55:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e126 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:30 np0005604215.localdomain podman[236852]: 2026-02-01 09:55:30.122170406 +0000 UTC m=+1786.269114419 container died 1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f958be9-2a71-46b8-be29-bf69d602dea7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:55:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:55:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18657 "" "Go-http-client/1.1"
Feb 01 09:55:30 np0005604215.localdomain podman[309274]: 2026-02-01 09:55:30.183883958 +0000 UTC m=+0.133617401 container cleanup 1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f958be9-2a71-46b8-be29-bf69d602dea7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:55:30 np0005604215.localdomain systemd[1]: libpod-conmon-1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4.scope: Deactivated successfully.
Feb 01 09:55:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v223: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 1023 B/s wr, 1 op/s
Feb 01 09:55:30 np0005604215.localdomain podman[309276]: 2026-02-01 09:55:30.236201128 +0000 UTC m=+0.179797971 container remove 1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0f958be9-2a71-46b8-be29-bf69d602dea7, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:55:30 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:30.261 259225 INFO neutron.agent.dhcp.agent [None req-adaa159b-a0b4-4e1e-93aa-04402ed92f9a - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:30 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:30.415 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:30 np0005604215.localdomain ceph-mon[298604]: pgmap v223: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 1023 B/s wr, 1 op/s
Feb 01 09:55:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e127 e127: 6 total, 6 up, 6 in
Feb 01 09:55:30 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:30.648 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:30 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:30.831 2 INFO neutron.agent.securitygroups_rpc [None req-32692fbf-89c1-4916-95dd-247e36fdbe6a e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-368071a3dbed367c88ceb2ce8433e3b17be46f6454c6658be28f451008d9eca3-merged.mount: Deactivated successfully.
Feb 01 09:55:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1667d668f9d4040ea369c9fa887e4d24551b66133bc87d19b634731f42c30bd4-userdata-shm.mount: Deactivated successfully.
Feb 01 09:55:30 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d0f958be9\x2d2a71\x2d46b8\x2dbe29\x2dbf69d602dea7.mount: Deactivated successfully.
Feb 01 09:55:30 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:30.972 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:31 np0005604215.localdomain ceph-mon[298604]: osdmap e127: 6 total, 6 up, 6 in
Feb 01 09:55:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:55:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:55:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:55:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:55:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:55:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:55:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e128 e128: 6 total, 6 up, 6 in
Feb 01 09:55:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:31.711 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v226: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 525 B/s rd, 1.0 KiB/s wr, 2 op/s
Feb 01 09:55:32 np0005604215.localdomain ceph-mon[298604]: osdmap e128: 6 total, 6 up, 6 in
Feb 01 09:55:32 np0005604215.localdomain ceph-mon[298604]: pgmap v226: 177 pgs: 177 active+clean; 145 MiB data, 755 MiB used, 41 GiB / 42 GiB avail; 525 B/s rd, 1.0 KiB/s wr, 2 op/s
Feb 01 09:55:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:55:32 np0005604215.localdomain podman[309303]: 2026-02-01 09:55:32.851500584 +0000 UTC m=+0.068479024 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:55:32 np0005604215.localdomain podman[309303]: 2026-02-01 09:55:32.864684323 +0000 UTC m=+0.081662703 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:55:32 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:55:33 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3011359184' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:55:33 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3011359184' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:55:33 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:33.929 2 INFO neutron.agent.securitygroups_rpc [None req-d273fe1f-00e9-4cf8-ba92-3b038868502e e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v227: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 1.7 KiB/s wr, 48 op/s
Feb 01 09:55:34 np0005604215.localdomain sudo[309326]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:55:34 np0005604215.localdomain sudo[309326]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:55:34 np0005604215.localdomain ceph-mon[298604]: pgmap v227: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 1.7 KiB/s wr, 48 op/s
Feb 01 09:55:34 np0005604215.localdomain sudo[309326]: pam_unix(sudo:session): session closed for user root
Feb 01 09:55:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:55:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3636105754' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:55:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:55:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3636105754' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:55:34 np0005604215.localdomain sudo[309344]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 09:55:34 np0005604215.localdomain sudo[309344]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:55:34 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:34.990 2 INFO neutron.agent.securitygroups_rpc [None req-cdd29e27-3485-4128-94e1-06734434ccf5 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:35 np0005604215.localdomain sudo[309344]: pam_unix(sudo:session): session closed for user root
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e128 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:55:35 np0005604215.localdomain sudo[309384]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:55:35 np0005604215.localdomain sudo[309384]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:55:35 np0005604215.localdomain sudo[309384]: pam_unix(sudo:session): session closed for user root
Feb 01 09:55:35 np0005604215.localdomain sudo[309402]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:55:35 np0005604215.localdomain sudo[309402]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3636105754' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3636105754' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:55:35 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:55:35 np0005604215.localdomain sudo[309402]: pam_unix(sudo:session): session closed for user root
Feb 01 09:55:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:35.974 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:55:36 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 88a42565-8b01-4ae4-85fa-58e794a8e3e8 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:55:36 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 88a42565-8b01-4ae4-85fa-58e794a8e3e8 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:55:36 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 88a42565-8b01-4ae4-85fa-58e794a8e3e8 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:55:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v228: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 1.3 KiB/s wr, 37 op/s
Feb 01 09:55:36 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:36.312 2 INFO neutron.agent.securitygroups_rpc [None req-0cc22e1e-f882-4556-8673-0cf2b002c102 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:36 np0005604215.localdomain sudo[309451]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:55:36 np0005604215.localdomain sudo[309451]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:55:36 np0005604215.localdomain sudo[309451]: pam_unix(sudo:session): session closed for user root
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e129 e129: 6 total, 6 up, 6 in
Feb 01 09:55:36 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: pgmap v228: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 1.3 KiB/s wr, 37 op/s
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: osdmap e129: 6 total, 6 up, 6 in
Feb 01 09:55:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:55:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:36.716 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:37 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:37.288 2 INFO neutron.agent.securitygroups_rpc [None req-9031f569-eff7-411d-8454-e1e2bf358206 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:37 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:37.650 2 INFO neutron.agent.securitygroups_rpc [None req-c13e4698-12de-4fa4-84de-f7194e33c853 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v230: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 1.7 KiB/s wr, 45 op/s
Feb 01 09:55:38 np0005604215.localdomain ceph-mon[298604]: pgmap v230: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 34 KiB/s rd, 1.7 KiB/s wr, 45 op/s
Feb 01 09:55:38 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:38.387 2 INFO neutron.agent.securitygroups_rpc [None req-c43d2a56-e16d-478a-abbd-9e2bb456c208 d96cff636365480c93dc8d1f3e16c531 272972c8d99e4a5c99e73e4bdb72346d - - default default] Security group rule updated ['56a3691b-0dfa-477a-aaac-6fc6d2066735']
Feb 01 09:55:38 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:38.536 2 INFO neutron.agent.securitygroups_rpc [None req-e32d5a82-c050-454c-9bc9-fb7e97cffc23 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:38 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:38.677 2 INFO neutron.agent.securitygroups_rpc [None req-b42ac328-eb3f-4f32-8c68-ad479176a68e e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:39 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:39.502 2 INFO neutron.agent.securitygroups_rpc [None req-89d33254-82ae-4671-bea8-643bd9d50212 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:39 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:39.630 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:55:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v231: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 1.5 KiB/s wr, 40 op/s
Feb 01 09:55:40 np0005604215.localdomain ceph-mon[298604]: pgmap v231: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 1.5 KiB/s wr, 40 op/s
Feb 01 09:55:40 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:40.956 2 INFO neutron.agent.securitygroups_rpc [None req-f70d179e-59ca-44a3-8018-26f7c96f2b8c 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:40.976 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:41 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:41.642 2 INFO neutron.agent.securitygroups_rpc [None req-3276d6b5-a480-4f28-8680-1993dd5ca124 565c83edf044493d9bd1199ad90d627d 28f88f2edf4c492c9754208b1c502849 - - default default] Security group member updated ['f98fef45-df22-4656-9ceb-98910abc5fa5']
Feb 01 09:55:41 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:41.657 259225 INFO neutron.agent.linux.ip_lib [None req-22ee0ef7-2461-4635-aaab-6107be9deebd - - - - - -] Device tap750c32c9-1c cannot be used as it has no MAC address
Feb 01 09:55:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:41.678 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:41 np0005604215.localdomain kernel: device tap750c32c9-1c entered promiscuous mode
Feb 01 09:55:41 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939741.6865] manager: (tap750c32c9-1c): new Generic device (/org/freedesktop/NetworkManager/Devices/30)
Feb 01 09:55:41 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:41Z|00150|binding|INFO|Claiming lport 750c32c9-1ccc-42ba-84bc-e13c95225798 for this chassis.
Feb 01 09:55:41 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:41Z|00151|binding|INFO|750c32c9-1ccc-42ba-84bc-e13c95225798: Claiming unknown
Feb 01 09:55:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:41.690 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:41 np0005604215.localdomain systemd-udevd[309479]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:55:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:41.705 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-ae16cdd8-4ef0-4acb-9779-9431fa50e220', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae16cdd8-4ef0-4acb-9779-9431fa50e220', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f88f2edf4c492c9754208b1c502849', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea5af0e4-f5ed-413c-862a-945a06818c24, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=750c32c9-1ccc-42ba-84bc-e13c95225798) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:55:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:41.707 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 750c32c9-1ccc-42ba-84bc-e13c95225798 in datapath ae16cdd8-4ef0-4acb-9779-9431fa50e220 bound to our chassis
Feb 01 09:55:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:41.711 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ae16cdd8-4ef0-4acb-9779-9431fa50e220 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:55:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:41.712 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[3c0fd155-15ee-4687-9d84-24a1587a0a9d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:55:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:41.733 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:41 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:41Z|00152|binding|INFO|Setting lport 750c32c9-1ccc-42ba-84bc-e13c95225798 ovn-installed in OVS
Feb 01 09:55:41 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:41Z|00153|binding|INFO|Setting lport 750c32c9-1ccc-42ba-84bc-e13c95225798 up in Southbound
Feb 01 09:55:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:41.738 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:41.772 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:41.773 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:55:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:41.774 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:55:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:41.774 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:55:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:41.803 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:41 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:41.856 2 INFO neutron.agent.securitygroups_rpc [None req-739b236d-6306-45a2-92f5-3e504f993767 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:42 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:42.046 2 INFO neutron.agent.securitygroups_rpc [None req-3c4d812b-8cb5-4a19-9709-fc562d1570e9 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v232: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 1.3 KiB/s wr, 35 op/s
Feb 01 09:55:42 np0005604215.localdomain podman[309534]: 2026-02-01 09:55:42.576377759 +0000 UTC m=+0.087898064 container create 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 01 09:55:42 np0005604215.localdomain ceph-mon[298604]: pgmap v232: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 26 KiB/s rd, 1.3 KiB/s wr, 35 op/s
Feb 01 09:55:42 np0005604215.localdomain systemd[1]: Started libpod-conmon-36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5.scope.
Feb 01 09:55:42 np0005604215.localdomain podman[309534]: 2026-02-01 09:55:42.534561194 +0000 UTC m=+0.046081539 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:55:42 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:55:42 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e7ba26053806f3553eef62c358afce0b2364e6978b21be5bb236b0fdebaf3c20/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:55:42 np0005604215.localdomain podman[309534]: 2026-02-01 09:55:42.653998135 +0000 UTC m=+0.165518440 container init 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:55:42 np0005604215.localdomain podman[309534]: 2026-02-01 09:55:42.664280114 +0000 UTC m=+0.175800419 container start 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 01 09:55:42 np0005604215.localdomain dnsmasq[309552]: started, version 2.85 cachesize 150
Feb 01 09:55:42 np0005604215.localdomain dnsmasq[309552]: DNS service limited to local subnets
Feb 01 09:55:42 np0005604215.localdomain dnsmasq[309552]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:55:42 np0005604215.localdomain dnsmasq[309552]: warning: no upstream servers configured
Feb 01 09:55:42 np0005604215.localdomain dnsmasq-dhcp[309552]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 01 09:55:42 np0005604215.localdomain dnsmasq[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/addn_hosts - 0 addresses
Feb 01 09:55:42 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/host
Feb 01 09:55:42 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/opts
Feb 01 09:55:42 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:42.720 259225 INFO neutron.agent.dhcp.agent [None req-22ee0ef7-2461-4635-aaab-6107be9deebd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:41Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032bbdc40>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032b74640>], id=bf862379-338a-4125-9b52-b08c60b25ce1, ip_allocation=immediate, mac_address=fa:16:3e:61:21:58, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1067819702, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:37Z, description=, dns_domain=, id=ae16cdd8-4ef0-4acb-9779-9431fa50e220, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1849249413, port_security_enabled=True, project_id=28f88f2edf4c492c9754208b1c502849, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=51522, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1700, status=ACTIVE, subnets=['7b5b389a-883e-47a4-b850-997574034dd2'], tags=[], tenant_id=28f88f2edf4c492c9754208b1c502849, updated_at=2026-02-01T09:55:40Z, vlan_transparent=None, network_id=ae16cdd8-4ef0-4acb-9779-9431fa50e220, port_security_enabled=True, project_id=28f88f2edf4c492c9754208b1c502849, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f98fef45-df22-4656-9ceb-98910abc5fa5'], standard_attr_id=1721, status=DOWN, tags=[], tenant_id=28f88f2edf4c492c9754208b1c502849, updated_at=2026-02-01T09:55:41Z on network ae16cdd8-4ef0-4acb-9779-9431fa50e220
Feb 01 09:55:42 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:42.770 259225 INFO neutron.agent.dhcp.agent [None req-9b40e706-78fc-4a5e-a99d-f8d1b753b333 - - - - - -] DHCP configuration for ports {'ec2100b4-db1f-4d32-9011-61922e9925f7'} is completed
Feb 01 09:55:42 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:42.893 2 INFO neutron.agent.securitygroups_rpc [None req-201a996d-ad5d-4320-ba39-e8954e21d5dc 565c83edf044493d9bd1199ad90d627d 28f88f2edf4c492c9754208b1c502849 - - default default] Security group member updated ['f98fef45-df22-4656-9ceb-98910abc5fa5']
Feb 01 09:55:42 np0005604215.localdomain dnsmasq[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/addn_hosts - 1 addresses
Feb 01 09:55:42 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/host
Feb 01 09:55:42 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/opts
Feb 01 09:55:42 np0005604215.localdomain podman[309572]: 2026-02-01 09:55:42.912457373 +0000 UTC m=+0.058431612 container kill 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 01 09:55:43 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:43.054 259225 INFO neutron.agent.dhcp.agent [None req-22ee0ef7-2461-4635-aaab-6107be9deebd - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:42Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003234ad00>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003234a250>, <neutron.agent.linux.dhcp.DictModel object at 0x7f003234a880>, <neutron.agent.linux.dhcp.DictModel object at 0x7f003234aeb0>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003234a7f0>], id=9196554d-ed55-40d1-9612-8e12d76f3b7c, ip_allocation=immediate, mac_address=fa:16:3e:2f:f0:08, name=tempest-ExtraDHCPOptionsIpV6TestJSON-1863404296, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:37Z, description=, dns_domain=, id=ae16cdd8-4ef0-4acb-9779-9431fa50e220, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-ExtraDHCPOptionsIpV6TestJSON-test-network-1849249413, port_security_enabled=True, project_id=28f88f2edf4c492c9754208b1c502849, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=51522, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1700, status=ACTIVE, subnets=['7b5b389a-883e-47a4-b850-997574034dd2'], tags=[], tenant_id=28f88f2edf4c492c9754208b1c502849, updated_at=2026-02-01T09:55:40Z, vlan_transparent=None, network_id=ae16cdd8-4ef0-4acb-9779-9431fa50e220, port_security_enabled=True, project_id=28f88f2edf4c492c9754208b1c502849, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['f98fef45-df22-4656-9ceb-98910abc5fa5'], standard_attr_id=1737, status=DOWN, tags=[], tenant_id=28f88f2edf4c492c9754208b1c502849, updated_at=2026-02-01T09:55:42Z on network ae16cdd8-4ef0-4acb-9779-9431fa50e220
Feb 01 09:55:43 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:43.072 259225 INFO neutron.agent.linux.dhcp [None req-22ee0ef7-2461-4635-aaab-6107be9deebd - - - - - -] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Feb 01 09:55:43 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:43.073 259225 INFO neutron.agent.linux.dhcp [None req-22ee0ef7-2461-4635-aaab-6107be9deebd - - - - - -] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Feb 01 09:55:43 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:43.073 259225 INFO neutron.agent.linux.dhcp [None req-22ee0ef7-2461-4635-aaab-6107be9deebd - - - - - -] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Feb 01 09:55:43 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:43.127 259225 INFO neutron.agent.dhcp.agent [None req-eb24e3e8-62ca-4c98-9b13-8d750a21ca79 - - - - - -] DHCP configuration for ports {'bf862379-338a-4125-9b52-b08c60b25ce1'} is completed
Feb 01 09:55:43 np0005604215.localdomain dnsmasq[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/addn_hosts - 2 addresses
Feb 01 09:55:43 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/host
Feb 01 09:55:43 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/opts
Feb 01 09:55:43 np0005604215.localdomain podman[309613]: 2026-02-01 09:55:43.241082986 +0000 UTC m=+0.055933294 container kill 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 01 09:55:43 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:43.386 2 INFO neutron.agent.securitygroups_rpc [None req-37d62ac1-fed3-4baa-9fc5-73880f6c1760 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:43 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:43.449 2 INFO neutron.agent.securitygroups_rpc [None req-3fe18ee3-04ff-4a52-96de-1f4dd720f476 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:43 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:43.507 259225 INFO neutron.agent.dhcp.agent [None req-1b1afa22-388c-4f3a-a459-6c5728e1c876 - - - - - -] DHCP configuration for ports {'9196554d-ed55-40d1-9612-8e12d76f3b7c'} is completed
Feb 01 09:55:43 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:43.912 2 INFO neutron.agent.securitygroups_rpc [None req-14278f14-d261-4185-9918-5bcd6670f17f 565c83edf044493d9bd1199ad90d627d 28f88f2edf4c492c9754208b1c502849 - - default default] Security group member updated ['f98fef45-df22-4656-9ceb-98910abc5fa5']
Feb 01 09:55:44 np0005604215.localdomain podman[309649]: 2026-02-01 09:55:44.152182566 +0000 UTC m=+0.061576748 container kill 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 01 09:55:44 np0005604215.localdomain dnsmasq[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/addn_hosts - 1 addresses
Feb 01 09:55:44 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/host
Feb 01 09:55:44 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/opts
Feb 01 09:55:44 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:44.171 2 INFO neutron.agent.securitygroups_rpc [None req-edf8cab4-6686-4ad5-95d6-283face1fc91 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v233: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 3.9 KiB/s rd, 307 B/s wr, 5 op/s
Feb 01 09:55:44 np0005604215.localdomain ceph-mon[298604]: pgmap v233: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 3.9 KiB/s rd, 307 B/s wr, 5 op/s
Feb 01 09:55:44 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:44.405 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:55:41Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00334c2580>], dns_domain=, dns_name=, extra_dhcp_opts=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322d2c70>, <neutron.agent.linux.dhcp.DictModel object at 0x7f00322d27f0>, <neutron.agent.linux.dhcp.DictModel object at 0x7f00322d2fd0>], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322d2ee0>], id=bf862379-338a-4125-9b52-b08c60b25ce1, ip_allocation=immediate, mac_address=fa:16:3e:61:21:58, name=tempest-new-port-name-875240783, network_id=ae16cdd8-4ef0-4acb-9779-9431fa50e220, port_security_enabled=True, project_id=28f88f2edf4c492c9754208b1c502849, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['f98fef45-df22-4656-9ceb-98910abc5fa5'], standard_attr_id=1721, status=DOWN, tags=[], tenant_id=28f88f2edf4c492c9754208b1c502849, updated_at=2026-02-01T09:55:44Z on network ae16cdd8-4ef0-4acb-9779-9431fa50e220
Feb 01 09:55:44 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:44.420 259225 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option server-ip-address because it's ip_version 4 is not in port's address IP versions
Feb 01 09:55:44 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:44.421 259225 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option bootfile-name because it's ip_version 4 is not in port's address IP versions
Feb 01 09:55:44 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:44.421 259225 INFO neutron.agent.linux.dhcp [-] Cannot apply dhcp option tftp-server because it's ip_version 4 is not in port's address IP versions
Feb 01 09:55:44 np0005604215.localdomain dnsmasq[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/addn_hosts - 1 addresses
Feb 01 09:55:44 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/host
Feb 01 09:55:44 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/opts
Feb 01 09:55:44 np0005604215.localdomain podman[309687]: 2026-02-01 09:55:44.600401185 +0000 UTC m=+0.065185011 container kill 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 01 09:55:44 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:44.811 2 INFO neutron.agent.securitygroups_rpc [None req-6cb6b6f1-5b7d-4172-a864-3d0396afa917 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:44 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:44.986 259225 INFO neutron.agent.dhcp.agent [None req-ca729419-63e1-4987-ad10-810dca2406a3 - - - - - -] DHCP configuration for ports {'bf862379-338a-4125-9b52-b08c60b25ce1'} is completed
Feb 01 09:55:45 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:45.045 2 INFO neutron.agent.securitygroups_rpc [None req-889331f7-1b8f-45c7-9568-2acc0f065d63 565c83edf044493d9bd1199ad90d627d 28f88f2edf4c492c9754208b1c502849 - - default default] Security group member updated ['f98fef45-df22-4656-9ceb-98910abc5fa5']
Feb 01 09:55:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:45 np0005604215.localdomain systemd[1]: tmp-crun.oMQ2lI.mount: Deactivated successfully.
Feb 01 09:55:45 np0005604215.localdomain dnsmasq[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/addn_hosts - 0 addresses
Feb 01 09:55:45 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/host
Feb 01 09:55:45 np0005604215.localdomain podman[309724]: 2026-02-01 09:55:45.244575005 +0000 UTC m=+0.068422782 container kill 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:55:45 np0005604215.localdomain dnsmasq-dhcp[309552]: read /var/lib/neutron/dhcp/ae16cdd8-4ef0-4acb-9779-9431fa50e220/opts
Feb 01 09:55:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:45.978 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:46 np0005604215.localdomain dnsmasq[309552]: exiting on receipt of SIGTERM
Feb 01 09:55:46 np0005604215.localdomain podman[309761]: 2026-02-01 09:55:46.111421274 +0000 UTC m=+0.048927607 container kill 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:55:46 np0005604215.localdomain systemd[1]: libpod-36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5.scope: Deactivated successfully.
Feb 01 09:55:46 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:46.168 2 INFO neutron.agent.securitygroups_rpc [None req-04f416d8-8fa2-4799-adf0-ff612b0eb9e5 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:46 np0005604215.localdomain podman[309775]: 2026-02-01 09:55:46.179317717 +0000 UTC m=+0.051566478 container died 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:55:46 np0005604215.localdomain systemd[1]: tmp-crun.PNKpGg.mount: Deactivated successfully.
Feb 01 09:55:46 np0005604215.localdomain podman[309775]: 2026-02-01 09:55:46.215571741 +0000 UTC m=+0.087820472 container cleanup 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:55:46 np0005604215.localdomain systemd[1]: libpod-conmon-36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5.scope: Deactivated successfully.
Feb 01 09:55:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v234: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 3.9 KiB/s rd, 307 B/s wr, 5 op/s
Feb 01 09:55:46 np0005604215.localdomain podman[309776]: 2026-02-01 09:55:46.255751766 +0000 UTC m=+0.124138558 container remove 36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ae16cdd8-4ef0-4acb-9779-9431fa50e220, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:55:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:46.307 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:46 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:46Z|00154|binding|INFO|Releasing lport 750c32c9-1ccc-42ba-84bc-e13c95225798 from this chassis (sb_readonly=0)
Feb 01 09:55:46 np0005604215.localdomain kernel: device tap750c32c9-1c left promiscuous mode
Feb 01 09:55:46 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:55:46Z|00155|binding|INFO|Setting lport 750c32c9-1ccc-42ba-84bc-e13c95225798 down in Southbound
Feb 01 09:55:46 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:46.318 2 INFO neutron.agent.securitygroups_rpc [None req-31a351f9-c21a-4778-8d23-ceef24052c50 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:46 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:46.323 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-ae16cdd8-4ef0-4acb-9779-9431fa50e220', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ae16cdd8-4ef0-4acb-9779-9431fa50e220', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '28f88f2edf4c492c9754208b1c502849', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ea5af0e4-f5ed-413c-862a-945a06818c24, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=750c32c9-1ccc-42ba-84bc-e13c95225798) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:55:46 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:46.324 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 750c32c9-1ccc-42ba-84bc-e13c95225798 in datapath ae16cdd8-4ef0-4acb-9779-9431fa50e220 unbound from our chassis
Feb 01 09:55:46 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:46.327 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ae16cdd8-4ef0-4acb-9779-9431fa50e220 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:55:46 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:55:46.328 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[f4210da8-d272-4342-851e-a09ed3076ef8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:55:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:46.328 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:46 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:46.359 259225 INFO neutron.agent.dhcp.agent [None req-718b6805-6850-49b5-b355-503fd866ab19 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:46 np0005604215.localdomain ceph-mon[298604]: pgmap v234: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 3.9 KiB/s rd, 307 B/s wr, 5 op/s
Feb 01 09:55:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:46.738 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:46 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:55:46.745 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:55:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:47.091 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:47.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:47.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:55:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:47.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:55:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e7ba26053806f3553eef62c358afce0b2364e6978b21be5bb236b0fdebaf3c20-merged.mount: Deactivated successfully.
Feb 01 09:55:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-36dda2713cf22e6c85648690c9fbdbfab9d85ce4c33fddceef5ec94474b864b5-userdata-shm.mount: Deactivated successfully.
Feb 01 09:55:47 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2dae16cdd8\x2d4ef0\x2d4acb\x2d9779\x2d9431fa50e220.mount: Deactivated successfully.
Feb 01 09:55:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:47.122 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:55:47 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:47.569 2 INFO neutron.agent.securitygroups_rpc [None req-33ff6921-4704-427a-80ac-43e95e4fc8cf 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:47 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:47.860 2 INFO neutron.agent.securitygroups_rpc [None req-d0c90d7d-7396-4a27-b056-110260352268 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v235: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 3.3 KiB/s rd, 263 B/s wr, 4 op/s
Feb 01 09:55:48 np0005604215.localdomain ceph-mon[298604]: pgmap v235: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 3.3 KiB/s rd, 263 B/s wr, 4 op/s
Feb 01 09:55:48 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:48.671 2 INFO neutron.agent.securitygroups_rpc [None req-08f39597-7736-4b6c-bf06-18c33436307c 6febfd614c0f4e5bbcdad7acfe861496 6419fd8b712b467ea6e03df22d411fcf - - default default] Security group member updated ['4a3b0332-f824-4e4c-b1eb-cf09581851da']
Feb 01 09:55:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:55:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:55:48 np0005604215.localdomain systemd[1]: tmp-crun.HngtPp.mount: Deactivated successfully.
Feb 01 09:55:48 np0005604215.localdomain podman[309803]: 2026-02-01 09:55:48.872481716 +0000 UTC m=+0.088773632 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible)
Feb 01 09:55:48 np0005604215.localdomain podman[309804]: 2026-02-01 09:55:48.917102448 +0000 UTC m=+0.129152742 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:55:48 np0005604215.localdomain podman[309803]: 2026-02-01 09:55:48.935710944 +0000 UTC m=+0.152002840 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 01 09:55:48 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:55:48 np0005604215.localdomain podman[309804]: 2026-02-01 09:55:48.952234196 +0000 UTC m=+0.164284460 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:55:48 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:55:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:50.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:50.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:50.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 01 09:55:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v236: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:50 np0005604215.localdomain ceph-mon[298604]: pgmap v236: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:51.012 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:51 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:51.315 2 INFO neutron.agent.securitygroups_rpc [None req-a98fbbc3-cb77-458d-bae0-4950f59446e4 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:55:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:55:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:55:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:55:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:55:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:55:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:51.741 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:52 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:52.140 2 INFO neutron.agent.securitygroups_rpc [None req-0dd8b6b5-2ecb-4256-a677-3e4c95ec3623 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v237: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.259 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.260 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.260 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.282 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.282 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.282 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.283 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.283 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:55:52 np0005604215.localdomain ceph-mon[298604]: pgmap v237: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:55:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1153971238' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.748 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.464s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:55:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:55:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:55:52 np0005604215.localdomain podman[309871]: 2026-02-01 09:55:52.864037814 +0000 UTC m=+0.079311259 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, release=1769056855, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=9.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git)
Feb 01 09:55:52 np0005604215.localdomain podman[309871]: 2026-02-01 09:55:52.874666723 +0000 UTC m=+0.089940188 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, architecture=x86_64, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter)
Feb 01 09:55:52 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:55:52 np0005604215.localdomain podman[309872]: 2026-02-01 09:55:52.876473349 +0000 UTC m=+0.088220194 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.933 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.934 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11650MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.935 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:55:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:52.935 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:55:52 np0005604215.localdomain podman[309872]: 2026-02-01 09:55:52.956699395 +0000 UTC m=+0.168446220 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:55:52 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:55:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:53.224 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:55:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:53.225 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:55:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:53.483 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:55:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:55:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/382175384' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:55:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1153971238' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:55:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/382175384' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:55:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:55:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2705022701' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:55:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:53.930 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.447s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:55:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:53.936 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:55:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:53.957 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:55:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:53.985 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:55:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:53.985 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.050s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:55:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v238: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:54 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:54.442 2 INFO neutron.agent.securitygroups_rpc [None req-e7867791-cd65-411f-8ed6-a1d97d2d0b42 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2705022701' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:55:54 np0005604215.localdomain ceph-mon[298604]: pgmap v238: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:54.826 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:54.826 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:54.827 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:54.827 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:55:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:55:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2045844312' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:55:55 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:55:55.811 2 INFO neutron.agent.securitygroups_rpc [None req-58b746b2-1860-41ad-b399-3d8dcfe6ba21 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:55:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:56.069 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:56.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v239: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:56 np0005604215.localdomain ceph-mon[298604]: pgmap v239: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:55:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:56.742 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:55:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:57.117 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3319678924' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:55:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:55:58.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:55:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v240: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 341 B/s wr, 0 op/s
Feb 01 09:55:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3775181733' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:55:58 np0005604215.localdomain ceph-mon[298604]: pgmap v240: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 341 B/s wr, 0 op/s
Feb 01 09:56:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:56:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:56:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:56:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 09:56:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:56:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18302 "" "Go-http-client/1.1"
Feb 01 09:56:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e129 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v241: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 341 B/s wr, 0 op/s
Feb 01 09:56:00 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:56:00 np0005604215.localdomain podman[309930]: 2026-02-01 09:56:00.854391476 +0000 UTC m=+0.069963559 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ceilometer_agent_compute)
Feb 01 09:56:00 np0005604215.localdomain podman[309930]: 2026-02-01 09:56:00.868700569 +0000 UTC m=+0.084272722 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 01 09:56:00 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:56:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:01.070 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:01 np0005604215.localdomain ceph-mon[298604]: pgmap v241: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 341 B/s wr, 0 op/s
Feb 01 09:56:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/4057334485' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:56:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e130 e130: 6 total, 6 up, 6 in
Feb 01 09:56:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:56:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:56:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:56:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:56:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:01.747 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v243: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 409 B/s wr, 1 op/s
Feb 01 09:56:02 np0005604215.localdomain ceph-mon[298604]: osdmap e130: 6 total, 6 up, 6 in
Feb 01 09:56:02 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e131 e131: 6 total, 6 up, 6 in
Feb 01 09:56:02 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:02.680 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:56:03 np0005604215.localdomain ceph-mon[298604]: pgmap v243: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 409 B/s wr, 1 op/s
Feb 01 09:56:03 np0005604215.localdomain ceph-mon[298604]: osdmap e131: 6 total, 6 up, 6 in
Feb 01 09:56:03 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:03.364 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:56:03 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e132 e132: 6 total, 6 up, 6 in
Feb 01 09:56:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:56:03 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:03.824 259225 INFO neutron.agent.linux.ip_lib [None req-3140bc6e-3c00-4049-bd5d-7eceb2ee1ff1 - - - - - -] Device tap71466265-5f cannot be used as it has no MAC address
Feb 01 09:56:03 np0005604215.localdomain podman[309950]: 2026-02-01 09:56:03.847822748 +0000 UTC m=+0.094376965 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:56:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:03.857 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:03 np0005604215.localdomain podman[309950]: 2026-02-01 09:56:03.858475648 +0000 UTC m=+0.105029835 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:56:03 np0005604215.localdomain kernel: device tap71466265-5f entered promiscuous mode
Feb 01 09:56:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:03.865 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:03Z|00156|binding|INFO|Claiming lport 71466265-5f83-483a-a896-41c28a392e73 for this chassis.
Feb 01 09:56:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:03Z|00157|binding|INFO|71466265-5f83-483a-a896-41c28a392e73: Claiming unknown
Feb 01 09:56:03 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939763.8683] manager: (tap71466265-5f): new Generic device (/org/freedesktop/NetworkManager/Devices/31)
Feb 01 09:56:03 np0005604215.localdomain systemd-udevd[309981]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:56:03 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:03.877 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-6d00e50d-ad20-4c3b-83fb-c0f039efd634', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d00e50d-ad20-4c3b-83fb-c0f039efd634', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd93c82b-505e-4fa1-935a-07bfb46ac2bf, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=71466265-5f83-483a-a896-41c28a392e73) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:03 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:03.879 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 71466265-5f83-483a-a896-41c28a392e73 in datapath 6d00e50d-ad20-4c3b-83fb-c0f039efd634 bound to our chassis
Feb 01 09:56:03 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:56:03 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:03.883 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6d00e50d-ad20-4c3b-83fb-c0f039efd634 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:56:03 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:03.884 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[f6030755-8968-4660-a849-553f20ca8d00]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:56:03 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap71466265-5f: No such device
Feb 01 09:56:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:03Z|00158|binding|INFO|Setting lport 71466265-5f83-483a-a896-41c28a392e73 ovn-installed in OVS
Feb 01 09:56:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:03Z|00159|binding|INFO|Setting lport 71466265-5f83-483a-a896-41c28a392e73 up in Southbound
Feb 01 09:56:03 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap71466265-5f: No such device
Feb 01 09:56:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:03.904 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:03 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap71466265-5f: No such device
Feb 01 09:56:03 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap71466265-5f: No such device
Feb 01 09:56:03 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap71466265-5f: No such device
Feb 01 09:56:03 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap71466265-5f: No such device
Feb 01 09:56:03 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap71466265-5f: No such device
Feb 01 09:56:03 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap71466265-5f: No such device
Feb 01 09:56:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:03.939 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:03.966 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v246: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 4.0 KiB/s rd, 511 B/s wr, 5 op/s
Feb 01 09:56:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e133 e133: 6 total, 6 up, 6 in
Feb 01 09:56:04 np0005604215.localdomain ceph-mon[298604]: osdmap e132: 6 total, 6 up, 6 in
Feb 01 09:56:04 np0005604215.localdomain ceph-mon[298604]: pgmap v246: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 4.0 KiB/s rd, 511 B/s wr, 5 op/s
Feb 01 09:56:04 np0005604215.localdomain podman[310052]: 2026-02-01 09:56:04.782727666 +0000 UTC m=+0.082834217 container create cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d00e50d-ad20-4c3b-83fb-c0f039efd634, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 01 09:56:04 np0005604215.localdomain systemd[1]: Started libpod-conmon-cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69.scope.
Feb 01 09:56:04 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:56:04 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/080106a4e4259907a6b8268b5326d948d1c07084ef37858b7872beeabb761334/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:56:04 np0005604215.localdomain podman[310052]: 2026-02-01 09:56:04.743227722 +0000 UTC m=+0.043334273 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:56:04 np0005604215.localdomain podman[310052]: 2026-02-01 09:56:04.846834272 +0000 UTC m=+0.146940823 container init cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d00e50d-ad20-4c3b-83fb-c0f039efd634, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 01 09:56:04 np0005604215.localdomain podman[310052]: 2026-02-01 09:56:04.854878572 +0000 UTC m=+0.154985113 container start cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d00e50d-ad20-4c3b-83fb-c0f039efd634, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:56:04 np0005604215.localdomain dnsmasq[310071]: started, version 2.85 cachesize 150
Feb 01 09:56:04 np0005604215.localdomain dnsmasq[310071]: DNS service limited to local subnets
Feb 01 09:56:04 np0005604215.localdomain dnsmasq[310071]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:56:04 np0005604215.localdomain dnsmasq[310071]: warning: no upstream servers configured
Feb 01 09:56:04 np0005604215.localdomain dnsmasq-dhcp[310071]: DHCPv6, static leases only on 2001:db8:2::, lease time 1d
Feb 01 09:56:04 np0005604215.localdomain dnsmasq[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/addn_hosts - 0 addresses
Feb 01 09:56:04 np0005604215.localdomain dnsmasq-dhcp[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/host
Feb 01 09:56:04 np0005604215.localdomain dnsmasq-dhcp[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/opts
Feb 01 09:56:04 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:04.918 259225 INFO neutron.agent.dhcp.agent [None req-3140bc6e-3c00-4049-bd5d-7eceb2ee1ff1 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:56:03Z, description=, device_id=ca4d5fd2-fcc5-4bbf-84e5-6e063f1f23d4, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032bafeb0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032bafb50>], id=87ed5c48-18a2-4a05-820b-da5952a8289d, ip_allocation=immediate, mac_address=fa:16:3e:33:ce:c8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:59Z, description=, dns_domain=, id=6d00e50d-ad20-4c3b-83fb-c0f039efd634, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-352168199, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21257, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1857, status=ACTIVE, subnets=['88123a8b-9c30-4a59-b1d6-4fd658119a87'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:56:01Z, vlan_transparent=None, network_id=6d00e50d-ad20-4c3b-83fb-c0f039efd634, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1876, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:56:03Z on network 6d00e50d-ad20-4c3b-83fb-c0f039efd634
Feb 01 09:56:05 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:05.095 259225 INFO neutron.agent.dhcp.agent [None req-312d8b52-72a4-4c64-9c9e-81992ee8e002 - - - - - -] DHCP configuration for ports {'915d9796-5daf-41ca-ab93-9109392896ab'} is completed
Feb 01 09:56:05 np0005604215.localdomain dnsmasq[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/addn_hosts - 1 addresses
Feb 01 09:56:05 np0005604215.localdomain podman[310088]: 2026-02-01 09:56:05.0978241 +0000 UTC m=+0.056443990 container kill cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d00e50d-ad20-4c3b-83fb-c0f039efd634, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 09:56:05 np0005604215.localdomain dnsmasq-dhcp[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/host
Feb 01 09:56:05 np0005604215.localdomain dnsmasq-dhcp[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/opts
Feb 01 09:56:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:05.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:56:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:05.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 01 09:56:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e133 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:05.129 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 01 09:56:05 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:05.237 2 INFO neutron.agent.securitygroups_rpc [None req-33d595e3-b3a6-4bc9-b70b-e120045130a2 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 01 09:56:05 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:05.340 259225 INFO neutron.agent.dhcp.agent [None req-9c5e2d71-9c53-492b-925b-55c362b4aa11 - - - - - -] DHCP configuration for ports {'87ed5c48-18a2-4a05-820b-da5952a8289d'} is completed
Feb 01 09:56:05 np0005604215.localdomain ceph-mon[298604]: osdmap e133: 6 total, 6 up, 6 in
Feb 01 09:56:05 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:05.438 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:56:03Z, description=, device_id=ca4d5fd2-fcc5-4bbf-84e5-6e063f1f23d4, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323351f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032335970>], id=87ed5c48-18a2-4a05-820b-da5952a8289d, ip_allocation=immediate, mac_address=fa:16:3e:33:ce:c8, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:55:59Z, description=, dns_domain=, id=6d00e50d-ad20-4c3b-83fb-c0f039efd634, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersIpV6Test-352168199, port_security_enabled=True, project_id=904cc8942364443bb4c4a4017bb1e647, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=21257, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=1857, status=ACTIVE, subnets=['88123a8b-9c30-4a59-b1d6-4fd658119a87'], tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:56:01Z, vlan_transparent=None, network_id=6d00e50d-ad20-4c3b-83fb-c0f039efd634, port_security_enabled=False, project_id=904cc8942364443bb4c4a4017bb1e647, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=1876, status=DOWN, tags=[], tenant_id=904cc8942364443bb4c4a4017bb1e647, updated_at=2026-02-01T09:56:03Z on network 6d00e50d-ad20-4c3b-83fb-c0f039efd634
Feb 01 09:56:05 np0005604215.localdomain dnsmasq[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/addn_hosts - 1 addresses
Feb 01 09:56:05 np0005604215.localdomain dnsmasq-dhcp[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/host
Feb 01 09:56:05 np0005604215.localdomain dnsmasq-dhcp[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/opts
Feb 01 09:56:05 np0005604215.localdomain podman[310126]: 2026-02-01 09:56:05.622657841 +0000 UTC m=+0.057748170 container kill cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d00e50d-ad20-4c3b-83fb-c0f039efd634, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:56:05 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:05.897 2 INFO neutron.agent.securitygroups_rpc [None req-4789720c-93ff-4d5d-a9e8-dc630a3e4cba 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 01 09:56:05 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:05.901 259225 INFO neutron.agent.dhcp.agent [None req-0067f83a-94ec-4f98-99b3-9b304250a23a - - - - - -] DHCP configuration for ports {'87ed5c48-18a2-4a05-820b-da5952a8289d'} is completed
Feb 01 09:56:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:05.983 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:06.072 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v248: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 4.9 KiB/s rd, 626 B/s wr, 6 op/s
Feb 01 09:56:06 np0005604215.localdomain ceph-mon[298604]: pgmap v248: 177 pgs: 1 active+clean+snaptrim, 176 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 4.9 KiB/s rd, 626 B/s wr, 6 op/s
Feb 01 09:56:06 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:06.527 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:56:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:06.748 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:07 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3348501364' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:56:07 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3348501364' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:56:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v249: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 4.2 KiB/s wr, 60 op/s
Feb 01 09:56:08 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 01 09:56:08 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2310334247' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:56:08 np0005604215.localdomain ceph-mon[298604]: pgmap v249: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 4.2 KiB/s wr, 60 op/s
Feb 01 09:56:08 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2310334247' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:56:09 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e134 e134: 6 total, 6 up, 6 in
Feb 01 09:56:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e134 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v251: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 3.2 KiB/s wr, 48 op/s
Feb 01 09:56:10 np0005604215.localdomain ceph-mon[298604]: osdmap e134: 6 total, 6 up, 6 in
Feb 01 09:56:10 np0005604215.localdomain ceph-mon[298604]: pgmap v251: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 3.2 KiB/s wr, 48 op/s
Feb 01 09:56:10 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:10.913 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:56:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:11.073 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:11 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e135 e135: 6 total, 6 up, 6 in
Feb 01 09:56:11 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3348857733' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:56:11 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3348857733' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:56:11 np0005604215.localdomain ceph-mon[298604]: osdmap e135: 6 total, 6 up, 6 in
Feb 01 09:56:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:11.749 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v253: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 2.8 KiB/s wr, 42 op/s
Feb 01 09:56:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:12.600 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:12 np0005604215.localdomain ceph-mon[298604]: pgmap v253: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 2.8 KiB/s wr, 42 op/s
Feb 01 09:56:12 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/878563889' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:56:12 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/878563889' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:56:13 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e136 e136: 6 total, 6 up, 6 in
Feb 01 09:56:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v255: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 2.8 KiB/s wr, 112 op/s
Feb 01 09:56:14 np0005604215.localdomain ceph-mon[298604]: osdmap e136: 6 total, 6 up, 6 in
Feb 01 09:56:14 np0005604215.localdomain ceph-mon[298604]: pgmap v255: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 2.8 KiB/s wr, 112 op/s
Feb 01 09:56:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e136 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e137 e137: 6 total, 6 up, 6 in
Feb 01 09:56:16 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:16.075 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v257: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 2.8 KiB/s wr, 112 op/s
Feb 01 09:56:16 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:16.400 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:56:16 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:56:16 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/455368239' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:56:16 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:56:16 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/455368239' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:56:16 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e138 e138: 6 total, 6 up, 6 in
Feb 01 09:56:16 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:16.759 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:16 np0005604215.localdomain ceph-mon[298604]: osdmap e137: 6 total, 6 up, 6 in
Feb 01 09:56:16 np0005604215.localdomain ceph-mon[298604]: pgmap v257: 177 pgs: 177 active+clean; 145 MiB data, 756 MiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 2.8 KiB/s wr, 112 op/s
Feb 01 09:56:16 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/455368239' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:56:16 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/455368239' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:56:16 np0005604215.localdomain ceph-mon[298604]: osdmap e138: 6 total, 6 up, 6 in
Feb 01 09:56:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v259: 177 pgs: 177 active+clean; 145 MiB data, 764 MiB used, 41 GiB / 42 GiB avail; 140 KiB/s rd, 5.2 KiB/s wr, 186 op/s
Feb 01 09:56:18 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:18.679 2 INFO neutron.agent.securitygroups_rpc [None req-f9943192-ce60-4425-aa06-00cabb160f7d 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 01 09:56:18 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:18.973 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:18 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:18.975 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 01 09:56:18 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:18.978 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:56:18 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:18.979 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[db80b759-b6b7-45d4-a231-472d5477d428]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:56:19 np0005604215.localdomain ceph-mon[298604]: pgmap v259: 177 pgs: 177 active+clean; 145 MiB data, 764 MiB used, 41 GiB / 42 GiB avail; 140 KiB/s rd, 5.2 KiB/s wr, 186 op/s
Feb 01 09:56:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:56:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:56:19 np0005604215.localdomain podman[310148]: 2026-02-01 09:56:19.868009005 +0000 UTC m=+0.082726874 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:56:19 np0005604215.localdomain podman[310148]: 2026-02-01 09:56:19.876762656 +0000 UTC m=+0.091480515 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:56:19 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:56:19 np0005604215.localdomain podman[310147]: 2026-02-01 09:56:19.918485899 +0000 UTC m=+0.134269372 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:56:19 np0005604215.localdomain podman[310147]: 2026-02-01 09:56:19.954238507 +0000 UTC m=+0.170022010 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:56:19 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:56:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e138 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v260: 177 pgs: 177 active+clean; 145 MiB data, 764 MiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 2.1 KiB/s wr, 67 op/s
Feb 01 09:56:20 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1575583271' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:56:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:21.114 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:56:21
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['images', 'manila_metadata', '.mgr', 'backups', 'vms', 'manila_data', 'volumes']
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 09:56:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e139 e139: 6 total, 6 up, 6 in
Feb 01 09:56:21 np0005604215.localdomain ceph-mon[298604]: pgmap v260: 177 pgs: 177 active+clean; 145 MiB data, 764 MiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 2.1 KiB/s wr, 67 op/s
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.443522589800856e-05 quantized to 32 (current 32)
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.7263051367950866e-06 of space, bias 4.0, pg target 0.002170138888888889 quantized to 16 (current 16)
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:56:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:56:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e140 e140: 6 total, 6 up, 6 in
Feb 01 09:56:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:21.763 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:21 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:21.974 259225 INFO neutron.agent.linux.ip_lib [None req-6674391d-ef0d-468c-9ac3-43050e859039 - - - - - -] Device tap89a67c8f-ab cannot be used as it has no MAC address
Feb 01 09:56:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:21.994 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:21 np0005604215.localdomain kernel: device tap89a67c8f-ab entered promiscuous mode
Feb 01 09:56:22 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:22.000 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:22 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:22Z|00160|binding|INFO|Claiming lport 89a67c8f-abeb-44ba-987b-710ed5812b98 for this chassis.
Feb 01 09:56:22 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:22Z|00161|binding|INFO|89a67c8f-abeb-44ba-987b-710ed5812b98: Claiming unknown
Feb 01 09:56:22 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939782.0025] manager: (tap89a67c8f-ab): new Generic device (/org/freedesktop/NetworkManager/Devices/32)
Feb 01 09:56:22 np0005604215.localdomain systemd-udevd[310203]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:56:22 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap89a67c8f-ab: No such device
Feb 01 09:56:22 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:22Z|00162|binding|INFO|Setting lport 89a67c8f-abeb-44ba-987b-710ed5812b98 ovn-installed in OVS
Feb 01 09:56:22 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:22.037 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:22 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap89a67c8f-ab: No such device
Feb 01 09:56:22 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:22.040 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:22 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap89a67c8f-ab: No such device
Feb 01 09:56:22 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap89a67c8f-ab: No such device
Feb 01 09:56:22 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap89a67c8f-ab: No such device
Feb 01 09:56:22 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap89a67c8f-ab: No such device
Feb 01 09:56:22 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap89a67c8f-ab: No such device
Feb 01 09:56:22 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap89a67c8f-ab: No such device
Feb 01 09:56:22 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:22.074 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:22 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:22.101 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:22 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:22Z|00163|binding|INFO|Setting lport 89a67c8f-abeb-44ba-987b-710ed5812b98 up in Southbound
Feb 01 09:56:22 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:22.117 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-c0dbb3ef-d632-48b0-b256-d985cf33ea92', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0dbb3ef-d632-48b0-b256-d985cf33ea92', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7955782-fea5-4e19-bc74-89fb26d9b2eb, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=89a67c8f-abeb-44ba-987b-710ed5812b98) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:22 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:22.119 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 89a67c8f-abeb-44ba-987b-710ed5812b98 in datapath c0dbb3ef-d632-48b0-b256-d985cf33ea92 bound to our chassis
Feb 01 09:56:22 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:22.121 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0dbb3ef-d632-48b0-b256-d985cf33ea92 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:56:22 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:22.122 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[a0398152-1a9d-4763-8df3-431b02624317]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:56:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v263: 177 pgs: 177 active+clean; 145 MiB data, 764 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 2.3 KiB/s wr, 73 op/s
Feb 01 09:56:22 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:22.296 2 INFO neutron.agent.securitygroups_rpc [None req-a183bb9b-36ef-42c1-85bf-6ec8456cdf42 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:56:22 np0005604215.localdomain ceph-mon[298604]: osdmap e139: 6 total, 6 up, 6 in
Feb 01 09:56:22 np0005604215.localdomain ceph-mon[298604]: osdmap e140: 6 total, 6 up, 6 in
Feb 01 09:56:22 np0005604215.localdomain ceph-mon[298604]: pgmap v263: 177 pgs: 177 active+clean; 145 MiB data, 764 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 2.3 KiB/s wr, 73 op/s
Feb 01 09:56:22 np0005604215.localdomain podman[310274]: 
Feb 01 09:56:22 np0005604215.localdomain podman[310274]: 2026-02-01 09:56:22.917119162 +0000 UTC m=+0.092765266 container create d5797bd4be489b76ff42f88c466cd274615e933da5e59a8088d05895a4206f61 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dbb3ef-d632-48b0-b256-d985cf33ea92, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 01 09:56:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:56:22 np0005604215.localdomain systemd[1]: Started libpod-conmon-d5797bd4be489b76ff42f88c466cd274615e933da5e59a8088d05895a4206f61.scope.
Feb 01 09:56:22 np0005604215.localdomain podman[310274]: 2026-02-01 09:56:22.868115353 +0000 UTC m=+0.043761467 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:56:22 np0005604215.localdomain systemd[1]: tmp-crun.yanSzL.mount: Deactivated successfully.
Feb 01 09:56:22 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:56:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:56:22 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ede7d4779104218ea621e19c563a4f5da37bdff6220daf39da5c830ef37d9d02/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:56:23 np0005604215.localdomain podman[310274]: 2026-02-01 09:56:23.002484877 +0000 UTC m=+0.178130961 container init d5797bd4be489b76ff42f88c466cd274615e933da5e59a8088d05895a4206f61 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dbb3ef-d632-48b0-b256-d985cf33ea92, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 01 09:56:23 np0005604215.localdomain dnsmasq[310310]: started, version 2.85 cachesize 150
Feb 01 09:56:23 np0005604215.localdomain dnsmasq[310310]: DNS service limited to local subnets
Feb 01 09:56:23 np0005604215.localdomain dnsmasq[310310]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:56:23 np0005604215.localdomain dnsmasq[310310]: warning: no upstream servers configured
Feb 01 09:56:23 np0005604215.localdomain dnsmasq-dhcp[310310]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 01 09:56:23 np0005604215.localdomain dnsmasq[310310]: read /var/lib/neutron/dhcp/c0dbb3ef-d632-48b0-b256-d985cf33ea92/addn_hosts - 0 addresses
Feb 01 09:56:23 np0005604215.localdomain dnsmasq-dhcp[310310]: read /var/lib/neutron/dhcp/c0dbb3ef-d632-48b0-b256-d985cf33ea92/host
Feb 01 09:56:23 np0005604215.localdomain dnsmasq-dhcp[310310]: read /var/lib/neutron/dhcp/c0dbb3ef-d632-48b0-b256-d985cf33ea92/opts
Feb 01 09:56:23 np0005604215.localdomain podman[310288]: 2026-02-01 09:56:23.04646148 +0000 UTC m=+0.088750801 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, architecture=x86_64, version=9.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 09:56:23 np0005604215.localdomain podman[310274]: 2026-02-01 09:56:23.061927388 +0000 UTC m=+0.237573472 container start d5797bd4be489b76ff42f88c466cd274615e933da5e59a8088d05895a4206f61 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dbb3ef-d632-48b0-b256-d985cf33ea92, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:56:23 np0005604215.localdomain podman[310288]: 2026-02-01 09:56:23.087688066 +0000 UTC m=+0.129977397 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.buildah.version=1.33.7, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.7, release=1769056855, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc.)
Feb 01 09:56:23 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:56:23 np0005604215.localdomain podman[310302]: 2026-02-01 09:56:23.131969099 +0000 UTC m=+0.129975718 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:56:23 np0005604215.localdomain podman[310302]: 2026-02-01 09:56:23.165986422 +0000 UTC m=+0.163993061 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Feb 01 09:56:23 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:56:23 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:23.444 259225 INFO neutron.agent.dhcp.agent [None req-aefb7c01-4e9b-4934-bf70-fd73a85dda45 - - - - - -] DHCP configuration for ports {'3ffbce15-2efc-45cc-abbd-0d8f25ff7bdf'} is completed
Feb 01 09:56:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e141 e141: 6 total, 6 up, 6 in
Feb 01 09:56:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v265: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 2.7 KiB/s wr, 31 op/s
Feb 01 09:56:24 np0005604215.localdomain ceph-mon[298604]: osdmap e141: 6 total, 6 up, 6 in
Feb 01 09:56:24 np0005604215.localdomain ceph-mon[298604]: pgmap v265: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 22 KiB/s rd, 2.7 KiB/s wr, 31 op/s
Feb 01 09:56:25 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:25.103 2 INFO neutron.agent.securitygroups_rpc [None req-f4397a72-704c-448d-a51f-22a40616a177 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:56:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e141 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:25.271 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e142 e142: 6 total, 6 up, 6 in
Feb 01 09:56:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:26.116 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v267: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 3.3 KiB/s wr, 38 op/s
Feb 01 09:56:26 np0005604215.localdomain ceph-mon[298604]: osdmap e142: 6 total, 6 up, 6 in
Feb 01 09:56:26 np0005604215.localdomain ceph-mon[298604]: pgmap v267: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 3.3 KiB/s wr, 38 op/s
Feb 01 09:56:26 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2961240377' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:56:26 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2961240377' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:56:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:26.765 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:26 np0005604215.localdomain dnsmasq[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/addn_hosts - 0 addresses
Feb 01 09:56:26 np0005604215.localdomain dnsmasq-dhcp[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/host
Feb 01 09:56:26 np0005604215.localdomain dnsmasq-dhcp[310071]: read /var/lib/neutron/dhcp/6d00e50d-ad20-4c3b-83fb-c0f039efd634/opts
Feb 01 09:56:26 np0005604215.localdomain podman[310347]: 2026-02-01 09:56:26.89481086 +0000 UTC m=+0.060179225 container kill cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d00e50d-ad20-4c3b-83fb-c0f039efd634, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 01 09:56:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:27.342 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:27 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:27Z|00164|binding|INFO|Releasing lport 71466265-5f83-483a-a896-41c28a392e73 from this chassis (sb_readonly=0)
Feb 01 09:56:27 np0005604215.localdomain kernel: device tap71466265-5f left promiscuous mode
Feb 01 09:56:27 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:27Z|00165|binding|INFO|Setting lport 71466265-5f83-483a-a896-41c28a392e73 down in Southbound
Feb 01 09:56:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:27.358 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:27 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:27.483 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:2::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-6d00e50d-ad20-4c3b-83fb-c0f039efd634', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-6d00e50d-ad20-4c3b-83fb-c0f039efd634', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '904cc8942364443bb4c4a4017bb1e647', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd93c82b-505e-4fa1-935a-07bfb46ac2bf, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=71466265-5f83-483a-a896-41c28a392e73) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:27 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:27.485 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 71466265-5f83-483a-a896-41c28a392e73 in datapath 6d00e50d-ad20-4c3b-83fb-c0f039efd634 unbound from our chassis
Feb 01 09:56:27 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:27.487 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 6d00e50d-ad20-4c3b-83fb-c0f039efd634 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:56:27 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:27.488 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[6cae717b-6b33-42c9-9dc1-ccb3b4692c79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:56:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v268: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 70 KiB/s rd, 4.5 KiB/s wr, 95 op/s
Feb 01 09:56:28 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:28.356 2 INFO neutron.agent.securitygroups_rpc [None req-abdb9ca6-56bb-47f8-92cb-3bfb04a52114 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 01 09:56:29 np0005604215.localdomain ceph-mon[298604]: pgmap v268: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 70 KiB/s rd, 4.5 KiB/s wr, 95 op/s
Feb 01 09:56:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e143 e143: 6 total, 6 up, 6 in
Feb 01 09:56:29 np0005604215.localdomain dnsmasq[310071]: exiting on receipt of SIGTERM
Feb 01 09:56:29 np0005604215.localdomain podman[310387]: 2026-02-01 09:56:29.797446869 +0000 UTC m=+0.063611622 container kill cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d00e50d-ad20-4c3b-83fb-c0f039efd634, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 01 09:56:29 np0005604215.localdomain systemd[1]: libpod-cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69.scope: Deactivated successfully.
Feb 01 09:56:29 np0005604215.localdomain podman[310402]: 2026-02-01 09:56:29.871788792 +0000 UTC m=+0.060114223 container died cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d00e50d-ad20-4c3b-83fb-c0f039efd634, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 01 09:56:29 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69-userdata-shm.mount: Deactivated successfully.
Feb 01 09:56:29 np0005604215.localdomain podman[310402]: 2026-02-01 09:56:29.899451319 +0000 UTC m=+0.087776710 container cleanup cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d00e50d-ad20-4c3b-83fb-c0f039efd634, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:56:29 np0005604215.localdomain systemd[1]: libpod-conmon-cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69.scope: Deactivated successfully.
Feb 01 09:56:29 np0005604215.localdomain podman[310407]: 2026-02-01 09:56:29.953402501 +0000 UTC m=+0.126826381 container remove cbf4b698789784ecb7f952b9bb712e1282ae0a87e7cfdf356f96e5df169dda69 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-6d00e50d-ad20-4c3b-83fb-c0f039efd634, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:56:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:56:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:56:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:56:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157173 "" "Go-http-client/1.1"
Feb 01 09:56:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:56:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18766 "" "Go-http-client/1.1"
Feb 01 09:56:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e143 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:30 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:30.152 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:30 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:30.154 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 01 09:56:30 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:30.157 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:56:30 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:30.159 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[6d38de4a-12c0-46ea-a6d9-d2d8d843a69a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:56:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v270: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 2.1 KiB/s wr, 67 op/s
Feb 01 09:56:30 np0005604215.localdomain ceph-mon[298604]: osdmap e143: 6 total, 6 up, 6 in
Feb 01 09:56:30 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-080106a4e4259907a6b8268b5326d948d1c07084ef37858b7872beeabb761334-merged.mount: Deactivated successfully.
Feb 01 09:56:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:31.038 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:31 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:31.039 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:31 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:31.042 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:56:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:31.118 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:31 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:31.176 259225 INFO neutron.agent.dhcp.agent [None req-30ec3d39-beaa-4a68-857b-7802b4190f44 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:56:31 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:31.178 259225 INFO neutron.agent.dhcp.agent [None req-30ec3d39-beaa-4a68-857b-7802b4190f44 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:56:31 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d6d00e50d\x2dad20\x2d4c3b\x2d83fb\x2dc0f039efd634.mount: Deactivated successfully.
Feb 01 09:56:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:56:31 np0005604215.localdomain podman[310431]: 2026-02-01 09:56:31.286722124 +0000 UTC m=+0.083863560 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:56:31 np0005604215.localdomain podman[310431]: 2026-02-01 09:56:31.302185663 +0000 UTC m=+0.099327099 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute)
Feb 01 09:56:31 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:56:31 np0005604215.localdomain ceph-mon[298604]: pgmap v270: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 50 KiB/s rd, 2.1 KiB/s wr, 67 op/s
Feb 01 09:56:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e144 e144: 6 total, 6 up, 6 in
Feb 01 09:56:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:56:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:56:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:56:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:56:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:56:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:56:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e145 e145: 6 total, 6 up, 6 in
Feb 01 09:56:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:31.798 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v273: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 2.3 KiB/s wr, 73 op/s
Feb 01 09:56:32 np0005604215.localdomain ceph-mon[298604]: osdmap e144: 6 total, 6 up, 6 in
Feb 01 09:56:32 np0005604215.localdomain ceph-mon[298604]: osdmap e145: 6 total, 6 up, 6 in
Feb 01 09:56:32 np0005604215.localdomain ceph-mon[298604]: pgmap v273: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 2.3 KiB/s wr, 73 op/s
Feb 01 09:56:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v274: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 7.2 KiB/s wr, 78 op/s
Feb 01 09:56:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:56:34 np0005604215.localdomain podman[310448]: 2026-02-01 09:56:34.85319514 +0000 UTC m=+0.074924592 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:56:34 np0005604215.localdomain podman[310448]: 2026-02-01 09:56:34.862090875 +0000 UTC m=+0.083820337 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:56:34 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:56:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e145 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e146 e146: 6 total, 6 up, 6 in
Feb 01 09:56:35 np0005604215.localdomain ceph-mon[298604]: pgmap v274: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 7.2 KiB/s wr, 78 op/s
Feb 01 09:56:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/498295892' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:56:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/498295892' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:56:35 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:35.891 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:56:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:36.121 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v276: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 7.2 KiB/s wr, 78 op/s
Feb 01 09:56:36 np0005604215.localdomain ceph-mon[298604]: osdmap e146: 6 total, 6 up, 6 in
Feb 01 09:56:36 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1641231477' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:56:36 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1641231477' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:56:36 np0005604215.localdomain sudo[310471]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:56:36 np0005604215.localdomain sudo[310471]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:56:36 np0005604215.localdomain sudo[310471]: pam_unix(sudo:session): session closed for user root
Feb 01 09:56:36 np0005604215.localdomain sudo[310489]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:56:36 np0005604215.localdomain sudo[310489]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:56:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:36.847 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:36.863 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:37 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:37.020 2 INFO neutron.agent.securitygroups_rpc [None req-ea765bd5-9cd7-4b20-a560-8c3da0273449 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 01 09:56:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:37.044 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:56:37 np0005604215.localdomain sudo[310489]: pam_unix(sudo:session): session closed for user root
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:56:37 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev ed7849d2-2d73-42b0-82c7-88ba13ad4b5f (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:56:37 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev ed7849d2-2d73-42b0-82c7-88ba13ad4b5f (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:56:37 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event ed7849d2-2d73-42b0-82c7-88ba13ad4b5f (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: pgmap v276: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 7.2 KiB/s wr, 78 op/s
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:56:37 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e147 e147: 6 total, 6 up, 6 in
Feb 01 09:56:37 np0005604215.localdomain sudo[310538]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:56:37 np0005604215.localdomain sudo[310538]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:56:37 np0005604215.localdomain sudo[310538]: pam_unix(sudo:session): session closed for user root
Feb 01 09:56:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v278: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 11 KiB/s wr, 110 op/s
Feb 01 09:56:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e148 e148: 6 total, 6 up, 6 in
Feb 01 09:56:38 np0005604215.localdomain ceph-mon[298604]: osdmap e147: 6 total, 6 up, 6 in
Feb 01 09:56:38 np0005604215.localdomain ceph-mon[298604]: pgmap v278: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 77 KiB/s rd, 11 KiB/s wr, 110 op/s
Feb 01 09:56:38 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:38.630 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:38 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:38.633 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 01 09:56:38 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:38.636 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:56:38 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:38.637 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[c8464432-c746-4302-8a61-ee43795e0c12]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:56:39 np0005604215.localdomain ceph-mon[298604]: osdmap e148: 6 total, 6 up, 6 in
Feb 01 09:56:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e148 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v280: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 4.8 KiB/s wr, 42 op/s
Feb 01 09:56:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e149 e149: 6 total, 6 up, 6 in
Feb 01 09:56:40 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3715535998' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:56:40 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3715535998' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:56:40 np0005604215.localdomain ceph-mon[298604]: pgmap v280: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 4.8 KiB/s wr, 42 op/s
Feb 01 09:56:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:41.124 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:41 np0005604215.localdomain ceph-mon[298604]: osdmap e149: 6 total, 6 up, 6 in
Feb 01 09:56:41 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:41.535 2 INFO neutron.agent.securitygroups_rpc [None req-fb00d927-fa6d-4c8a-857b-3eb5803ded56 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:56:41 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:56:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:56:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:41.774 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:56:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:41.774 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:56:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:41.774 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:56:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:41.889 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v282: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 4.8 KiB/s wr, 42 op/s
Feb 01 09:56:42 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:42.589 2 INFO neutron.agent.securitygroups_rpc [None req-868efe44-cb2f-4cd4-8b32-db45306b68ea 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 01 09:56:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:56:42 np0005604215.localdomain ceph-mon[298604]: pgmap v282: 177 pgs: 177 active+clean; 145 MiB data, 765 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 4.8 KiB/s wr, 42 op/s
Feb 01 09:56:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e150 e150: 6 total, 6 up, 6 in
Feb 01 09:56:43 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:43.577 2 INFO neutron.agent.securitygroups_rpc [None req-2270150a-743e-4e11-8e45-671bacf25871 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:56:43 np0005604215.localdomain ceph-mon[298604]: osdmap e150: 6 total, 6 up, 6 in
Feb 01 09:56:43 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e151 e151: 6 total, 6 up, 6 in
Feb 01 09:56:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v285: 177 pgs: 177 active+clean; 145 MiB data, 770 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 9.3 KiB/s wr, 104 op/s
Feb 01 09:56:44 np0005604215.localdomain ceph-mon[298604]: osdmap e151: 6 total, 6 up, 6 in
Feb 01 09:56:44 np0005604215.localdomain ceph-mon[298604]: pgmap v285: 177 pgs: 177 active+clean; 145 MiB data, 770 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 9.3 KiB/s wr, 104 op/s
Feb 01 09:56:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e151 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e152 e152: 6 total, 6 up, 6 in
Feb 01 09:56:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:46.125 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v287: 177 pgs: 177 active+clean; 145 MiB data, 770 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 9.3 KiB/s wr, 104 op/s
Feb 01 09:56:46 np0005604215.localdomain ceph-mon[298604]: osdmap e152: 6 total, 6 up, 6 in
Feb 01 09:56:46 np0005604215.localdomain ceph-mon[298604]: pgmap v287: 177 pgs: 177 active+clean; 145 MiB data, 770 MiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 9.3 KiB/s wr, 104 op/s
Feb 01 09:56:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e153 e153: 6 total, 6 up, 6 in
Feb 01 09:56:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:46.926 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:47 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e154 e154: 6 total, 6 up, 6 in
Feb 01 09:56:47 np0005604215.localdomain ceph-mon[298604]: osdmap e153: 6 total, 6 up, 6 in
Feb 01 09:56:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v290: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 146 KiB/s rd, 11 KiB/s wr, 200 op/s
Feb 01 09:56:48 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:48.539 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:48 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:48.541 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 01 09:56:48 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:48.544 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:56:48 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:48.545 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[e7390015-f420-4dad-9d87-2665f41908a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:56:48 np0005604215.localdomain ceph-mon[298604]: osdmap e154: 6 total, 6 up, 6 in
Feb 01 09:56:48 np0005604215.localdomain ceph-mon[298604]: pgmap v290: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 146 KiB/s rd, 11 KiB/s wr, 200 op/s
Feb 01 09:56:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e155 e155: 6 total, 6 up, 6 in
Feb 01 09:56:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:49.131 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:56:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:49.131 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:56:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:49.132 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:56:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:49.241 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:56:49 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:49.800 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:56:49 np0005604215.localdomain ceph-mon[298604]: osdmap e155: 6 total, 6 up, 6 in
Feb 01 09:56:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e155 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:50.206 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:56:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v292: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 147 KiB/s rd, 11 KiB/s wr, 202 op/s
Feb 01 09:56:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:56:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:56:50 np0005604215.localdomain ceph-mon[298604]: pgmap v292: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 147 KiB/s rd, 11 KiB/s wr, 202 op/s
Feb 01 09:56:50 np0005604215.localdomain podman[310557]: 2026-02-01 09:56:50.88350384 +0000 UTC m=+0.085939574 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:56:50 np0005604215.localdomain podman[310556]: 2026-02-01 09:56:50.938454142 +0000 UTC m=+0.142543928 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 01 09:56:50 np0005604215.localdomain podman[310557]: 2026-02-01 09:56:50.953125837 +0000 UTC m=+0.155561581 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:56:50 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:56:50 np0005604215.localdomain podman[310556]: 2026-02-01 09:56:50.97548681 +0000 UTC m=+0.179576626 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:56:51 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:56:51 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:51.104 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '10', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:51 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:51.107 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 01 09:56:51 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:51.110 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:56:51 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:51.111 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[008ce95e-676c-4056-a059-e95116233bd8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:56:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:51.128 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:56:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:56:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:56:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:56:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:56:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:56:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e156 e156: 6 total, 6 up, 6 in
Feb 01 09:56:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:51.929 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:52.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:56:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v294: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 121 KiB/s rd, 9.1 KiB/s wr, 166 op/s
Feb 01 09:56:52 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:52.530 259225 INFO neutron.agent.linux.ip_lib [None req-7b487ffd-5635-401e-bacc-6723fac1006e - - - - - -] Device tap0ff05a29-3c cannot be used as it has no MAC address
Feb 01 09:56:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:52.546 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:52 np0005604215.localdomain kernel: device tap0ff05a29-3c entered promiscuous mode
Feb 01 09:56:52 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:52Z|00166|binding|INFO|Claiming lport 0ff05a29-3cc7-4c1a-a005-225d700300ca for this chassis.
Feb 01 09:56:52 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:52Z|00167|binding|INFO|0ff05a29-3cc7-4c1a-a005-225d700300ca: Claiming unknown
Feb 01 09:56:52 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939812.5523] manager: (tap0ff05a29-3c): new Generic device (/org/freedesktop/NetworkManager/Devices/33)
Feb 01 09:56:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:52.552 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:52 np0005604215.localdomain systemd-udevd[310612]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:56:52 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:52.571 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-c3e71f40-156c-4217-bedf-836f04a8f728', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3e71f40-156c-4217-bedf-836f04a8f728', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ff200d66c230435098f5a0489bf1e8f7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4bd8115-ffb2-4415-a799-f41a6c9021b2, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=0ff05a29-3cc7-4c1a-a005-225d700300ca) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:52 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:52.572 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 0ff05a29-3cc7-4c1a-a005-225d700300ca in datapath c3e71f40-156c-4217-bedf-836f04a8f728 bound to our chassis
Feb 01 09:56:52 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:52.573 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c3e71f40-156c-4217-bedf-836f04a8f728 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:56:52 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:52.573 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[93052da0-ea8a-4b5d-92f1-b69ac8f8c43b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:56:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap0ff05a29-3c: No such device
Feb 01 09:56:52 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:52Z|00168|binding|INFO|Setting lport 0ff05a29-3cc7-4c1a-a005-225d700300ca ovn-installed in OVS
Feb 01 09:56:52 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:52Z|00169|binding|INFO|Setting lport 0ff05a29-3cc7-4c1a-a005-225d700300ca up in Southbound
Feb 01 09:56:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap0ff05a29-3c: No such device
Feb 01 09:56:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:52.582 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap0ff05a29-3c: No such device
Feb 01 09:56:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap0ff05a29-3c: No such device
Feb 01 09:56:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap0ff05a29-3c: No such device
Feb 01 09:56:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap0ff05a29-3c: No such device
Feb 01 09:56:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap0ff05a29-3c: No such device
Feb 01 09:56:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap0ff05a29-3c: No such device
Feb 01 09:56:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:52.604 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:52.628 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:52 np0005604215.localdomain ceph-mon[298604]: osdmap e156: 6 total, 6 up, 6 in
Feb 01 09:56:52 np0005604215.localdomain ceph-mon[298604]: pgmap v294: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 121 KiB/s rd, 9.1 KiB/s wr, 166 op/s
Feb 01 09:56:52 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:52.736 2 INFO neutron.agent.securitygroups_rpc [None req-28352136-f461-4efb-990d-d0ac566ee992 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:56:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:52.972 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.118 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.118 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.119 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.119 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.120 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:56:53 np0005604215.localdomain podman[310703]: 2026-02-01 09:56:53.468055412 +0000 UTC m=+0.113186387 container create 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 01 09:56:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:56:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:56:53 np0005604215.localdomain systemd[1]: Started libpod-conmon-8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51.scope.
Feb 01 09:56:53 np0005604215.localdomain podman[310703]: 2026-02-01 09:56:53.409505238 +0000 UTC m=+0.054636273 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:56:53 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:56:53 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6c22aff628e6bc84ba432acd1a0ec47a0f890d608bcc6d6b65fb5e1bf052ca32/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:56:53 np0005604215.localdomain podman[310716]: 2026-02-01 09:56:53.578589318 +0000 UTC m=+0.075346156 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, io.buildah.version=1.33.7, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, distribution-scope=public, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, config_id=openstack_network_exporter, version=9.7, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=)
Feb 01 09:56:53 np0005604215.localdomain podman[310716]: 2026-02-01 09:56:53.591738155 +0000 UTC m=+0.088495023 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9/ubi-minimal)
Feb 01 09:56:53 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:56:53 np0005604215.localdomain podman[310718]: 2026-02-01 09:56:53.646572814 +0000 UTC m=+0.141776114 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 01 09:56:53 np0005604215.localdomain podman[310703]: 2026-02-01 09:56:53.653186329 +0000 UTC m=+0.298317264 container init 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 01 09:56:53 np0005604215.localdomain podman[310703]: 2026-02-01 09:56:53.662423386 +0000 UTC m=+0.307554311 container start 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 01 09:56:53 np0005604215.localdomain dnsmasq[310765]: started, version 2.85 cachesize 150
Feb 01 09:56:53 np0005604215.localdomain dnsmasq[310765]: DNS service limited to local subnets
Feb 01 09:56:53 np0005604215.localdomain dnsmasq[310765]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:56:53 np0005604215.localdomain dnsmasq[310765]: warning: no upstream servers configured
Feb 01 09:56:53 np0005604215.localdomain dnsmasq-dhcp[310765]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:56:53 np0005604215.localdomain dnsmasq[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/addn_hosts - 0 addresses
Feb 01 09:56:53 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/host
Feb 01 09:56:53 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/opts
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.669 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.550s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:56:53 np0005604215.localdomain podman[310718]: 2026-02-01 09:56:53.677053499 +0000 UTC m=+0.172256839 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:56:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/794858011' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:56:53 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:56:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:53.734 259225 INFO neutron.agent.linux.ip_lib [None req-bcfb9cac-69bb-453a-8579-2cf0d4687405 - - - - - -] Device tapdf480a46-ff cannot be used as it has no MAC address
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.759 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:53 np0005604215.localdomain kernel: device tapdf480a46-ff entered promiscuous mode
Feb 01 09:56:53 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939813.7660] manager: (tapdf480a46-ff): new Generic device (/org/freedesktop/NetworkManager/Devices/34)
Feb 01 09:56:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:53Z|00170|binding|INFO|Claiming lport df480a46-ffeb-469e-8528-f16d97851fd4 for this chassis.
Feb 01 09:56:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:53Z|00171|binding|INFO|df480a46-ffeb-469e-8528-f16d97851fd4: Claiming unknown
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.766 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapdf480a46-ff: No such device
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.796 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:53Z|00172|binding|INFO|Setting lport df480a46-ffeb-469e-8528-f16d97851fd4 ovn-installed in OVS
Feb 01 09:56:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapdf480a46-ff: No such device
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.799 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.800 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapdf480a46-ff: No such device
Feb 01 09:56:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapdf480a46-ff: No such device
Feb 01 09:56:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapdf480a46-ff: No such device
Feb 01 09:56:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapdf480a46-ff: No such device
Feb 01 09:56:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapdf480a46-ff: No such device
Feb 01 09:56:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:53Z|00173|binding|INFO|Setting lport df480a46-ffeb-469e-8528-f16d97851fd4 up in Southbound
Feb 01 09:56:53 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapdf480a46-ff: No such device
Feb 01 09:56:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:53.833 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '09d03f879db542be8bf676bafcc9ce36', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d1a8906-fc18-4fe5-9368-552a4dec9770, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=df480a46-ffeb-469e-8528-f16d97851fd4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:53.836 158655 INFO neutron.agent.ovn.metadata.agent [-] Port df480a46-ffeb-469e-8528-f16d97851fd4 in datapath 5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09 bound to our chassis
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.840 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:53.841 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port 595dc249-884d-44f8-8888-d36a32f65dc4 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:56:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:53.842 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:56:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:53.843 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[4f96d9af-bf58-4615-b18e-6fd0b8d43927]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:56:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:53.862 259225 INFO neutron.agent.dhcp.agent [None req-46896adc-d14d-4e43-b82c-1100afb88bbe - - - - - -] DHCP configuration for ports {'2e5f5375-a62e-44d2-a494-38636ec2aecf'} is completed
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.866 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.877 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.878 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11589MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.879 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.879 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.922 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:56:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:53.922 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:56:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v295: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 56 KiB/s rd, 4.3 KiB/s wr, 77 op/s
Feb 01 09:56:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:54.258 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:56:54 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:54.389 2 INFO neutron.agent.securitygroups_rpc [None req-beca9d48-bb68-446a-9132-2fee37d11230 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:56:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:56:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2613574519' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:56:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:54.681 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.423s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:56:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:54.688 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:56:54 np0005604215.localdomain ceph-mon[298604]: pgmap v295: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 56 KiB/s rd, 4.3 KiB/s wr, 77 op/s
Feb 01 09:56:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3247325699' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:56:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2613574519' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:56:54 np0005604215.localdomain podman[310861]: 
Feb 01 09:56:54 np0005604215.localdomain podman[310861]: 2026-02-01 09:56:54.70533395 +0000 UTC m=+0.093057314 container create 2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:56:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:54.711 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:56:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:54.749 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:56:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:54.750 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.871s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:56:54 np0005604215.localdomain systemd[1]: Started libpod-conmon-2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254.scope.
Feb 01 09:56:54 np0005604215.localdomain podman[310861]: 2026-02-01 09:56:54.659896852 +0000 UTC m=+0.047620276 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:56:54 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:56:54 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8b6cde7f4650549dcf14a16d5d08a1cd963e6a89846ce81897519e9e109b5636/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:56:54 np0005604215.localdomain podman[310861]: 2026-02-01 09:56:54.781420427 +0000 UTC m=+0.169143791 container init 2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 09:56:54 np0005604215.localdomain podman[310861]: 2026-02-01 09:56:54.78700874 +0000 UTC m=+0.174732104 container start 2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 01 09:56:54 np0005604215.localdomain dnsmasq[310881]: started, version 2.85 cachesize 150
Feb 01 09:56:54 np0005604215.localdomain dnsmasq[310881]: DNS service limited to local subnets
Feb 01 09:56:54 np0005604215.localdomain dnsmasq[310881]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:56:54 np0005604215.localdomain dnsmasq[310881]: warning: no upstream servers configured
Feb 01 09:56:54 np0005604215.localdomain dnsmasq-dhcp[310881]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:56:54 np0005604215.localdomain dnsmasq[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/addn_hosts - 0 addresses
Feb 01 09:56:54 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/host
Feb 01 09:56:54 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/opts
Feb 01 09:56:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:54.919 259225 INFO neutron.agent.dhcp.agent [None req-56a2cf73-7adf-4064-bd1d-bed08073b81f - - - - - -] DHCP configuration for ports {'6b99df92-9ee2-429d-9ef9-469fa1e443e4'} is completed
Feb 01 09:56:55 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:55.017 259225 INFO neutron.agent.linux.ip_lib [None req-d837ecee-c6d9-4301-8d79-0169529fdbad - - - - - -] Device tap70e8c4ee-b7 cannot be used as it has no MAC address
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.036 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:55 np0005604215.localdomain kernel: device tap70e8c4ee-b7 entered promiscuous mode
Feb 01 09:56:55 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939815.0434] manager: (tap70e8c4ee-b7): new Generic device (/org/freedesktop/NetworkManager/Devices/35)
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.047 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.051 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:55Z|00174|binding|INFO|Claiming lport 70e8c4ee-b7bf-45c9-80c5-43450e09967e for this chassis.
Feb 01 09:56:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:55Z|00175|binding|INFO|70e8c4ee-b7bf-45c9-80c5-43450e09967e: Claiming unknown
Feb 01 09:56:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:55.075 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9279ffc0dc2f48079045ce3d49e21210', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2934a88b-2cb8-43fc-bc4a-0266d2f826b9, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=70e8c4ee-b7bf-45c9-80c5-43450e09967e) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:55.077 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 70e8c4ee-b7bf-45c9-80c5-43450e09967e in datapath 0ac2ccf3-74d8-4f0a-903f-4cf43406d18d bound to our chassis
Feb 01 09:56:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:55Z|00176|binding|INFO|Setting lport 70e8c4ee-b7bf-45c9-80c5-43450e09967e ovn-installed in OVS
Feb 01 09:56:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:56:55Z|00177|binding|INFO|Setting lport 70e8c4ee-b7bf-45c9-80c5-43450e09967e up in Southbound
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.079 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:55.081 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 0ac2ccf3-74d8-4f0a-903f-4cf43406d18d or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.082 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:55 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:55.082 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[1f7245f5-6413-44e2-b0f5-d15659330028]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.115 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e156 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.140 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:55 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:55.254 2 INFO neutron.agent.securitygroups_rpc [None req-4f603dff-697a-4023-bd41-ff0e5bb72114 0662eb14260a4e0584613789ed9c9820 ec2f419434374ceeb2aabac212e109be - - default default] Security group member updated ['e8a8d0ce-a79e-4888-bdec-0f79f8d34440']
Feb 01 09:56:55 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:55.406 2 INFO neutron.agent.securitygroups_rpc [None req-4f603dff-697a-4023-bd41-ff0e5bb72114 0662eb14260a4e0584613789ed9c9820 ec2f419434374ceeb2aabac212e109be - - default default] Security group member updated ['e8a8d0ce-a79e-4888-bdec-0f79f8d34440']
Feb 01 09:56:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2923838030' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.752 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.752 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.753 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.753 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:56:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:55.753 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:56:55 np0005604215.localdomain podman[310945]: 
Feb 01 09:56:55 np0005604215.localdomain podman[310945]: 2026-02-01 09:56:55.991516881 +0000 UTC m=+0.096314074 container create 6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:56:56 np0005604215.localdomain systemd[1]: Started libpod-conmon-6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c.scope.
Feb 01 09:56:56 np0005604215.localdomain podman[310945]: 2026-02-01 09:56:55.938745636 +0000 UTC m=+0.043542879 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:56:56 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:56:56 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a141c608be1b875224d2f7067e777389f3126fc9994644b0ea89131e8d650861/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:56:56 np0005604215.localdomain podman[310945]: 2026-02-01 09:56:56.058382533 +0000 UTC m=+0.163179736 container init 6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 01 09:56:56 np0005604215.localdomain podman[310945]: 2026-02-01 09:56:56.066397082 +0000 UTC m=+0.171194275 container start 6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:56:56 np0005604215.localdomain dnsmasq[310963]: started, version 2.85 cachesize 150
Feb 01 09:56:56 np0005604215.localdomain dnsmasq[310963]: DNS service limited to local subnets
Feb 01 09:56:56 np0005604215.localdomain dnsmasq[310963]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:56:56 np0005604215.localdomain dnsmasq[310963]: warning: no upstream servers configured
Feb 01 09:56:56 np0005604215.localdomain dnsmasq-dhcp[310963]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:56:56 np0005604215.localdomain dnsmasq[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/addn_hosts - 0 addresses
Feb 01 09:56:56 np0005604215.localdomain dnsmasq-dhcp[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/host
Feb 01 09:56:56 np0005604215.localdomain dnsmasq-dhcp[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/opts
Feb 01 09:56:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:56.130 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:56 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:56.174 259225 INFO neutron.agent.dhcp.agent [None req-7e9bc58b-b11d-4511-9264-9335808c921a - - - - - -] DHCP configuration for ports {'0c2a82db-0feb-4b25-b844-2b222d4e123e'} is completed
Feb 01 09:56:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v296: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 3.5 KiB/s wr, 62 op/s
Feb 01 09:56:56 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:56.282 2 INFO neutron.agent.securitygroups_rpc [None req-8fcd2e7b-6ccb-4a1d-8200-e9004b8005a0 0662eb14260a4e0584613789ed9c9820 ec2f419434374ceeb2aabac212e109be - - default default] Security group member updated ['e8a8d0ce-a79e-4888-bdec-0f79f8d34440']
Feb 01 09:56:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e157 e157: 6 total, 6 up, 6 in
Feb 01 09:56:56 np0005604215.localdomain ceph-mon[298604]: pgmap v296: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 3.5 KiB/s wr, 62 op/s
Feb 01 09:56:56 np0005604215.localdomain ceph-mon[298604]: osdmap e157: 6 total, 6 up, 6 in
Feb 01 09:56:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:56.931 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:56:57 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:56:57.427 2 INFO neutron.agent.securitygroups_rpc [None req-1f1c195c-f9d6-4ec8-8caa-a0e049b01499 0662eb14260a4e0584613789ed9c9820 ec2f419434374ceeb2aabac212e109be - - default default] Security group member updated ['e8a8d0ce-a79e-4888-bdec-0f79f8d34440']
Feb 01 09:56:57 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:56:57.481 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:56:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2735579090' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:56:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:56:58.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:56:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v298: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 3.5 KiB/s wr, 62 op/s
Feb 01 09:56:58 np0005604215.localdomain ceph-mon[298604]: pgmap v298: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 3.5 KiB/s wr, 62 op/s
Feb 01 09:56:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2392180758' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:56:59 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:59.340 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.2 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '11', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:56:59 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:59.343 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 01 09:56:59 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:59.347 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:56:59 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:56:59.349 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[f3aee197-57cc-4456-b9c8-ad3ab177f1ad]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:57:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:57:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:57:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:57:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 162645 "" "Go-http-client/1.1"
Feb 01 09:57:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:57:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 20191 "" "Go-http-client/1.1"
Feb 01 09:57:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:57:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v299: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.3 KiB/s wr, 58 op/s
Feb 01 09:57:00 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:00.768 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:00Z, description=, device_id=275f8795-f4d8-4210-a735-0c3c1fecd4e3, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322aadf0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322aaeb0>], id=9ddb04c6-905b-4fad-ae32-f2d22672d3a0, ip_allocation=immediate, mac_address=fa:16:3e:61:e3:6e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:56:51Z, description=, dns_domain=, id=0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1711716350-network, port_security_enabled=True, project_id=9279ffc0dc2f48079045ce3d49e21210, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7741, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2064, status=ACTIVE, subnets=['e7952a0f-0365-4883-af11-767cb701197e'], tags=[], tenant_id=9279ffc0dc2f48079045ce3d49e21210, updated_at=2026-02-01T09:56:53Z, vlan_transparent=None, network_id=0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, port_security_enabled=False, project_id=9279ffc0dc2f48079045ce3d49e21210, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2106, status=DOWN, tags=[], tenant_id=9279ffc0dc2f48079045ce3d49e21210, updated_at=2026-02-01T09:57:00Z on network 0ac2ccf3-74d8-4f0a-903f-4cf43406d18d
Feb 01 09:57:00 np0005604215.localdomain podman[310981]: 2026-02-01 09:57:00.969186365 +0000 UTC m=+0.063657864 container kill 6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 01 09:57:00 np0005604215.localdomain dnsmasq[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/addn_hosts - 1 addresses
Feb 01 09:57:00 np0005604215.localdomain dnsmasq-dhcp[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/host
Feb 01 09:57:00 np0005604215.localdomain dnsmasq-dhcp[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/opts
Feb 01 09:57:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:01.131 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:01.211 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:00Z, description=, device_id=3d164407-ec04-4038-9224-5241a42e0a84, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003229d910>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003229dd60>], id=77bd7428-4576-4cae-b8c9-78b8c2c7ed62, ip_allocation=immediate, mac_address=fa:16:3e:22:32:32, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:56:49Z, description=, dns_domain=, id=5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-1557564655, port_security_enabled=True, project_id=09d03f879db542be8bf676bafcc9ce36, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40555, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2038, status=ACTIVE, subnets=['1febdf11-5537-42ea-a6e2-0feca3467664'], tags=[], tenant_id=09d03f879db542be8bf676bafcc9ce36, updated_at=2026-02-01T09:56:51Z, vlan_transparent=None, network_id=5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, port_security_enabled=False, project_id=09d03f879db542be8bf676bafcc9ce36, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2102, status=DOWN, tags=[], tenant_id=09d03f879db542be8bf676bafcc9ce36, updated_at=2026-02-01T09:57:00Z on network 5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09
Feb 01 09:57:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:01.268 259225 INFO neutron.agent.dhcp.agent [None req-2b6fd1ab-101b-498c-b25b-fa58d40768e4 - - - - - -] DHCP configuration for ports {'9ddb04c6-905b-4fad-ae32-f2d22672d3a0'} is completed
Feb 01 09:57:01 np0005604215.localdomain ceph-mon[298604]: pgmap v299: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 42 KiB/s rd, 3.3 KiB/s wr, 58 op/s
Feb 01 09:57:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:01.335 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:00Z, description=, device_id=cbac6734-8188-48fb-9a41-b0c64ce89f6d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032305100>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032305d60>], id=a50c835e-1670-4628-bb4d-c64fd6100a0a, ip_allocation=immediate, mac_address=fa:16:3e:bb:19:44, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:56:48Z, description=, dns_domain=, id=c3e71f40-156c-4217-bedf-836f04a8f728, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-2085708237-network, port_security_enabled=True, project_id=ff200d66c230435098f5a0489bf1e8f7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55349, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2036, status=ACTIVE, subnets=['098397c5-98ca-4cc3-a654-3c1e4a604734'], tags=[], tenant_id=ff200d66c230435098f5a0489bf1e8f7, updated_at=2026-02-01T09:56:50Z, vlan_transparent=None, network_id=c3e71f40-156c-4217-bedf-836f04a8f728, port_security_enabled=False, project_id=ff200d66c230435098f5a0489bf1e8f7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2104, status=DOWN, tags=[], tenant_id=ff200d66c230435098f5a0489bf1e8f7, updated_at=2026-02-01T09:57:00Z on network c3e71f40-156c-4217-bedf-836f04a8f728
Feb 01 09:57:01 np0005604215.localdomain dnsmasq[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/addn_hosts - 1 addresses
Feb 01 09:57:01 np0005604215.localdomain podman[311019]: 2026-02-01 09:57:01.444715869 +0000 UTC m=+0.065330416 container kill 2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:57:01 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/host
Feb 01 09:57:01 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/opts
Feb 01 09:57:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:57:01 np0005604215.localdomain podman[311049]: 2026-02-01 09:57:01.558780433 +0000 UTC m=+0.087238344 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:57:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:57:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:57:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:57:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:57:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:57:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:57:01 np0005604215.localdomain dnsmasq[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/addn_hosts - 1 addresses
Feb 01 09:57:01 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/host
Feb 01 09:57:01 np0005604215.localdomain podman[311062]: 2026-02-01 09:57:01.586655777 +0000 UTC m=+0.078017529 container kill 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:57:01 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/opts
Feb 01 09:57:01 np0005604215.localdomain podman[311049]: 2026-02-01 09:57:01.624210091 +0000 UTC m=+0.152668002 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:57:01 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:57:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:01.682 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:00Z, description=, device_id=275f8795-f4d8-4210-a735-0c3c1fecd4e3, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322ac8e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322c3ee0>], id=9ddb04c6-905b-4fad-ae32-f2d22672d3a0, ip_allocation=immediate, mac_address=fa:16:3e:61:e3:6e, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:56:51Z, description=, dns_domain=, id=0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesActionsTest-1711716350-network, port_security_enabled=True, project_id=9279ffc0dc2f48079045ce3d49e21210, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=7741, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2064, status=ACTIVE, subnets=['e7952a0f-0365-4883-af11-767cb701197e'], tags=[], tenant_id=9279ffc0dc2f48079045ce3d49e21210, updated_at=2026-02-01T09:56:53Z, vlan_transparent=None, network_id=0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, port_security_enabled=False, project_id=9279ffc0dc2f48079045ce3d49e21210, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2106, status=DOWN, tags=[], tenant_id=9279ffc0dc2f48079045ce3d49e21210, updated_at=2026-02-01T09:57:00Z on network 0ac2ccf3-74d8-4f0a-903f-4cf43406d18d
Feb 01 09:57:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:01.779 259225 INFO neutron.agent.dhcp.agent [None req-fefd1530-5c19-427b-9291-6aa45e22d6f1 - - - - - -] DHCP configuration for ports {'77bd7428-4576-4cae-b8c9-78b8c2c7ed62'} is completed
Feb 01 09:57:01 np0005604215.localdomain dnsmasq[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/addn_hosts - 1 addresses
Feb 01 09:57:01 np0005604215.localdomain dnsmasq-dhcp[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/host
Feb 01 09:57:01 np0005604215.localdomain podman[311110]: 2026-02-01 09:57:01.901797052 +0000 UTC m=+0.056043738 container kill 6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:57:01 np0005604215.localdomain dnsmasq-dhcp[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/opts
Feb 01 09:57:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:01.933 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:01 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:01.951 259225 INFO neutron.agent.dhcp.agent [None req-8dc8ecaf-b926-4324-8302-4d607bd08a07 - - - - - -] DHCP configuration for ports {'a50c835e-1670-4628-bb4d-c64fd6100a0a'} is completed
Feb 01 09:57:02 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:02.110 259225 INFO neutron.agent.dhcp.agent [None req-a7ee453f-38a8-40dc-abad-67dbaa0fb6d7 - - - - - -] DHCP configuration for ports {'9ddb04c6-905b-4fad-ae32-f2d22672d3a0'} is completed
Feb 01 09:57:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v300: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 2.8 KiB/s wr, 49 op/s
Feb 01 09:57:02 np0005604215.localdomain ceph-mon[298604]: pgmap v300: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 2.8 KiB/s wr, 49 op/s
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.407 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.408 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:57:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:57:03 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2856613121' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:57:03 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2856613121' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:57:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:03Z|00178|ovn_bfd|INFO|Enabled BFD on interface ovn-2186fb-0
Feb 01 09:57:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:03Z|00179|ovn_bfd|INFO|Enabled BFD on interface ovn-e1cc33-0
Feb 01 09:57:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:03Z|00180|ovn_bfd|INFO|Enabled BFD on interface ovn-45aa31-0
Feb 01 09:57:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:03.934 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:03.947 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:03.955 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:03.957 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:03.998 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:04.026 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v301: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:57:04 np0005604215.localdomain ceph-mon[298604]: pgmap v301: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:57:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:04.929 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.3 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '14', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:57:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:04.931 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 01 09:57:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:04.935 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:57:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:04.936 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[4cbce85e-186a-48f0-a92f-bfd4c6fea436]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:57:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:04.967 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:05.016 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:05 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:05.068 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:00Z, description=, device_id=cbac6734-8188-48fb-9a41-b0c64ce89f6d, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323750d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032375cd0>], id=a50c835e-1670-4628-bb4d-c64fd6100a0a, ip_allocation=immediate, mac_address=fa:16:3e:bb:19:44, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:56:48Z, description=, dns_domain=, id=c3e71f40-156c-4217-bedf-836f04a8f728, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-2085708237-network, port_security_enabled=True, project_id=ff200d66c230435098f5a0489bf1e8f7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55349, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2036, status=ACTIVE, subnets=['098397c5-98ca-4cc3-a654-3c1e4a604734'], tags=[], tenant_id=ff200d66c230435098f5a0489bf1e8f7, updated_at=2026-02-01T09:56:50Z, vlan_transparent=None, network_id=c3e71f40-156c-4217-bedf-836f04a8f728, port_security_enabled=False, project_id=ff200d66c230435098f5a0489bf1e8f7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2104, status=DOWN, tags=[], tenant_id=ff200d66c230435098f5a0489bf1e8f7, updated_at=2026-02-01T09:57:00Z on network c3e71f40-156c-4217-bedf-836f04a8f728
Feb 01 09:57:05 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:05.083 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:00Z, description=, device_id=3d164407-ec04-4038-9224-5241a42e0a84, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323351f0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323358b0>], id=77bd7428-4576-4cae-b8c9-78b8c2c7ed62, ip_allocation=immediate, mac_address=fa:16:3e:22:32:32, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:56:49Z, description=, dns_domain=, id=5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-1557564655, port_security_enabled=True, project_id=09d03f879db542be8bf676bafcc9ce36, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40555, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2038, status=ACTIVE, subnets=['1febdf11-5537-42ea-a6e2-0feca3467664'], tags=[], tenant_id=09d03f879db542be8bf676bafcc9ce36, updated_at=2026-02-01T09:56:51Z, vlan_transparent=None, network_id=5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, port_security_enabled=False, project_id=09d03f879db542be8bf676bafcc9ce36, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2102, status=DOWN, tags=[], tenant_id=09d03f879db542be8bf676bafcc9ce36, updated_at=2026-02-01T09:57:00Z on network 5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09
Feb 01 09:57:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:57:05 np0005604215.localdomain dnsmasq[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/addn_hosts - 1 addresses
Feb 01 09:57:05 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/host
Feb 01 09:57:05 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/opts
Feb 01 09:57:05 np0005604215.localdomain podman[311163]: 2026-02-01 09:57:05.301901744 +0000 UTC m=+0.064568032 container kill 2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:57:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:57:05 np0005604215.localdomain podman[311176]: 2026-02-01 09:57:05.360334565 +0000 UTC m=+0.067145492 container kill 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:57:05 np0005604215.localdomain dnsmasq[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/addn_hosts - 1 addresses
Feb 01 09:57:05 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/host
Feb 01 09:57:05 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/opts
Feb 01 09:57:05 np0005604215.localdomain podman[311187]: 2026-02-01 09:57:05.420358765 +0000 UTC m=+0.090879428 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:57:05 np0005604215.localdomain podman[311187]: 2026-02-01 09:57:05.436620118 +0000 UTC m=+0.107140781 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:57:05 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:57:05 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:05.759 259225 INFO neutron.agent.dhcp.agent [None req-b0932cf5-2af1-4b0b-b4c2-e89359cef3e8 - - - - - -] DHCP configuration for ports {'a50c835e-1670-4628-bb4d-c64fd6100a0a', '77bd7428-4576-4cae-b8c9-78b8c2c7ed62'} is completed
Feb 01 09:57:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:05.804 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:06.133 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v302: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:57:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:06.982 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:07 np0005604215.localdomain ceph-mon[298604]: pgmap v302: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:57:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v303: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 2.4 KiB/s rd, 355 B/s wr, 3 op/s
Feb 01 09:57:08 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:08.345 2 INFO neutron.agent.securitygroups_rpc [None req-6b01ca21-428c-44eb-a29a-b5d48a46bb1b e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:08 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:08.908 2 INFO neutron.agent.securitygroups_rpc [None req-6ad3677b-8ed2-4afa-a1c1-27a49bcc11f3 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 01 09:57:09 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:09.003 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:08Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003229dd60>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003229da30>], id=2eb5ca0e-8f6c-4dad-ae1f-dd77c07bc083, ip_allocation=immediate, mac_address=fa:16:3e:21:db:e0, name=tempest-FloatingIPTestJSON-1252841599, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:56:49Z, description=, dns_domain=, id=5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-FloatingIPTestJSON-1557564655, port_security_enabled=True, project_id=09d03f879db542be8bf676bafcc9ce36, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=40555, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2038, status=ACTIVE, subnets=['1febdf11-5537-42ea-a6e2-0feca3467664'], tags=[], tenant_id=09d03f879db542be8bf676bafcc9ce36, updated_at=2026-02-01T09:56:51Z, vlan_transparent=None, network_id=5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, port_security_enabled=True, project_id=09d03f879db542be8bf676bafcc9ce36, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['7a11b431-4ecd-4461-a4ec-d66a85649c4d'], standard_attr_id=2139, status=DOWN, tags=[], tenant_id=09d03f879db542be8bf676bafcc9ce36, updated_at=2026-02-01T09:57:08Z on network 5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09
Feb 01 09:57:09 np0005604215.localdomain dnsmasq[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/addn_hosts - 2 addresses
Feb 01 09:57:09 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/host
Feb 01 09:57:09 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/opts
Feb 01 09:57:09 np0005604215.localdomain podman[311245]: 2026-02-01 09:57:09.211856934 +0000 UTC m=+0.059498094 container kill 2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 01 09:57:09 np0005604215.localdomain ceph-mon[298604]: pgmap v303: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 2.4 KiB/s rd, 355 B/s wr, 3 op/s
Feb 01 09:57:09 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:09.419 259225 INFO neutron.agent.dhcp.agent [None req-9feb0436-5fa6-482e-bbdf-cd27232ec498 - - - - - -] DHCP configuration for ports {'2eb5ca0e-8f6c-4dad-ae1f-dd77c07bc083'} is completed
Feb 01 09:57:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:57:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v304: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 2.3 KiB/s rd, 341 B/s wr, 3 op/s
Feb 01 09:57:10 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:10.902 2 INFO neutron.agent.securitygroups_rpc [None req-dadc567d-76ba-47fb-b0ef-4dff55f1d7c5 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:10.984 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:11.146 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:11 np0005604215.localdomain ceph-mon[298604]: pgmap v304: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 2.3 KiB/s rd, 341 B/s wr, 3 op/s
Feb 01 09:57:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:11.986 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v305: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 2.3 KiB/s rd, 341 B/s wr, 3 op/s
Feb 01 09:57:12 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:12.280 2 INFO neutron.agent.securitygroups_rpc [None req-3e23b3f8-adbb-495f-8475-dfff7cdaa65d 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 01 09:57:12 np0005604215.localdomain dnsmasq[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/addn_hosts - 1 addresses
Feb 01 09:57:12 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/host
Feb 01 09:57:12 np0005604215.localdomain podman[311282]: 2026-02-01 09:57:12.539411808 +0000 UTC m=+0.056035157 container kill 2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:57:12 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/opts
Feb 01 09:57:12 np0005604215.localdomain ceph-mon[298604]: pgmap v305: 177 pgs: 177 active+clean; 145 MiB data, 788 MiB used, 41 GiB / 42 GiB avail; 2.3 KiB/s rd, 341 B/s wr, 3 op/s
Feb 01 09:57:14 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:14.017 2 INFO neutron.agent.securitygroups_rpc [None req-f8a158f2-98c4-421b-88cc-f32123223f2c d1f5486995624e27afb3baf89715ca46 3cb13cb2ee4e4e329cfbfe3e5fc9c8b9 - - default default] Security group member updated ['13bf76e1-cfaa-4be1-a8fe-9de3506dc4bd']
Feb 01 09:57:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v306: 177 pgs: 177 active+clean; 233 MiB data, 1004 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 7.3 MiB/s wr, 23 op/s
Feb 01 09:57:14 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:14.289 2 INFO neutron.agent.securitygroups_rpc [None req-f8a158f2-98c4-421b-88cc-f32123223f2c d1f5486995624e27afb3baf89715ca46 3cb13cb2ee4e4e329cfbfe3e5fc9c8b9 - - default default] Security group member updated ['13bf76e1-cfaa-4be1-a8fe-9de3506dc4bd']
Feb 01 09:57:14 np0005604215.localdomain dnsmasq[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/addn_hosts - 0 addresses
Feb 01 09:57:14 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/host
Feb 01 09:57:14 np0005604215.localdomain dnsmasq-dhcp[310881]: read /var/lib/neutron/dhcp/5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09/opts
Feb 01 09:57:14 np0005604215.localdomain podman[311320]: 2026-02-01 09:57:14.492773493 +0000 UTC m=+0.064765178 container kill 2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:57:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:14.654 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:14 np0005604215.localdomain kernel: device tapdf480a46-ff left promiscuous mode
Feb 01 09:57:14 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:14Z|00181|binding|INFO|Releasing lport df480a46-ffeb-469e-8528-f16d97851fd4 from this chassis (sb_readonly=0)
Feb 01 09:57:14 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:14Z|00182|binding|INFO|Setting lport df480a46-ffeb-469e-8528-f16d97851fd4 down in Southbound
Feb 01 09:57:14 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:14.665 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '09d03f879db542be8bf676bafcc9ce36', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0d1a8906-fc18-4fe5-9368-552a4dec9770, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=df480a46-ffeb-469e-8528-f16d97851fd4) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:57:14 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:14.667 158655 INFO neutron.agent.ovn.metadata.agent [-] Port df480a46-ffeb-469e-8528-f16d97851fd4 in datapath 5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09 unbound from our chassis
Feb 01 09:57:14 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:14.671 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:57:14 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:14.672 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[48a0c1d7-f913-4e49-8ccd-4c48d7fbc972]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:57:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:14.678 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:57:15 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:15.143 2 INFO neutron.agent.securitygroups_rpc [None req-49f91d64-6065-4341-b2b5-79ecb7af7da0 d1f5486995624e27afb3baf89715ca46 3cb13cb2ee4e4e329cfbfe3e5fc9c8b9 - - default default] Security group member updated ['13bf76e1-cfaa-4be1-a8fe-9de3506dc4bd']
Feb 01 09:57:15 np0005604215.localdomain ceph-mon[298604]: pgmap v306: 177 pgs: 177 active+clean; 233 MiB data, 1004 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 7.3 MiB/s wr, 23 op/s
Feb 01 09:57:15 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:15.583 2 INFO neutron.agent.securitygroups_rpc [None req-55fa4d7c-6084-454f-b0ae-c51e2ec0c52d d1f5486995624e27afb3baf89715ca46 3cb13cb2ee4e4e329cfbfe3e5fc9c8b9 - - default default] Security group member updated ['13bf76e1-cfaa-4be1-a8fe-9de3506dc4bd']
Feb 01 09:57:15 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:15.611 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:15 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:15.794 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8::f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '18', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 10.100.0.3 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '10.100.0.3/28 2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '15', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:57:15 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:15.796 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 01 09:57:15 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:15.799 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:57:15 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:15.800 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ef4ca8cc-b206-4772-9733-2a8d9ad628f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:57:16 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:16.149 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v307: 177 pgs: 177 active+clean; 233 MiB data, 1004 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 7.3 MiB/s wr, 23 op/s
Feb 01 09:57:16 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:16.690 2 INFO neutron.agent.securitygroups_rpc [None req-4b8cba9a-9cba-4ac1-97f8-0546b3fd4da5 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:16 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:16.988 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:17 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:17.358 2 INFO neutron.agent.securitygroups_rpc [None req-0e44897f-42c5-42e0-aa55-3214c5bdaadc e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:17 np0005604215.localdomain ceph-mon[298604]: pgmap v307: 177 pgs: 177 active+clean; 233 MiB data, 1004 MiB used, 41 GiB / 42 GiB avail; 15 KiB/s rd, 7.3 MiB/s wr, 23 op/s
Feb 01 09:57:17 np0005604215.localdomain dnsmasq[310881]: exiting on receipt of SIGTERM
Feb 01 09:57:17 np0005604215.localdomain podman[311362]: 2026-02-01 09:57:17.615945905 +0000 UTC m=+0.057581525 container kill 2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 01 09:57:17 np0005604215.localdomain systemd[1]: libpod-2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254.scope: Deactivated successfully.
Feb 01 09:57:17 np0005604215.localdomain podman[311377]: 2026-02-01 09:57:17.69744025 +0000 UTC m=+0.059517995 container died 2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:57:17 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254-userdata-shm.mount: Deactivated successfully.
Feb 01 09:57:17 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-8b6cde7f4650549dcf14a16d5d08a1cd963e6a89846ce81897519e9e109b5636-merged.mount: Deactivated successfully.
Feb 01 09:57:17 np0005604215.localdomain podman[311377]: 2026-02-01 09:57:17.795807788 +0000 UTC m=+0.157885463 container remove 2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-5bfedfe5-c714-4a56-95b1-3d4c6b3d2f09, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 01 09:57:17 np0005604215.localdomain systemd[1]: libpod-conmon-2e1c2a3c8e99041ece37c80543564f85baab48786e71c097c189b4b153d55254.scope: Deactivated successfully.
Feb 01 09:57:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v308: 177 pgs: 177 active+clean; 353 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 17 MiB/s wr, 59 op/s
Feb 01 09:57:18 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2036505987' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:57:18 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2036505987' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:57:18 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d5bfedfe5\x2dc714\x2d4a56\x2d95b1\x2d3d4c6b3d2f09.mount: Deactivated successfully.
Feb 01 09:57:18 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:18.401 259225 INFO neutron.agent.dhcp.agent [None req-bf851bee-a9f5-4eeb-86d1-8625335eb1bf - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:18 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:18.566 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:18 np0005604215.localdomain dnsmasq[310310]: exiting on receipt of SIGTERM
Feb 01 09:57:18 np0005604215.localdomain podman[311419]: 2026-02-01 09:57:18.960412243 +0000 UTC m=+0.061961611 container kill d5797bd4be489b76ff42f88c466cd274615e933da5e59a8088d05895a4206f61 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dbb3ef-d632-48b0-b256-d985cf33ea92, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:57:18 np0005604215.localdomain systemd[1]: libpod-d5797bd4be489b76ff42f88c466cd274615e933da5e59a8088d05895a4206f61.scope: Deactivated successfully.
Feb 01 09:57:19 np0005604215.localdomain podman[311433]: 2026-02-01 09:57:19.0345454 +0000 UTC m=+0.060402622 container died d5797bd4be489b76ff42f88c466cd274615e933da5e59a8088d05895a4206f61 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dbb3ef-d632-48b0-b256-d985cf33ea92, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 01 09:57:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d5797bd4be489b76ff42f88c466cd274615e933da5e59a8088d05895a4206f61-userdata-shm.mount: Deactivated successfully.
Feb 01 09:57:19 np0005604215.localdomain podman[311433]: 2026-02-01 09:57:19.066647035 +0000 UTC m=+0.092504197 container cleanup d5797bd4be489b76ff42f88c466cd274615e933da5e59a8088d05895a4206f61 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dbb3ef-d632-48b0-b256-d985cf33ea92, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:57:19 np0005604215.localdomain systemd[1]: libpod-conmon-d5797bd4be489b76ff42f88c466cd274615e933da5e59a8088d05895a4206f61.scope: Deactivated successfully.
Feb 01 09:57:19 np0005604215.localdomain podman[311435]: 2026-02-01 09:57:19.113170967 +0000 UTC m=+0.126919645 container remove d5797bd4be489b76ff42f88c466cd274615e933da5e59a8088d05895a4206f61 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c0dbb3ef-d632-48b0-b256-d985cf33ea92, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 09:57:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:19.124 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:19 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:19Z|00183|binding|INFO|Releasing lport 89a67c8f-abeb-44ba-987b-710ed5812b98 from this chassis (sb_readonly=0)
Feb 01 09:57:19 np0005604215.localdomain kernel: device tap89a67c8f-ab left promiscuous mode
Feb 01 09:57:19 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:19Z|00184|binding|INFO|Setting lport 89a67c8f-abeb-44ba-987b-710ed5812b98 down in Southbound
Feb 01 09:57:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:19.136 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::3/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-c0dbb3ef-d632-48b0-b256-d985cf33ea92', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c0dbb3ef-d632-48b0-b256-d985cf33ea92', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1d70c431093044779c88823510311e1a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d7955782-fea5-4e19-bc74-89fb26d9b2eb, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=89a67c8f-abeb-44ba-987b-710ed5812b98) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:57:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:19.139 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 89a67c8f-abeb-44ba-987b-710ed5812b98 in datapath c0dbb3ef-d632-48b0-b256-d985cf33ea92 unbound from our chassis
Feb 01 09:57:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:19.145 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network c0dbb3ef-d632-48b0-b256-d985cf33ea92 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:57:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:19.146 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[a87e5c6d-4853-499b-aa72-9b671424bc60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:57:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:19.149 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:19 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:19.151 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:19 np0005604215.localdomain ceph-mon[298604]: pgmap v308: 177 pgs: 177 active+clean; 353 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 17 MiB/s wr, 59 op/s
Feb 01 09:57:19 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:19.413 259225 INFO neutron.agent.dhcp.agent [None req-dd00effc-1f7e-48f0-9bfe-25253988c234 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:19 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:19.420 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:19 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:19.455 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:19.581 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:19 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-ede7d4779104218ea621e19c563a4f5da37bdff6220daf39da5c830ef37d9d02-merged.mount: Deactivated successfully.
Feb 01 09:57:19 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2dc0dbb3ef\x2dd632\x2d48b0\x2db256\x2dd985cf33ea92.mount: Deactivated successfully.
Feb 01 09:57:20 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:20.057 2 INFO neutron.agent.securitygroups_rpc [None req-66f0be61-daee-4cdf-a282-a2ed1512143e e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e157 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:57:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v309: 177 pgs: 177 active+clean; 353 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 17 MiB/s wr, 55 op/s
Feb 01 09:57:20 np0005604215.localdomain ceph-mon[298604]: pgmap v309: 177 pgs: 177 active+clean; 353 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 17 MiB/s wr, 55 op/s
Feb 01 09:57:20 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:20.498 2 INFO neutron.agent.securitygroups_rpc [None req-a90c741a-39d3-40c8-bb6e-b94dde79eb43 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:20 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:20.955 2 INFO neutron.agent.securitygroups_rpc [None req-6d308bdf-c84d-45b6-96f5-9c77d97fcd46 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 01 09:57:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 01 09:57:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3982317258' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:57:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:21.151 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:57:21
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['.mgr', 'manila_data', 'vms', 'volumes', 'manila_metadata', 'backups', 'images']
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:57:21 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3982317258' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:57:21 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:21.583 2 INFO neutron.agent.securitygroups_rpc [None req-510b41ab-e548-47d9-b4e7-2bdc2eb9aebb 80e349351b8943ebac895c06dc769fa1 09d03f879db542be8bf676bafcc9ce36 - - default default] Security group member updated ['7a11b431-4ecd-4461-a4ec-d66a85649c4d']
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 1.3631525683975433e-06 of space, bias 1.0, pg target 0.0002721761294900428 quantized to 32 (current 32)
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.01918918958217011 of space, bias 1.0, pg target 3.831441519906632 quantized to 32 (current 32)
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.34355806811837e-05 quantized to 32 (current 32)
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 2.7263051367950866e-06 of space, bias 4.0, pg target 0.002137423227247348 quantized to 16 (current 16)
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:57:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:57:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:57:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:57:21 np0005604215.localdomain podman[311461]: 2026-02-01 09:57:21.863641269 +0000 UTC m=+0.077729059 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2)
Feb 01 09:57:21 np0005604215.localdomain podman[311461]: 2026-02-01 09:57:21.90366215 +0000 UTC m=+0.117749940 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 01 09:57:21 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:57:21 np0005604215.localdomain podman[311462]: 2026-02-01 09:57:21.924412152 +0000 UTC m=+0.135302843 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:57:21 np0005604215.localdomain podman[311462]: 2026-02-01 09:57:21.932499653 +0000 UTC m=+0.143390314 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 09:57:21 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:57:22 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:22.008 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v310: 177 pgs: 177 active+clean; 353 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 17 MiB/s wr, 55 op/s
Feb 01 09:57:22 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e158 e158: 6 total, 6 up, 6 in
Feb 01 09:57:22 np0005604215.localdomain ceph-mon[298604]: pgmap v310: 177 pgs: 177 active+clean; 353 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.7 MiB/s rd, 17 MiB/s wr, 55 op/s
Feb 01 09:57:22 np0005604215.localdomain ceph-mon[298604]: osdmap e158: 6 total, 6 up, 6 in
Feb 01 09:57:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e159 e159: 6 total, 6 up, 6 in
Feb 01 09:57:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:57:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:57:23 np0005604215.localdomain podman[311508]: 2026-02-01 09:57:23.881446682 +0000 UTC m=+0.095634085 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1769056855, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, version=9.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 01 09:57:23 np0005604215.localdomain podman[311508]: 2026-02-01 09:57:23.889703928 +0000 UTC m=+0.103891341 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, io.openshift.expose-services=, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, release=1769056855, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers)
Feb 01 09:57:23 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:57:23 np0005604215.localdomain podman[311509]: 2026-02-01 09:57:23.97790971 +0000 UTC m=+0.187774659 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Feb 01 09:57:23 np0005604215.localdomain podman[311509]: 2026-02-01 09:57:23.982952847 +0000 UTC m=+0.192817786 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:57:23 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:57:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v313: 177 pgs: 177 active+clean; 536 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 2.6 MiB/s rd, 35 MiB/s wr, 131 op/s
Feb 01 09:57:24 np0005604215.localdomain ceph-mon[298604]: osdmap e159: 6 total, 6 up, 6 in
Feb 01 09:57:24 np0005604215.localdomain ceph-mon[298604]: pgmap v313: 177 pgs: 177 active+clean; 536 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 2.6 MiB/s rd, 35 MiB/s wr, 131 op/s
Feb 01 09:57:24 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:24.953 2 INFO neutron.agent.securitygroups_rpc [None req-9d70792f-5f72-48f9-b951-877d0761d664 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e159 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:57:25 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:25.273 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:25 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:25.902 2 INFO neutron.agent.securitygroups_rpc [None req-4b206305-83e7-4f57-ba9f-2e24f96d5798 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:26.153 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v314: 177 pgs: 177 active+clean; 536 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 48 KiB/s rd, 20 MiB/s wr, 77 op/s
Feb 01 09:57:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:27.049 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:27 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:27.099 2 INFO neutron.agent.securitygroups_rpc [None req-beee55c0-e969-4dc8-abc8-cdbdc16af93f 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:27 np0005604215.localdomain ceph-mon[298604]: pgmap v314: 177 pgs: 177 active+clean; 536 MiB data, 1.8 GiB used, 40 GiB / 42 GiB avail; 48 KiB/s rd, 20 MiB/s wr, 77 op/s
Feb 01 09:57:27 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/97672389' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:57:28 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:28.143 2 INFO neutron.agent.securitygroups_rpc [None req-b32fd076-8c86-4555-94ea-b4066e09ed5c e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v315: 177 pgs: 177 active+clean; 716 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 4.7 MiB/s rd, 38 MiB/s wr, 170 op/s
Feb 01 09:57:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e160 e160: 6 total, 6 up, 6 in
Feb 01 09:57:29 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:29.099 2 INFO neutron.agent.securitygroups_rpc [None req-2d3763ba-9699-499d-861c-79c864912ba7 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:29 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:29.202 2 INFO neutron.agent.securitygroups_rpc [None req-2d3763ba-9699-499d-861c-79c864912ba7 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:29 np0005604215.localdomain ceph-mon[298604]: pgmap v315: 177 pgs: 177 active+clean; 716 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 4.7 MiB/s rd, 38 MiB/s wr, 170 op/s
Feb 01 09:57:29 np0005604215.localdomain ceph-mon[298604]: osdmap e160: 6 total, 6 up, 6 in
Feb 01 09:57:29 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:29.439 2 INFO neutron.agent.securitygroups_rpc [None req-2958eafe-4222-42a5-8f58-07ed853ce57d e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:29 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:29.550 2 INFO neutron.agent.securitygroups_rpc [None req-dc57259b-517e-469b-bbab-e176f8bbf6e4 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:57:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:57:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:57:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 159004 "" "Go-http-client/1.1"
Feb 01 09:57:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:57:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19245 "" "Go-http-client/1.1"
Feb 01 09:57:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e160 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:57:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v317: 177 pgs: 177 active+clean; 716 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 4.7 MiB/s rd, 18 MiB/s wr, 96 op/s
Feb 01 09:57:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e161 e161: 6 total, 6 up, 6 in
Feb 01 09:57:30 np0005604215.localdomain ceph-mon[298604]: pgmap v317: 177 pgs: 177 active+clean; 716 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 4.7 MiB/s rd, 18 MiB/s wr, 96 op/s
Feb 01 09:57:30 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:30.498 2 INFO neutron.agent.securitygroups_rpc [None req-a9710b04-3715-469f-ada4-600e33182b7e 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:30 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:30.534 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:31.155 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: osdmap e161: 6 total, 6 up, 6 in
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #31. Immutable memtables: 0.
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.440543) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 15] Flushing memtable with next log file: 31
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851440574, "job": 15, "event": "flush_started", "num_memtables": 1, "num_entries": 2620, "num_deletes": 264, "total_data_size": 4027248, "memory_usage": 4098752, "flush_reason": "Manual Compaction"}
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 15] Level-0 flush table #32: started
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e162 e162: 6 total, 6 up, 6 in
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851459069, "cf_name": "default", "job": 15, "event": "table_file_creation", "file_number": 32, "file_size": 2595391, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 19778, "largest_seqno": 22393, "table_properties": {"data_size": 2585328, "index_size": 6440, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2565, "raw_key_size": 21714, "raw_average_key_size": 21, "raw_value_size": 2564997, "raw_average_value_size": 2562, "num_data_blocks": 273, "num_entries": 1001, "num_filter_entries": 1001, "num_deletions": 264, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939696, "oldest_key_time": 1769939696, "file_creation_time": 1769939851, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 32, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 15] Flush lasted 18623 microseconds, and 7223 cpu microseconds.
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.459157) [db/flush_job.cc:967] [default] [JOB 15] Level-0 flush table #32: 2595391 bytes OK
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.459186) [db/memtable_list.cc:519] [default] Level-0 commit table #32 started
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.461181) [db/memtable_list.cc:722] [default] Level-0 commit table #32: memtable #1 done
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.461206) EVENT_LOG_v1 {"time_micros": 1769939851461200, "job": 15, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.461229) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 15] Try to delete WAL files size 4015460, prev total WAL file size 4015501, number of live WAL files 2.
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000028.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.462735) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003131373937' seq:72057594037927935, type:22 .. '7061786F73003132303439' seq:0, type:0; will stop at (end)
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 16] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 15 Base level 0, inputs: [32(2534KB)], [30(18MB)]
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851462804, "job": 16, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [32], "files_L6": [30], "score": -1, "input_data_size": 21712062, "oldest_snapshot_seqno": -1}
Feb 01 09:57:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:57:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:57:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:57:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:57:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:57:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 16] Generated table #33: 12653 keys, 19460804 bytes, temperature: kUnknown
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851588949, "cf_name": "default", "job": 16, "event": "table_file_creation", "file_number": 33, "file_size": 19460804, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 19389720, "index_size": 38343, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 31685, "raw_key_size": 341153, "raw_average_key_size": 26, "raw_value_size": 19175060, "raw_average_value_size": 1515, "num_data_blocks": 1437, "num_entries": 12653, "num_filter_entries": 12653, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769939851, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 33, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.589275) [db/compaction/compaction_job.cc:1663] [default] [JOB 16] Compacted 1@0 + 1@6 files to L6 => 19460804 bytes
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.592280) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 172.0 rd, 154.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.5, 18.2 +0.0 blob) out(18.6 +0.0 blob), read-write-amplify(15.9) write-amplify(7.5) OK, records in: 13196, records dropped: 543 output_compression: NoCompression
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.592347) EVENT_LOG_v1 {"time_micros": 1769939851592335, "job": 16, "event": "compaction_finished", "compaction_time_micros": 126214, "compaction_time_cpu_micros": 54284, "output_level": 6, "num_output_files": 1, "total_output_size": 19460804, "num_input_records": 13196, "num_output_records": 12653, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000032.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851592861, "job": 16, "event": "table_file_deletion", "file_number": 32}
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000030.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939851595953, "job": 16, "event": "table_file_deletion", "file_number": 30}
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.462428) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.596370) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.596376) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.596379) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.596382) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:57:31 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:57:31.596386) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:57:31 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:57:31 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:31.802 2 INFO neutron.agent.securitygroups_rpc [None req-c82c049d-951f-4fb2-91dc-f79774ee784c e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:31 np0005604215.localdomain systemd[1]: tmp-crun.KxU2t5.mount: Deactivated successfully.
Feb 01 09:57:31 np0005604215.localdomain podman[311544]: 2026-02-01 09:57:31.883892176 +0000 UTC m=+0.097534763 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 09:57:31 np0005604215.localdomain podman[311544]: 2026-02-01 09:57:31.89434684 +0000 UTC m=+0.107989387 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, 
org.label-schema.build-date=20260127)
Feb 01 09:57:31 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:57:31 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:31.940 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:57:31 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:31.942 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:57:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:31.981 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:57:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3454001065' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:57:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:57:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3454001065' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:57:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:32.053 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v320: 177 pgs: 177 active+clean; 716 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 6.2 MiB/s rd, 24 MiB/s wr, 125 op/s
Feb 01 09:57:32 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:32.336 2 INFO neutron.agent.securitygroups_rpc [None req-ae637900-6d4f-4914-b5d3-eacda6bf763f e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:32 np0005604215.localdomain ceph-mon[298604]: osdmap e162: 6 total, 6 up, 6 in
Feb 01 09:57:32 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3454001065' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:57:32 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3454001065' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:57:32 np0005604215.localdomain ceph-mon[298604]: pgmap v320: 177 pgs: 177 active+clean; 716 MiB data, 2.2 GiB used, 40 GiB / 42 GiB avail; 6.2 MiB/s rd, 24 MiB/s wr, 125 op/s
Feb 01 09:57:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e163 e163: 6 total, 6 up, 6 in
Feb 01 09:57:33 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1595370341' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:57:33 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1595370341' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:57:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v322: 177 pgs: 177 active+clean; 838 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 1.1 MiB/s rd, 26 MiB/s wr, 156 op/s
Feb 01 09:57:34 np0005604215.localdomain ceph-mon[298604]: osdmap e163: 6 total, 6 up, 6 in
Feb 01 09:57:34 np0005604215.localdomain ceph-mon[298604]: pgmap v322: 177 pgs: 177 active+clean; 838 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 1.1 MiB/s rd, 26 MiB/s wr, 156 op/s
Feb 01 09:57:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e163 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:57:35 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:35.132 259225 INFO neutron.agent.linux.ip_lib [None req-7b00ed42-b72e-4f66-b893-28b4bb7bf099 - - - - - -] Device tapff2b3531-69 cannot be used as it has no MAC address
Feb 01 09:57:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:35.153 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:35 np0005604215.localdomain kernel: device tapff2b3531-69 entered promiscuous mode
Feb 01 09:57:35 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939855.1620] manager: (tapff2b3531-69): new Generic device (/org/freedesktop/NetworkManager/Devices/36)
Feb 01 09:57:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:35.162 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:35 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:35Z|00185|binding|INFO|Claiming lport ff2b3531-69db-424e-a495-69e43824d008 for this chassis.
Feb 01 09:57:35 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:35Z|00186|binding|INFO|ff2b3531-69db-424e-a495-69e43824d008: Claiming unknown
Feb 01 09:57:35 np0005604215.localdomain systemd-udevd[311573]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:57:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:35.174 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-c94cfbe2-b38a-4f65-b5ff-344bf4929a50', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c94cfbe2-b38a-4f65-b5ff-344bf4929a50', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=049a24bb-789a-44cb-8aa4-57bf18fabc72, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=ff2b3531-69db-424e-a495-69e43824d008) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:57:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:35.176 158655 INFO neutron.agent.ovn.metadata.agent [-] Port ff2b3531-69db-424e-a495-69e43824d008 in datapath c94cfbe2-b38a-4f65-b5ff-344bf4929a50 bound to our chassis
Feb 01 09:57:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:35.180 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port 153aae87-7053-41f9-b4ef-7b79c315171f IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:57:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:35.180 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c94cfbe2-b38a-4f65-b5ff-344bf4929a50, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:57:35 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:35.184 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[85f0c8b0-e080-43fe-800d-43d4aa2ee05b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:57:35 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:35Z|00187|binding|INFO|Setting lport ff2b3531-69db-424e-a495-69e43824d008 ovn-installed in OVS
Feb 01 09:57:35 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:35Z|00188|binding|INFO|Setting lport ff2b3531-69db-424e-a495-69e43824d008 up in Southbound
Feb 01 09:57:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:35.208 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:35.243 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:35.277 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2965659129' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:57:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2965659129' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:57:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:57:35 np0005604215.localdomain podman[311606]: 2026-02-01 09:57:35.91210386 +0000 UTC m=+0.116545081 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:57:35 np0005604215.localdomain podman[311606]: 2026-02-01 09:57:35.929498919 +0000 UTC m=+0.133940150 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:57:35 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:57:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:36.157 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:36 np0005604215.localdomain podman[311648]: 
Feb 01 09:57:36 np0005604215.localdomain podman[311648]: 2026-02-01 09:57:36.233878301 +0000 UTC m=+0.088980669 container create 2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c94cfbe2-b38a-4f65-b5ff-344bf4929a50, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:57:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v323: 177 pgs: 177 active+clean; 838 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 1.1 MiB/s rd, 26 MiB/s wr, 153 op/s
Feb 01 09:57:36 np0005604215.localdomain systemd[1]: Started libpod-conmon-2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb.scope.
Feb 01 09:57:36 np0005604215.localdomain podman[311648]: 2026-02-01 09:57:36.190536078 +0000 UTC m=+0.045638496 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:57:36 np0005604215.localdomain systemd[1]: tmp-crun.oKGqmN.mount: Deactivated successfully.
Feb 01 09:57:36 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:57:36 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/be4dd53ceceb959b546f67efa71b185264c4777498bb167cd87c7cd7f3c9fe8d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:57:36 np0005604215.localdomain podman[311648]: 2026-02-01 09:57:36.340891976 +0000 UTC m=+0.195994364 container init 2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c94cfbe2-b38a-4f65-b5ff-344bf4929a50, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:57:36 np0005604215.localdomain podman[311648]: 2026-02-01 09:57:36.351174135 +0000 UTC m=+0.206276523 container start 2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c94cfbe2-b38a-4f65-b5ff-344bf4929a50, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 01 09:57:36 np0005604215.localdomain dnsmasq[311666]: started, version 2.85 cachesize 150
Feb 01 09:57:36 np0005604215.localdomain dnsmasq[311666]: DNS service limited to local subnets
Feb 01 09:57:36 np0005604215.localdomain dnsmasq[311666]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:57:36 np0005604215.localdomain dnsmasq[311666]: warning: no upstream servers configured
Feb 01 09:57:36 np0005604215.localdomain dnsmasq-dhcp[311666]: DHCP, static leases only on 10.101.0.0, lease time 1d
Feb 01 09:57:36 np0005604215.localdomain dnsmasq[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/addn_hosts - 0 addresses
Feb 01 09:57:36 np0005604215.localdomain dnsmasq-dhcp[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/host
Feb 01 09:57:36 np0005604215.localdomain dnsmasq-dhcp[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/opts
Feb 01 09:57:36 np0005604215.localdomain ceph-mon[298604]: pgmap v323: 177 pgs: 177 active+clean; 838 MiB data, 2.7 GiB used, 39 GiB / 42 GiB avail; 1.1 MiB/s rd, 26 MiB/s wr, 153 op/s
Feb 01 09:57:36 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:36.549 259225 INFO neutron.agent.dhcp.agent [None req-6ee81a67-a201-4f64-a965-657c7a7e324f - - - - - -] DHCP configuration for ports {'08753108-79e1-436c-9b23-2aa988e503fa'} is completed
Feb 01 09:57:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e164 e164: 6 total, 6 up, 6 in
Feb 01 09:57:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e165 e165: 6 total, 6 up, 6 in
Feb 01 09:57:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:37.086 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:37 np0005604215.localdomain ceph-mon[298604]: osdmap e164: 6 total, 6 up, 6 in
Feb 01 09:57:37 np0005604215.localdomain ceph-mon[298604]: osdmap e165: 6 total, 6 up, 6 in
Feb 01 09:57:37 np0005604215.localdomain sudo[311667]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:57:37 np0005604215.localdomain sudo[311667]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:57:37 np0005604215.localdomain sudo[311667]: pam_unix(sudo:session): session closed for user root
Feb 01 09:57:37 np0005604215.localdomain sudo[311685]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:57:37 np0005604215.localdomain sudo[311685]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:57:37 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:37.814 2 INFO neutron.agent.securitygroups_rpc [None req-0302dcee-a94e-448d-b9a1-97eb07e05bc2 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:38 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:38.257 2 INFO neutron.agent.securitygroups_rpc [None req-56c81e71-4e5b-4ed6-b56f-ca0ee463b60f 388100543d2c4f8fb0150ffdd8da2504 674a59d5810c453484339f60db55c64e - - default default] Security group member updated ['41d73aa2-6075-4985-b34c-e67fa66518ee']
Feb 01 09:57:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v326: 177 pgs: 177 active+clean; 889 MiB data, 3.0 GiB used, 39 GiB / 42 GiB avail; 1.1 MiB/s rd, 50 MiB/s wr, 264 op/s
Feb 01 09:57:38 np0005604215.localdomain sudo[311685]: pam_unix(sudo:session): session closed for user root
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:57:38 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 5dae15b9-7e9f-47ed-b9ed-103090603098 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:57:38 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 5dae15b9-7e9f-47ed-b9ed-103090603098 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:57:38 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 5dae15b9-7e9f-47ed-b9ed-103090603098 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2288622633' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2288622633' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: pgmap v326: 177 pgs: 177 active+clean; 889 MiB data, 3.0 GiB used, 39 GiB / 42 GiB avail; 1.1 MiB/s rd, 50 MiB/s wr, 264 op/s
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:57:38 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:57:38 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:38.817 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:38Z, description=, device_id=a4c0ff24-9b72-4e78-ad9f-fbd408c26d38, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323505e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323504f0>], id=1096bc33-0247-4180-b3cc-295157fa16a5, ip_allocation=immediate, mac_address=fa:16:3e:21:3b:a6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:57:29Z, description=, dns_domain=, id=c94cfbe2-b38a-4f65-b5ff-344bf4929a50, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-482542227, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4706, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2209, status=ACTIVE, subnets=['7d86d575-ccf2-4403-beee-fe491e92869a'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:57:33Z, vlan_transparent=None, network_id=c94cfbe2-b38a-4f65-b5ff-344bf4929a50, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2273, status=DOWN, tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:57:38Z on network c94cfbe2-b38a-4f65-b5ff-344bf4929a50
Feb 01 09:57:38 np0005604215.localdomain sudo[311736]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:57:38 np0005604215.localdomain sudo[311736]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:57:38 np0005604215.localdomain sudo[311736]: pam_unix(sudo:session): session closed for user root
Feb 01 09:57:39 np0005604215.localdomain systemd[1]: tmp-crun.hgXKhx.mount: Deactivated successfully.
Feb 01 09:57:39 np0005604215.localdomain podman[311771]: 2026-02-01 09:57:39.095984423 +0000 UTC m=+0.060459025 container kill 2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c94cfbe2-b38a-4f65-b5ff-344bf4929a50, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:57:39 np0005604215.localdomain dnsmasq[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/addn_hosts - 1 addresses
Feb 01 09:57:39 np0005604215.localdomain dnsmasq-dhcp[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/host
Feb 01 09:57:39 np0005604215.localdomain dnsmasq-dhcp[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/opts
Feb 01 09:57:39 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:39.302 259225 INFO neutron.agent.dhcp.agent [None req-1fb6d6cd-53c0-4b42-84c7-340330e81d56 - - - - - -] DHCP configuration for ports {'1096bc33-0247-4180-b3cc-295157fa16a5'} is completed
Feb 01 09:57:39 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:39.361 2 INFO neutron.agent.securitygroups_rpc [None req-e6670585-76cd-441d-8f68-6f14a6f35b07 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:39 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:57:39 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:57:39 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:39.699 2 INFO neutron.agent.securitygroups_rpc [None req-d6e67e7e-f7e2-4516-bf2c-7113ac674e15 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:39 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:39.944 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:57:40 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:40.097 2 INFO neutron.agent.securitygroups_rpc [None req-1328bdc5-e5a5-40ba-b48b-34d65147d68f afad352e9d664799bf5de0cadcf3c7cd ff200d66c230435098f5a0489bf1e8f7 - - default default] Security group rule updated ['95400daf-a74d-4007-ac5f-e79aa8e5c1cd']
Feb 01 09:57:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e165 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:57:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v327: 177 pgs: 177 active+clean; 889 MiB data, 3.0 GiB used, 39 GiB / 42 GiB avail; 65 KiB/s rd, 21 MiB/s wr, 97 op/s
Feb 01 09:57:40 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:40.520 2 INFO neutron.agent.securitygroups_rpc [None req-a0493fd0-d741-481f-ad73-307c48cc986a afad352e9d664799bf5de0cadcf3c7cd ff200d66c230435098f5a0489bf1e8f7 - - default default] Security group rule updated ['95400daf-a74d-4007-ac5f-e79aa8e5c1cd']
Feb 01 09:57:40 np0005604215.localdomain ceph-mon[298604]: pgmap v327: 177 pgs: 177 active+clean; 889 MiB data, 3.0 GiB used, 39 GiB / 42 GiB avail; 65 KiB/s rd, 21 MiB/s wr, 97 op/s
Feb 01 09:57:40 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:40.790 2 INFO neutron.agent.securitygroups_rpc [None req-ffd61713-cff0-4f50-b6e4-2930ab2a8c56 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:41.196 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:41 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:41.550 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:38Z, description=, device_id=a4c0ff24-9b72-4e78-ad9f-fbd408c26d38, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032298370>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322980d0>], id=1096bc33-0247-4180-b3cc-295157fa16a5, ip_allocation=immediate, mac_address=fa:16:3e:21:3b:a6, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:57:29Z, description=, dns_domain=, id=c94cfbe2-b38a-4f65-b5ff-344bf4929a50, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-482542227, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=4706, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2209, status=ACTIVE, subnets=['7d86d575-ccf2-4403-beee-fe491e92869a'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:57:33Z, vlan_transparent=None, network_id=c94cfbe2-b38a-4f65-b5ff-344bf4929a50, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2273, status=DOWN, tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:57:38Z on network c94cfbe2-b38a-4f65-b5ff-344bf4929a50
Feb 01 09:57:41 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:57:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:57:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e166 e166: 6 total, 6 up, 6 in
Feb 01 09:57:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:41.774 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:57:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:41.775 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:57:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:41.775 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:57:41 np0005604215.localdomain systemd[1]: tmp-crun.9znZ3y.mount: Deactivated successfully.
Feb 01 09:57:41 np0005604215.localdomain dnsmasq[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/addn_hosts - 1 addresses
Feb 01 09:57:41 np0005604215.localdomain dnsmasq-dhcp[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/host
Feb 01 09:57:41 np0005604215.localdomain dnsmasq-dhcp[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/opts
Feb 01 09:57:41 np0005604215.localdomain podman[311808]: 2026-02-01 09:57:41.793819015 +0000 UTC m=+0.082553059 container kill 2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c94cfbe2-b38a-4f65-b5ff-344bf4929a50, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:57:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:42.088 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:42 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:42.103 259225 INFO neutron.agent.dhcp.agent [None req-b7df7686-a057-4d07-b6dc-0a5bd98054fa - - - - - -] DHCP configuration for ports {'1096bc33-0247-4180-b3cc-295157fa16a5'} is completed
Feb 01 09:57:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v329: 177 pgs: 177 active+clean; 889 MiB data, 3.0 GiB used, 39 GiB / 42 GiB avail; 73 KiB/s rd, 24 MiB/s wr, 110 op/s
Feb 01 09:57:42 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:42.270 2 INFO neutron.agent.securitygroups_rpc [None req-cf10ff71-4763-464e-9c85-a62c4de0813a 388100543d2c4f8fb0150ffdd8da2504 674a59d5810c453484339f60db55c64e - - default default] Security group member updated ['41d73aa2-6075-4985-b34c-e67fa66518ee']
Feb 01 09:57:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:57:42 np0005604215.localdomain ceph-mon[298604]: osdmap e166: 6 total, 6 up, 6 in
Feb 01 09:57:42 np0005604215.localdomain ceph-mon[298604]: pgmap v329: 177 pgs: 177 active+clean; 889 MiB data, 3.0 GiB used, 39 GiB / 42 GiB avail; 73 KiB/s rd, 24 MiB/s wr, 110 op/s
Feb 01 09:57:43 np0005604215.localdomain dnsmasq[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/addn_hosts - 0 addresses
Feb 01 09:57:43 np0005604215.localdomain dnsmasq-dhcp[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/host
Feb 01 09:57:43 np0005604215.localdomain dnsmasq-dhcp[311666]: read /var/lib/neutron/dhcp/c94cfbe2-b38a-4f65-b5ff-344bf4929a50/opts
Feb 01 09:57:43 np0005604215.localdomain podman[311843]: 2026-02-01 09:57:43.879343545 +0000 UTC m=+0.051361302 container kill 2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c94cfbe2-b38a-4f65-b5ff-344bf4929a50, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 01 09:57:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:44.060 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:44 np0005604215.localdomain kernel: device tapff2b3531-69 left promiscuous mode
Feb 01 09:57:44 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:44Z|00189|binding|INFO|Releasing lport ff2b3531-69db-424e-a495-69e43824d008 from this chassis (sb_readonly=0)
Feb 01 09:57:44 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:44Z|00190|binding|INFO|Setting lport ff2b3531-69db-424e-a495-69e43824d008 down in Southbound
Feb 01 09:57:44 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:44.068 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.101.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-c94cfbe2-b38a-4f65-b5ff-344bf4929a50', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c94cfbe2-b38a-4f65-b5ff-344bf4929a50', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=049a24bb-789a-44cb-8aa4-57bf18fabc72, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=ff2b3531-69db-424e-a495-69e43824d008) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:57:44 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:44.070 158655 INFO neutron.agent.ovn.metadata.agent [-] Port ff2b3531-69db-424e-a495-69e43824d008 in datapath c94cfbe2-b38a-4f65-b5ff-344bf4929a50 unbound from our chassis
Feb 01 09:57:44 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:44.074 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c94cfbe2-b38a-4f65-b5ff-344bf4929a50, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:57:44 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:44.075 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[f5ae4823-d30e-4b8d-8808-997cf4175e62]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:57:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:44.083 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v330: 177 pgs: 177 active+clean; 1.0 GiB data, 3.4 GiB used, 39 GiB / 42 GiB avail; 69 KiB/s rd, 36 MiB/s wr, 107 op/s
Feb 01 09:57:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e166 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:57:45 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:45.133 2 INFO neutron.agent.securitygroups_rpc [None req-1b4acf80-44e4-4b72-89ee-2b772b9e0127 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:45 np0005604215.localdomain ceph-mon[298604]: pgmap v330: 177 pgs: 177 active+clean; 1.0 GiB data, 3.4 GiB used, 39 GiB / 42 GiB avail; 69 KiB/s rd, 36 MiB/s wr, 107 op/s
Feb 01 09:57:45 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:45.712 2 INFO neutron.agent.securitygroups_rpc [None req-83cc4c41-7e96-4a99-8728-f0acec1b6354 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:46.198 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v331: 177 pgs: 177 active+clean; 1.0 GiB data, 3.4 GiB used, 39 GiB / 42 GiB avail; 55 KiB/s rd, 29 MiB/s wr, 86 op/s
Feb 01 09:57:46 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3106067247' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:57:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:47.147 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:47 np0005604215.localdomain systemd[1]: tmp-crun.kYsI3s.mount: Deactivated successfully.
Feb 01 09:57:47 np0005604215.localdomain dnsmasq[311666]: exiting on receipt of SIGTERM
Feb 01 09:57:47 np0005604215.localdomain podman[311882]: 2026-02-01 09:57:47.243827533 +0000 UTC m=+0.078386210 container kill 2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c94cfbe2-b38a-4f65-b5ff-344bf4929a50, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:57:47 np0005604215.localdomain systemd[1]: libpod-2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb.scope: Deactivated successfully.
Feb 01 09:57:47 np0005604215.localdomain podman[311896]: 2026-02-01 09:57:47.316871256 +0000 UTC m=+0.053368174 container died 2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c94cfbe2-b38a-4f65-b5ff-344bf4929a50, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS)
Feb 01 09:57:47 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:47.323 2 INFO neutron.agent.securitygroups_rpc [req-501f4cee-4307-45db-b319-f1b9ce6bf1c9 req-e1bd36d6-c912-4d1e-9e97-4f42b13c68e6 afad352e9d664799bf5de0cadcf3c7cd ff200d66c230435098f5a0489bf1e8f7 - - default default] Security group member updated ['95400daf-a74d-4007-ac5f-e79aa8e5c1cd']
Feb 01 09:57:47 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb-userdata-shm.mount: Deactivated successfully.
Feb 01 09:57:47 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:47.366 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:46Z, description=, device_id=125e04fe-9d17-4c49-90a0-ac05d2f548c1, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322aa580>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322a63d0>], id=5da6f481-393c-409c-8dea-40614079a5c1, ip_allocation=immediate, mac_address=fa:16:3e:74:df:91, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:56:48Z, description=, dns_domain=, id=c3e71f40-156c-4217-bedf-836f04a8f728, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-VolumesBackupsTest-2085708237-network, port_security_enabled=True, project_id=ff200d66c230435098f5a0489bf1e8f7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=55349, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2036, status=ACTIVE, subnets=['098397c5-98ca-4cc3-a654-3c1e4a604734'], tags=[], tenant_id=ff200d66c230435098f5a0489bf1e8f7, updated_at=2026-02-01T09:56:50Z, vlan_transparent=None, network_id=c3e71f40-156c-4217-bedf-836f04a8f728, port_security_enabled=True, project_id=ff200d66c230435098f5a0489bf1e8f7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['95400daf-a74d-4007-ac5f-e79aa8e5c1cd'], standard_attr_id=2322, status=DOWN, tags=[], tenant_id=ff200d66c230435098f5a0489bf1e8f7, updated_at=2026-02-01T09:57:47Z on network c3e71f40-156c-4217-bedf-836f04a8f728
Feb 01 09:57:47 np0005604215.localdomain ceph-mon[298604]: pgmap v331: 177 pgs: 177 active+clean; 1.0 GiB data, 3.4 GiB used, 39 GiB / 42 GiB avail; 55 KiB/s rd, 29 MiB/s wr, 86 op/s
Feb 01 09:57:47 np0005604215.localdomain podman[311896]: 2026-02-01 09:57:47.389626691 +0000 UTC m=+0.126123569 container remove 2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c94cfbe2-b38a-4f65-b5ff-344bf4929a50, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:57:47 np0005604215.localdomain systemd[1]: libpod-conmon-2b51db5aee8e443f525a345d92130e93dd36d8776f37c21e9b109f623ca160cb.scope: Deactivated successfully.
Feb 01 09:57:47 np0005604215.localdomain dnsmasq[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/addn_hosts - 2 addresses
Feb 01 09:57:47 np0005604215.localdomain podman[311940]: 2026-02-01 09:57:47.725853489 +0000 UTC m=+0.063315053 container kill 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 09:57:47 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/host
Feb 01 09:57:47 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/opts
Feb 01 09:57:47 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:47.988 259225 INFO neutron.agent.dhcp.agent [None req-93010f84-a8da-46e7-b2dd-7877f51a21a5 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:48 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:48.003 259225 INFO neutron.agent.dhcp.agent [None req-0899b159-fde9-485f-bebc-e4248be0beac - - - - - -] DHCP configuration for ports {'5da6f481-393c-409c-8dea-40614079a5c1'} is completed
Feb 01 09:57:48 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-be4dd53ceceb959b546f67efa71b185264c4777498bb167cd87c7cd7f3c9fe8d-merged.mount: Deactivated successfully.
Feb 01 09:57:48 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2dc94cfbe2\x2db38a\x2d4f65\x2db5ff\x2d344bf4929a50.mount: Deactivated successfully.
Feb 01 09:57:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v332: 177 pgs: 177 active+clean; 1.2 GiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 27 KiB/s rd, 29 MiB/s wr, 50 op/s
Feb 01 09:57:48 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:48.332 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e167 e167: 6 total, 6 up, 6 in
Feb 01 09:57:48 np0005604215.localdomain ceph-mon[298604]: pgmap v332: 177 pgs: 177 active+clean; 1.2 GiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 27 KiB/s rd, 29 MiB/s wr, 50 op/s
Feb 01 09:57:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:48.857 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:49 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:49.050 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=np0005604213.localdomain, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:57:46Z, description=, device_id=125e04fe-9d17-4c49-90a0-ac05d2f548c1, device_owner=compute:nova, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323358e0>], dns_domain=, dns_name=tempest-volumesbackupstest-instance-1295046314, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f003347bdc0>], id=5da6f481-393c-409c-8dea-40614079a5c1, ip_allocation=immediate, mac_address=fa:16:3e:74:df:91, name=, network_id=c3e71f40-156c-4217-bedf-836f04a8f728, port_security_enabled=True, project_id=ff200d66c230435098f5a0489bf1e8f7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['95400daf-a74d-4007-ac5f-e79aa8e5c1cd'], standard_attr_id=2322, status=DOWN, tags=[], tenant_id=ff200d66c230435098f5a0489bf1e8f7, updated_at=2026-02-01T09:57:48Z on network c3e71f40-156c-4217-bedf-836f04a8f728
Feb 01 09:57:49 np0005604215.localdomain systemd[1]: tmp-crun.nUhgxs.mount: Deactivated successfully.
Feb 01 09:57:49 np0005604215.localdomain dnsmasq[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/addn_hosts - 2 addresses
Feb 01 09:57:49 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/host
Feb 01 09:57:49 np0005604215.localdomain podman[311978]: 2026-02-01 09:57:49.28857796 +0000 UTC m=+0.074370765 container kill 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 01 09:57:49 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/opts
Feb 01 09:57:49 np0005604215.localdomain ceph-mon[298604]: osdmap e167: 6 total, 6 up, 6 in
Feb 01 09:57:49 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:49.528 259225 INFO neutron.agent.dhcp.agent [None req-c34df685-afcd-46cf-ae8c-06988204d5f1 - - - - - -] DHCP configuration for ports {'5da6f481-393c-409c-8dea-40614079a5c1'} is completed
Feb 01 09:57:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 09:57:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Cumulative writes: 2225 writes, 22K keys, 2225 commit groups, 1.0 writes per commit group, ingest: 0.04 GB, 0.07 MB/s
                                                           Cumulative WAL: 2225 writes, 2225 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2225 writes, 22K keys, 2225 commit groups, 1.0 writes per commit group, ingest: 41.36 MB, 0.07 MB/s
                                                           Interval WAL: 2225 writes, 2225 syncs, 1.00 writes per sync, written: 0.04 GB, 0.07 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    197.5      0.16              0.07         8    0.020       0      0       0.0       0.0
                                                             L6      1/0   18.56 MB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   4.4    167.5    152.7      0.93              0.36         7    0.133     89K   3435       0.0       0.0
                                                            Sum      1/0   18.56 MB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   5.4    142.5    159.4      1.09              0.42        15    0.073     89K   3435       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.2      0.0       0.0   5.4    142.8    159.8      1.09              0.42        14    0.078     89K   3435       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.1      0.0       0.0   0.0    167.5    152.7      0.93              0.36         7    0.133     89K   3435       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    200.5      0.16              0.07         7    0.023       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 600.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.032, interval 0.032
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.17 GB write, 0.29 MB/s write, 0.15 GB read, 0.26 MB/s read, 1.1 seconds
                                                           Interval compaction: 0.17 GB write, 0.29 MB/s write, 0.15 GB read, 0.26 MB/s read, 1.1 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x562ae85ff1f0#2 capacity: 308.00 MB usage: 15.27 MB table_size: 0 occupancy: 18446744073709551615 collections: 2 last_copies: 0 last_secs: 0.000121 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(682,14.67 MB,4.76337%) FilterBlock(15,267.42 KB,0.0847903%) IndexBlock(15,350.02 KB,0.110978%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 01 09:57:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:50.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:57:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e167 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 322961408
Feb 01 09:57:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v334: 177 pgs: 177 active+clean; 1.2 GiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 32 KiB/s rd, 34 MiB/s wr, 59 op/s
Feb 01 09:57:50 np0005604215.localdomain ceph-mon[298604]: pgmap v334: 177 pgs: 177 active+clean; 1.2 GiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 32 KiB/s rd, 34 MiB/s wr, 59 op/s
Feb 01 09:57:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e168 e168: 6 total, 6 up, 6 in
Feb 01 09:57:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:50.818 2 INFO neutron.agent.securitygroups_rpc [None req-c7462549-dc03-49f9-bdc5-6b707b180a08 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:51.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:57:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:51.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:57:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:51.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:57:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:51.120 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:57:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:51.200 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:57:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:57:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:57:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:57:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e169 e169: 6 total, 6 up, 6 in
Feb 01 09:57:51 np0005604215.localdomain ceph-mon[298604]: osdmap e168: 6 total, 6 up, 6 in
Feb 01 09:57:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3528299120' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:57:51 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:51.496 2 INFO neutron.agent.securitygroups_rpc [None req-f8e6f518-1a93-44d5-bed3-15bc3df0d353 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:57:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:57:51 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:51.513 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:51 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:51.767 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8:0:1:f816:3eff:fe29:3cbc'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '30', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6df31c80-e655-4133-9c32-9708470a03c7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=d4bc4012-7c81-4a7f-9a67-f9545d549873) old=Port_Binding(mac=['fa:16:3e:29:3c:bc 2001:db8::f816:3eff:fe29:3cbc'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe29:3cbc/64', 'neutron:device_id': 'ovnmeta-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-cba39058-6a05-4f77-add1-57334b728a66', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fe5c9037c1c44846b3c840cd81d7f177', 'neutron:revision_number': '28', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) 
matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:57:51 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:51.769 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port d4bc4012-7c81-4a7f-9a67-f9545d549873 in datapath cba39058-6a05-4f77-add1-57334b728a66 updated
Feb 01 09:57:51 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:51.773 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network cba39058-6a05-4f77-add1-57334b728a66, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:57:51 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:51.774 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[73f3040c-27cb-441c-9ee8-b680be735486]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:57:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:57:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3426990219' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:57:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:57:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3426990219' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:57:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:52.171 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v337: 177 pgs: 177 active+clean; 1.2 GiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 30 KiB/s rd, 26 MiB/s wr, 56 op/s
Feb 01 09:57:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3411230727' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:57:52 np0005604215.localdomain ceph-mon[298604]: osdmap e169: 6 total, 6 up, 6 in
Feb 01 09:57:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3426990219' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:57:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3426990219' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:57:52 np0005604215.localdomain ceph-mon[298604]: pgmap v337: 177 pgs: 177 active+clean; 1.2 GiB data, 3.8 GiB used, 38 GiB / 42 GiB avail; 30 KiB/s rd, 26 MiB/s wr, 56 op/s
Feb 01 09:57:52 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:52.592 2 INFO neutron.agent.securitygroups_rpc [None req-78a64591-c841-4e75-af76-a7af0cedc758 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:57:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:57:52 np0005604215.localdomain systemd[1]: tmp-crun.xUARQD.mount: Deactivated successfully.
Feb 01 09:57:52 np0005604215.localdomain podman[311999]: 2026-02-01 09:57:52.887221524 +0000 UTC m=+0.090589308 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:57:52 np0005604215.localdomain podman[311998]: 2026-02-01 09:57:52.854363776 +0000 UTC m=+0.063297633 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, io.buildah.version=1.41.3)
Feb 01 09:57:52 np0005604215.localdomain podman[311999]: 2026-02-01 09:57:52.924169359 +0000 UTC m=+0.127537123 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:57:52 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:57:52 np0005604215.localdomain podman[311998]: 2026-02-01 09:57:52.939669089 +0000 UTC m=+0.148602966 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 01 09:57:52 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.118 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.118 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.119 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.119 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.120 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:57:53 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:53.415 2 INFO neutron.agent.securitygroups_rpc [None req-ff2643e0-e24a-4f9b-a883-5340b9397f69 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:57:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3624975071' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.543 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.424s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:57:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3624975071' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:57:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e170 e170: 6 total, 6 up, 6 in
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.748 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.751 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11628MB free_disk=41.774723052978516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.752 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.754 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.820 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.821 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:57:53 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:53.825 2 INFO neutron.agent.securitygroups_rpc [None req-865e7db1-f37b-4e27-b7e7-fae9537a70ac 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.842 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 09:57:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:53.846 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.861 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.861 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.875 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 09:57:53 np0005604215.localdomain podman[312084]: 2026-02-01 09:57:53.908117477 +0000 UTC m=+0.061450356 container kill 6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:57:53 np0005604215.localdomain dnsmasq[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/addn_hosts - 0 addresses
Feb 01 09:57:53 np0005604215.localdomain dnsmasq-dhcp[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/host
Feb 01 09:57:53 np0005604215.localdomain dnsmasq-dhcp[310963]: read /var/lib/neutron/dhcp/0ac2ccf3-74d8-4f0a-903f-4cf43406d18d/opts
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.909 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AVX2,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AMD_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 09:57:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:57:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:53.925 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:57:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:57:54 np0005604215.localdomain podman[312099]: 2026-02-01 09:57:54.052428018 +0000 UTC m=+0.114625042 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, release=1769056855, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7)
Feb 01 09:57:54 np0005604215.localdomain podman[312099]: 2026-02-01 09:57:54.062726017 +0000 UTC m=+0.124922991 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, architecture=x86_64, managed_by=edpm_ansible, config_id=openstack_network_exporter, vcs-type=git, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, release=1769056855, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 01 09:57:54 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:57:54 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:54Z|00191|binding|INFO|Releasing lport 70e8c4ee-b7bf-45c9-80c5-43450e09967e from this chassis (sb_readonly=0)
Feb 01 09:57:54 np0005604215.localdomain kernel: device tap70e8c4ee-b7 left promiscuous mode
Feb 01 09:57:54 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:57:54Z|00192|binding|INFO|Setting lport 70e8c4ee-b7bf-45c9-80c5-43450e09967e down in Southbound
Feb 01 09:57:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:54.093 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:54 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:54.108 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9279ffc0dc2f48079045ce3d49e21210', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2934a88b-2cb8-43fc-bc4a-0266d2f826b9, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=70e8c4ee-b7bf-45c9-80c5-43450e09967e) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:57:54 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:54.110 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 70e8c4ee-b7bf-45c9-80c5-43450e09967e in datapath 0ac2ccf3-74d8-4f0a-903f-4cf43406d18d unbound from our chassis
Feb 01 09:57:54 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:54.115 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:57:54 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:57:54.116 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[81382063-417f-46e6-a096-2bc6be446403]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:57:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:54.116 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:54 np0005604215.localdomain podman[312135]: 2026-02-01 09:57:54.139998952 +0000 UTC m=+0.081763585 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:57:54 np0005604215.localdomain podman[312135]: 2026-02-01 09:57:54.14867041 +0000 UTC m=+0.090435043 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 01 09:57:54 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:57:54 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:54.228 2 INFO neutron.agent.securitygroups_rpc [None req-ba46717f-c9e1-458a-88f8-c050502ffc34 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v339: 177 pgs: 177 active+clean; 192 MiB data, 1018 MiB used, 41 GiB / 42 GiB avail; 99 KiB/s rd, 1.4 MiB/s wr, 156 op/s
Feb 01 09:57:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:54.391 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.465s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:57:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:54.397 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:57:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:54.418 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:57:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:54.445 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:57:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:54.446 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:57:54 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:54.590 2 INFO neutron.agent.securitygroups_rpc [None req-6fa3fe0c-2f0b-4fce-bb11-9a1bc41c0c58 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:57:54 np0005604215.localdomain ceph-mon[298604]: osdmap e170: 6 total, 6 up, 6 in
Feb 01 09:57:54 np0005604215.localdomain ceph-mon[298604]: pgmap v339: 177 pgs: 177 active+clean; 192 MiB data, 1018 MiB used, 41 GiB / 42 GiB avail; 99 KiB/s rd, 1.4 MiB/s wr, 156 op/s
Feb 01 09:57:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3300270245' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:57:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/355717740' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:57:55 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:55.011 2 INFO neutron.agent.securitygroups_rpc [None req-021b4e45-6986-46a7-9869-b4d11b35b6ad 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:57:55 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:55.030 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e170 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:57:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:55.446 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:57:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:55.447 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:57:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:55.448 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:57:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:55.448 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:57:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:55.448 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:57:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3381610859' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:57:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:56.098 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:56.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:57:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:56.202 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:56 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:57:56.235 2 INFO neutron.agent.securitygroups_rpc [None req-bf304f40-e466-4d37-a7c0-f4cca9d82926 c808dfb9cb284e60ac814aa25eae5d58 3e1ea1a33e554968ba8ebaf6753c9c5d - - default default] Security group member updated ['7af9328f-e889-4487-9888-9c5f8b1745d9']
Feb 01 09:57:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:57:56 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1939286376' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:57:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v340: 177 pgs: 177 active+clean; 192 MiB data, 1018 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 1.4 MiB/s wr, 153 op/s
Feb 01 09:57:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:57:56 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1939286376' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:57:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e171 e171: 6 total, 6 up, 6 in
Feb 01 09:57:56 np0005604215.localdomain dnsmasq[310963]: exiting on receipt of SIGTERM
Feb 01 09:57:56 np0005604215.localdomain systemd[1]: libpod-6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c.scope: Deactivated successfully.
Feb 01 09:57:56 np0005604215.localdomain podman[312184]: 2026-02-01 09:57:56.790492557 +0000 UTC m=+0.071566069 container kill 6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:57:56 np0005604215.localdomain podman[312196]: 2026-02-01 09:57:56.863951963 +0000 UTC m=+0.060211596 container died 6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:57:56 np0005604215.localdomain podman[312196]: 2026-02-01 09:57:56.894042736 +0000 UTC m=+0.090302329 container cleanup 6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:57:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1939286376' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:57:56 np0005604215.localdomain ceph-mon[298604]: pgmap v340: 177 pgs: 177 active+clean; 192 MiB data, 1018 MiB used, 41 GiB / 42 GiB avail; 97 KiB/s rd, 1.4 MiB/s wr, 153 op/s
Feb 01 09:57:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1939286376' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:57:56 np0005604215.localdomain ceph-mon[298604]: osdmap e171: 6 total, 6 up, 6 in
Feb 01 09:57:56 np0005604215.localdomain systemd[1]: libpod-conmon-6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c.scope: Deactivated successfully.
Feb 01 09:57:56 np0005604215.localdomain podman[312198]: 2026-02-01 09:57:56.932481827 +0000 UTC m=+0.120486644 container remove 6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-0ac2ccf3-74d8-4f0a-903f-4cf43406d18d, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:57:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:57.198 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:57 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:57.306 259225 INFO neutron.agent.dhcp.agent [None req-fca1c2ab-09be-4ffa-ad47-b60b40dfc27d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:57 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:57.308 259225 INFO neutron.agent.dhcp.agent [None req-fca1c2ab-09be-4ffa-ad47-b60b40dfc27d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-a141c608be1b875224d2f7067e777389f3126fc9994644b0ea89131e8d650861-merged.mount: Deactivated successfully.
Feb 01 09:57:57 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6eea0caba832f1892b9044d8e2489846e1759460e40b2c886b19aed32c48bb3c-userdata-shm.mount: Deactivated successfully.
Feb 01 09:57:57 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d0ac2ccf3\x2d74d8\x2d4f0a\x2d903f\x2d4cf43406d18d.mount: Deactivated successfully.
Feb 01 09:57:57 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:57:57.852 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:57:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:57.946 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:57.996 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:57:58.019 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:57:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v342: 177 pgs: 177 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 3.3 MiB/s rd, 1.2 MiB/s wr, 289 op/s
Feb 01 09:57:59 np0005604215.localdomain ceph-mon[298604]: pgmap v342: 177 pgs: 177 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 3.3 MiB/s rd, 1.2 MiB/s wr, 289 op/s
Feb 01 09:58:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:58:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:58:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:58:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157180 "" "Go-http-client/1.1"
Feb 01 09:58:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:58:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18777 "" "Go-http-client/1.1"
Feb 01 09:58:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:00.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:58:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:00.117 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:58:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e171 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v343: 177 pgs: 177 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.0 MiB/s wr, 245 op/s
Feb 01 09:58:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1532095624' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:58:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:01.204 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:01 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:01.248 2 INFO neutron.agent.securitygroups_rpc [None req-9c92fd3b-2244-4a02-b891-f277532d3dc4 c808dfb9cb284e60ac814aa25eae5d58 3e1ea1a33e554968ba8ebaf6753c9c5d - - default default] Security group member updated ['7af9328f-e889-4487-9888-9c5f8b1745d9']
Feb 01 09:58:01 np0005604215.localdomain ceph-mon[298604]: pgmap v343: 177 pgs: 177 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 1.0 MiB/s wr, 245 op/s
Feb 01 09:58:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/783780452' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:58:01 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:01.558 2 INFO neutron.agent.securitygroups_rpc [None req-eadca791-9490-47ec-9527-60ebe2a9b958 e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:58:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:58:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:58:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:58:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:58:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:58:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:58:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e172 e172: 6 total, 6 up, 6 in
Feb 01 09:58:02 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:02.000 2 INFO neutron.agent.securitygroups_rpc [None req-825e2b67-68b3-4de9-af8c-c04099a8e61e e0ee367368fd4fbebf2e13aa0ff98129 fe5c9037c1c44846b3c840cd81d7f177 - - default default] Security group member updated ['3438fec4-12ca-4b88-8e3d-decadab8f7bf']
Feb 01 09:58:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:02.239 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v345: 177 pgs: 177 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 1.1 KiB/s wr, 130 op/s
Feb 01 09:58:02 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:02.665 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f3:be:37 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-f90b2d3c-17ac-4074-8e52-3a58738705b1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f90b2d3c-17ac-4074-8e52-3a58738705b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=9b17a2b9-5e93-4788-90e6-3eea4883a111, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a21a9b5e-c616-4953-aa12-b45630ee9601) old=Port_Binding(mac=['fa:16:3e:f3:be:37 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-f90b2d3c-17ac-4074-8e52-3a58738705b1', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-f90b2d3c-17ac-4074-8e52-3a58738705b1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:02 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:02.667 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a21a9b5e-c616-4953-aa12-b45630ee9601 in datapath f90b2d3c-17ac-4074-8e52-3a58738705b1 updated
Feb 01 09:58:02 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:02.670 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network f90b2d3c-17ac-4074-8e52-3a58738705b1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:58:02 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:02.671 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[effcda5e-b00a-48e7-9cd6-7a5990be0057]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:02 np0005604215.localdomain ceph-mon[298604]: osdmap e172: 6 total, 6 up, 6 in
Feb 01 09:58:02 np0005604215.localdomain ceph-mon[298604]: pgmap v345: 177 pgs: 177 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 2.7 MiB/s rd, 1.1 KiB/s wr, 130 op/s
Feb 01 09:58:02 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:58:02 np0005604215.localdomain systemd[1]: tmp-crun.ytLvfd.mount: Deactivated successfully.
Feb 01 09:58:02 np0005604215.localdomain podman[312225]: 2026-02-01 09:58:02.858611948 +0000 UTC m=+0.073180528 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:58:02 np0005604215.localdomain podman[312225]: 2026-02-01 09:58:02.896680248 +0000 UTC m=+0.111248788 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Feb 01 09:58:02 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:58:03 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e173 e173: 6 total, 6 up, 6 in
Feb 01 09:58:03 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:03.774 2 INFO neutron.agent.securitygroups_rpc [None req-6f36bda7-f8b8-46c7-a4c3-95b983979dc7 388100543d2c4f8fb0150ffdd8da2504 674a59d5810c453484339f60db55c64e - - default default] Security group member updated ['41d73aa2-6075-4985-b34c-e67fa66518ee']
Feb 01 09:58:04 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:04.027 2 INFO neutron.agent.securitygroups_rpc [None req-752af62a-19da-4b3f-a3e8-9a1412f9f50e 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:58:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v347: 177 pgs: 177 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 3.1 MiB/s rd, 1.7 KiB/s wr, 146 op/s
Feb 01 09:58:04 np0005604215.localdomain ceph-mon[298604]: osdmap e173: 6 total, 6 up, 6 in
Feb 01 09:58:04 np0005604215.localdomain ceph-mon[298604]: pgmap v347: 177 pgs: 177 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 3.1 MiB/s rd, 1.7 KiB/s wr, 146 op/s
Feb 01 09:58:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e174 e174: 6 total, 6 up, 6 in
Feb 01 09:58:05 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:05.004 2 INFO neutron.agent.securitygroups_rpc [None req-dcf8ea11-57cc-44a8-b32b-d084e8cc9746 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:58:05 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:05.045 2 INFO neutron.agent.securitygroups_rpc [None req-6f8cb305-6336-427d-a0d9-37ed7bed8449 388100543d2c4f8fb0150ffdd8da2504 674a59d5810c453484339f60db55c64e - - default default] Security group member updated ['41d73aa2-6075-4985-b34c-e67fa66518ee']
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "57e9d5dc-73a6-45c8-a219-7bfb6963c354", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:57e9d5dc-73a6-45c8-a219-7bfb6963c354, vol_name:cephfs) < ""
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:05 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:05.059+0000 7f93ec23e640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:05 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:05.059+0000 7f93ec23e640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:05 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:05.059+0000 7f93ec23e640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:05 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:05.059+0000 7f93ec23e640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:05 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:05.059+0000 7f93ec23e640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e174 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/57e9d5dc-73a6-45c8-a219-7bfb6963c354/.meta.tmp'
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/57e9d5dc-73a6-45c8-a219-7bfb6963c354/.meta.tmp' to config b'/volumes/_nogroup/57e9d5dc-73a6-45c8-a219-7bfb6963c354/.meta'
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:57e9d5dc-73a6-45c8-a219-7bfb6963c354, vol_name:cephfs) < ""
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "57e9d5dc-73a6-45c8-a219-7bfb6963c354", "format": "json"}]: dispatch
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:57e9d5dc-73a6-45c8-a219-7bfb6963c354, vol_name:cephfs) < ""
Feb 01 09:58:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:57e9d5dc-73a6-45c8-a219-7bfb6963c354, vol_name:cephfs) < ""
Feb 01 09:58:05 np0005604215.localdomain ceph-mon[298604]: osdmap e174: 6 total, 6 up, 6 in
Feb 01 09:58:05 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "57e9d5dc-73a6-45c8-a219-7bfb6963c354", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:58:05 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "57e9d5dc-73a6-45c8-a219-7bfb6963c354", "format": "json"}]: dispatch
Feb 01 09:58:05 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:58:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e175 e175: 6 total, 6 up, 6 in
Feb 01 09:58:05 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:05.879 2 INFO neutron.agent.securitygroups_rpc [None req-cf4e6267-50f2-41a1-bdc4-48a2e39e61cf 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:58:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:06.206 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v350: 177 pgs: 177 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 341 KiB/s rd, 899 B/s wr, 13 op/s
Feb 01 09:58:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:58:06 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:06.804 2 INFO neutron.agent.securitygroups_rpc [None req-b303317c-4287-410b-804b-7e395b86e859 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:58:06 np0005604215.localdomain podman[312257]: 2026-02-01 09:58:06.823158279 +0000 UTC m=+0.083763376 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:58:06 np0005604215.localdomain podman[312257]: 2026-02-01 09:58:06.834725398 +0000 UTC m=+0.095330505 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:58:06 np0005604215.localdomain ceph-mon[298604]: mgrmap e59: np0005604215.uhhqtv(active, since 8m), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:58:06 np0005604215.localdomain ceph-mon[298604]: osdmap e175: 6 total, 6 up, 6 in
Feb 01 09:58:06 np0005604215.localdomain ceph-mon[298604]: pgmap v350: 177 pgs: 177 active+clean; 192 MiB data, 926 MiB used, 41 GiB / 42 GiB avail; 341 KiB/s rd, 899 B/s wr, 13 op/s
Feb 01 09:58:06 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:58:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:07.274 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:07 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e176 e176: 6 total, 6 up, 6 in
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v352: 177 pgs: 177 active+clean; 217 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 953 KiB/s rd, 5.4 MiB/s wr, 228 op/s
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "57e9d5dc-73a6-45c8-a219-7bfb6963c354", "format": "json"}]: dispatch
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:57e9d5dc-73a6-45c8-a219-7bfb6963c354, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:57e9d5dc-73a6-45c8-a219-7bfb6963c354, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:58:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:08.652+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '57e9d5dc-73a6-45c8-a219-7bfb6963c354' of type subvolume
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '57e9d5dc-73a6-45c8-a219-7bfb6963c354' of type subvolume
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "57e9d5dc-73a6-45c8-a219-7bfb6963c354", "force": true, "format": "json"}]: dispatch
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:57e9d5dc-73a6-45c8-a219-7bfb6963c354, vol_name:cephfs) < ""
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/57e9d5dc-73a6-45c8-a219-7bfb6963c354'' moved to trashcan
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:57e9d5dc-73a6-45c8-a219-7bfb6963c354, vol_name:cephfs) < ""
Feb 01 09:58:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:08.684+0000 7f93eea43640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:08.684+0000 7f93eea43640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:08.684+0000 7f93eea43640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:08.684+0000 7f93eea43640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:08.684+0000 7f93eea43640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:08.713+0000 7f93eda41640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:08.713+0000 7f93eda41640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:08.713+0000 7f93eda41640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:08.713+0000 7f93eda41640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:08.713+0000 7f93eda41640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 09:58:08 np0005604215.localdomain ceph-mon[298604]: osdmap e176: 6 total, 6 up, 6 in
Feb 01 09:58:08 np0005604215.localdomain ceph-mon[298604]: pgmap v352: 177 pgs: 177 active+clean; 217 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 953 KiB/s rd, 5.4 MiB/s wr, 228 op/s
Feb 01 09:58:08 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e177 e177: 6 total, 6 up, 6 in
Feb 01 09:58:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "57e9d5dc-73a6-45c8-a219-7bfb6963c354", "format": "json"}]: dispatch
Feb 01 09:58:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "57e9d5dc-73a6-45c8-a219-7bfb6963c354", "force": true, "format": "json"}]: dispatch
Feb 01 09:58:09 np0005604215.localdomain ceph-mon[298604]: osdmap e177: 6 total, 6 up, 6 in
Feb 01 09:58:09 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e178 e178: 6 total, 6 up, 6 in
Feb 01 09:58:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e178 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 348127232 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v355: 177 pgs: 177 active+clean; 217 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 972 KiB/s rd, 5.5 MiB/s wr, 233 op/s
Feb 01 09:58:10 np0005604215.localdomain ceph-mon[298604]: mgrmap e60: np0005604215.uhhqtv(active, since 8m), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 09:58:10 np0005604215.localdomain ceph-mon[298604]: osdmap e178: 6 total, 6 up, 6 in
Feb 01 09:58:10 np0005604215.localdomain ceph-mon[298604]: pgmap v355: 177 pgs: 177 active+clean; 217 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 972 KiB/s rd, 5.5 MiB/s wr, 233 op/s
Feb 01 09:58:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:11.208 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:11 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:11.279 2 INFO neutron.agent.securitygroups_rpc [None req-2e636825-6352-4de3-92a6-2082180ce0f9 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:11 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:11.442 2 INFO neutron.agent.securitygroups_rpc [None req-0b3ddafe-7058-48e9-ade9-6122faaa4a98 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:58:11 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:11.790 259225 INFO neutron.agent.linux.ip_lib [None req-93a17f9e-594d-4b12-bd0f-6a081cf91730 - - - - - -] Device tapb401a566-f9 cannot be used as it has no MAC address
Feb 01 09:58:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:11.812 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:11 np0005604215.localdomain kernel: device tapb401a566-f9 entered promiscuous mode
Feb 01 09:58:11 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939891.8226] manager: (tapb401a566-f9): new Generic device (/org/freedesktop/NetworkManager/Devices/37)
Feb 01 09:58:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:11.822 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:11 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:11Z|00193|binding|INFO|Claiming lport b401a566-f92c-44aa-86ca-bb673a1a49df for this chassis.
Feb 01 09:58:11 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:11Z|00194|binding|INFO|b401a566-f92c-44aa-86ca-bb673a1a49df: Claiming unknown
Feb 01 09:58:11 np0005604215.localdomain systemd-udevd[312314]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:58:11 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:11.835 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-fd0f2c71-aa97-4b39-b751-c91f8ed96a20', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd0f2c71-aa97-4b39-b751-c91f8ed96a20', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1a79666-54fd-413d-b574-80dec3e84f3c, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=b401a566-f92c-44aa-86ca-bb673a1a49df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:11 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:11.837 158655 INFO neutron.agent.ovn.metadata.agent [-] Port b401a566-f92c-44aa-86ca-bb673a1a49df in datapath fd0f2c71-aa97-4b39-b751-c91f8ed96a20 bound to our chassis
Feb 01 09:58:11 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:11.839 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port 91f0b00b-52ea-4ae1-b321-59487fbf888e IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:58:11 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:11.840 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd0f2c71-aa97-4b39-b751-c91f8ed96a20, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:58:11 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:11.841 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[91bbd165-8b3b-4925-b686-2920868a9136]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:11 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapb401a566-f9: No such device
Feb 01 09:58:11 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapb401a566-f9: No such device
Feb 01 09:58:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:11.863 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:11 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapb401a566-f9: No such device
Feb 01 09:58:11 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:11Z|00195|binding|INFO|Setting lport b401a566-f92c-44aa-86ca-bb673a1a49df ovn-installed in OVS
Feb 01 09:58:11 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:11Z|00196|binding|INFO|Setting lport b401a566-f92c-44aa-86ca-bb673a1a49df up in Southbound
Feb 01 09:58:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:11.868 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:11 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapb401a566-f9: No such device
Feb 01 09:58:11 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapb401a566-f9: No such device
Feb 01 09:58:11 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapb401a566-f9: No such device
Feb 01 09:58:11 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapb401a566-f9: No such device
Feb 01 09:58:11 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapb401a566-f9: No such device
Feb 01 09:58:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:11.903 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:11.929 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:11 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/916283044' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:58:11 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/916283044' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:58:12 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:12.027 2 INFO neutron.agent.securitygroups_rpc [None req-4918b331-88ae-4de3-8570-b8490451d4d3 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:58:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v356: 177 pgs: 177 active+clean; 217 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 716 KiB/s rd, 4.1 MiB/s wr, 171 op/s
Feb 01 09:58:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:12.298 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:12 np0005604215.localdomain podman[312385]: 
Feb 01 09:58:12 np0005604215.localdomain podman[312385]: 2026-02-01 09:58:12.793332004 +0000 UTC m=+0.090236506 container create 4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd0f2c71-aa97-4b39-b751-c91f8ed96a20, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:58:12 np0005604215.localdomain systemd[1]: Started libpod-conmon-4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf.scope.
Feb 01 09:58:12 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:12.845 2 INFO neutron.agent.securitygroups_rpc [None req-c4f766e3-6d2a-4be1-a25c-795440958939 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:58:12 np0005604215.localdomain podman[312385]: 2026-02-01 09:58:12.751726856 +0000 UTC m=+0.048631428 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:12 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:12 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/21c7157689accf0627dec7ac41e1a6a7bb79ef190b9ceae4d14af1a2c64b5d83/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:12 np0005604215.localdomain podman[312385]: 2026-02-01 09:58:12.871425524 +0000 UTC m=+0.168330036 container init 4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd0f2c71-aa97-4b39-b751-c91f8ed96a20, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:58:12 np0005604215.localdomain podman[312385]: 2026-02-01 09:58:12.880975451 +0000 UTC m=+0.177879953 container start 4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd0f2c71-aa97-4b39-b751-c91f8ed96a20, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:58:12 np0005604215.localdomain dnsmasq[312404]: started, version 2.85 cachesize 150
Feb 01 09:58:12 np0005604215.localdomain dnsmasq[312404]: DNS service limited to local subnets
Feb 01 09:58:12 np0005604215.localdomain dnsmasq[312404]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:12 np0005604215.localdomain dnsmasq[312404]: warning: no upstream servers configured
Feb 01 09:58:12 np0005604215.localdomain dnsmasq-dhcp[312404]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:58:12 np0005604215.localdomain dnsmasq[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/addn_hosts - 0 addresses
Feb 01 09:58:12 np0005604215.localdomain dnsmasq-dhcp[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/host
Feb 01 09:58:12 np0005604215.localdomain dnsmasq-dhcp[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/opts
Feb 01 09:58:12 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:12.939 259225 INFO neutron.agent.dhcp.agent [None req-8da746e0-d7aa-4991-abe2-3ef940dd82e9 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:11Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032b642e0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323bc2b0>], id=24e9fcfa-968e-4b3e-9010-3ad066cf1940, ip_allocation=immediate, mac_address=fa:16:3e:a5:88:44, name=tempest-PortsTestJSON-1760108498, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:09Z, description=, dns_domain=, id=fd0f2c71-aa97-4b39-b751-c91f8ed96a20, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-988107197, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50137, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2381, status=ACTIVE, subnets=['752bc011-3910-4afc-b3cd-dd2d12938ecb'], tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:10Z, vlan_transparent=None, network_id=fd0f2c71-aa97-4b39-b751-c91f8ed96a20, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['277d73b7-d267-437d-b5df-bd560d180a7a'], standard_attr_id=2395, status=DOWN, tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:11Z on network fd0f2c71-aa97-4b39-b751-c91f8ed96a20
Feb 01 09:58:12 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e179 e179: 6 total, 6 up, 6 in
Feb 01 09:58:12 np0005604215.localdomain ceph-mon[298604]: pgmap v356: 177 pgs: 177 active+clean; 217 MiB data, 1000 MiB used, 41 GiB / 42 GiB avail; 716 KiB/s rd, 4.1 MiB/s wr, 171 op/s
Feb 01 09:58:13 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:13.119 259225 INFO neutron.agent.dhcp.agent [None req-e4492cc7-bcc7-40a4-a219-ea28a02989d8 - - - - - -] DHCP configuration for ports {'ca31c6b2-9a19-43e6-9856-38c7b199a032'} is completed
Feb 01 09:58:13 np0005604215.localdomain dnsmasq[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/addn_hosts - 1 addresses
Feb 01 09:58:13 np0005604215.localdomain dnsmasq-dhcp[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/host
Feb 01 09:58:13 np0005604215.localdomain dnsmasq-dhcp[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/opts
Feb 01 09:58:13 np0005604215.localdomain podman[312422]: 2026-02-01 09:58:13.252227733 +0000 UTC m=+0.059502474 container kill 4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd0f2c71-aa97-4b39-b751-c91f8ed96a20, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:58:13 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:13.351 2 INFO neutron.agent.securitygroups_rpc [None req-07f0d5e0-07b1-4081-9883-e9d75a91fc18 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:13 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:13.392 259225 INFO neutron.agent.dhcp.agent [None req-ad122b0b-ae9e-4cdc-82dc-6ad8b77d9941 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:11Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322e11c0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322ac370>], id=95c5ff06-26c9-4d2f-b0bb-4c214ed71f24, ip_allocation=immediate, mac_address=fa:16:3e:26:75:fb, name=tempest-PortsTestJSON-1814381455, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:09Z, description=, dns_domain=, id=fd0f2c71-aa97-4b39-b751-c91f8ed96a20, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-988107197, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=50137, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2381, status=ACTIVE, subnets=['752bc011-3910-4afc-b3cd-dd2d12938ecb'], tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:10Z, vlan_transparent=None, network_id=fd0f2c71-aa97-4b39-b751-c91f8ed96a20, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['277d73b7-d267-437d-b5df-bd560d180a7a'], standard_attr_id=2408, status=DOWN, tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:11Z on network fd0f2c71-aa97-4b39-b751-c91f8ed96a20
Feb 01 09:58:13 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:13.468 259225 INFO neutron.agent.dhcp.agent [None req-ba2f7910-494f-4cc0-a3d7-94ff2e458719 - - - - - -] DHCP configuration for ports {'24e9fcfa-968e-4b3e-9010-3ad066cf1940'} is completed
Feb 01 09:58:13 np0005604215.localdomain dnsmasq[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/addn_hosts - 2 addresses
Feb 01 09:58:13 np0005604215.localdomain dnsmasq-dhcp[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/host
Feb 01 09:58:13 np0005604215.localdomain dnsmasq-dhcp[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/opts
Feb 01 09:58:13 np0005604215.localdomain podman[312462]: 2026-02-01 09:58:13.59838919 +0000 UTC m=+0.057043840 container kill 4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd0f2c71-aa97-4b39-b751-c91f8ed96a20, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 01 09:58:13 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:13.615 2 INFO neutron.agent.securitygroups_rpc [None req-07f0d5e0-07b1-4081-9883-e9d75a91fc18 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:13 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:13.653 259225 INFO neutron.agent.linux.ip_lib [None req-0473742d-5051-4b1d-8f7f-48321e2503c9 - - - - - -] Device tap3a91aa3a-fb cannot be used as it has no MAC address
Feb 01 09:58:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:13.727 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:13 np0005604215.localdomain kernel: device tap3a91aa3a-fb entered promiscuous mode
Feb 01 09:58:13 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:13Z|00197|binding|INFO|Claiming lport 3a91aa3a-fb9f-4945-91e3-85f0d278b0b5 for this chassis.
Feb 01 09:58:13 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939893.7337] manager: (tap3a91aa3a-fb): new Generic device (/org/freedesktop/NetworkManager/Devices/38)
Feb 01 09:58:13 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:13Z|00198|binding|INFO|3a91aa3a-fb9f-4945-91e3-85f0d278b0b5: Claiming unknown
Feb 01 09:58:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:13.735 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:13 np0005604215.localdomain systemd-udevd[312317]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:58:13 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:13.744 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-4da937bf-f66e-48ce-bf66-f7d3d9f7bc52', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4da937bf-f66e-48ce-bf66-f7d3d9f7bc52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28f2370b-4aa7-434f-90cb-05cc01bed2bb, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=3a91aa3a-fb9f-4945-91e3-85f0d278b0b5) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:13 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:13.745 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 3a91aa3a-fb9f-4945-91e3-85f0d278b0b5 in datapath 4da937bf-f66e-48ce-bf66-f7d3d9f7bc52 bound to our chassis
Feb 01 09:58:13 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:13.747 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4da937bf-f66e-48ce-bf66-f7d3d9f7bc52 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:58:13 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:13.747 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ff3505d4-dfe6-41d6-bded-f2987918e060]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap3a91aa3a-fb: No such device
Feb 01 09:58:13 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:13Z|00199|binding|INFO|Setting lport 3a91aa3a-fb9f-4945-91e3-85f0d278b0b5 ovn-installed in OVS
Feb 01 09:58:13 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:13Z|00200|binding|INFO|Setting lport 3a91aa3a-fb9f-4945-91e3-85f0d278b0b5 up in Southbound
Feb 01 09:58:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap3a91aa3a-fb: No such device
Feb 01 09:58:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:13.768 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap3a91aa3a-fb: No such device
Feb 01 09:58:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap3a91aa3a-fb: No such device
Feb 01 09:58:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap3a91aa3a-fb: No such device
Feb 01 09:58:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap3a91aa3a-fb: No such device
Feb 01 09:58:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap3a91aa3a-fb: No such device
Feb 01 09:58:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap3a91aa3a-fb: No such device
Feb 01 09:58:13 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:13.862 2 INFO neutron.agent.securitygroups_rpc [None req-b64013eb-118b-4ee2-9fc4-6cc3b98c578e 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:58:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:13.905 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:13.929 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:13 np0005604215.localdomain ceph-mon[298604]: osdmap e179: 6 total, 6 up, 6 in
Feb 01 09:58:14 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:14.005 259225 INFO neutron.agent.dhcp.agent [None req-3d38efc9-8679-40f9-8a62-ff441f1be388 - - - - - -] DHCP configuration for ports {'95c5ff06-26c9-4d2f-b0bb-4c214ed71f24'} is completed
Feb 01 09:58:14 np0005604215.localdomain dnsmasq[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/addn_hosts - 1 addresses
Feb 01 09:58:14 np0005604215.localdomain dnsmasq-dhcp[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/host
Feb 01 09:58:14 np0005604215.localdomain podman[312542]: 2026-02-01 09:58:14.191724714 +0000 UTC m=+0.069812604 container kill 4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd0f2c71-aa97-4b39-b751-c91f8ed96a20, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:58:14 np0005604215.localdomain dnsmasq-dhcp[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/opts
Feb 01 09:58:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v358: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 132 KiB/s rd, 191 KiB/s wr, 99 op/s
Feb 01 09:58:14 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:14.503 2 INFO neutron.agent.securitygroups_rpc [None req-ff9a1527-b1c3-4b84-beeb-30758949e010 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:14 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:14.526 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:14 np0005604215.localdomain systemd[1]: tmp-crun.E0T6mI.mount: Deactivated successfully.
Feb 01 09:58:14 np0005604215.localdomain dnsmasq[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/addn_hosts - 0 addresses
Feb 01 09:58:14 np0005604215.localdomain dnsmasq-dhcp[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/host
Feb 01 09:58:14 np0005604215.localdomain dnsmasq-dhcp[312404]: read /var/lib/neutron/dhcp/fd0f2c71-aa97-4b39-b751-c91f8ed96a20/opts
Feb 01 09:58:14 np0005604215.localdomain podman[312593]: 2026-02-01 09:58:14.572846033 +0000 UTC m=+0.077138331 container kill 4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd0f2c71-aa97-4b39-b751-c91f8ed96a20, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 01 09:58:14 np0005604215.localdomain podman[312631]: 
Feb 01 09:58:14 np0005604215.localdomain podman[312631]: 2026-02-01 09:58:14.78316763 +0000 UTC m=+0.084400326 container create 6556794d4e5bf704e06bf800eff0a39e5cd58b47cd3e5cb402c8509b64928f0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4da937bf-f66e-48ce-bf66-f7d3d9f7bc52, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Feb 01 09:58:14 np0005604215.localdomain systemd[1]: Started libpod-conmon-6556794d4e5bf704e06bf800eff0a39e5cd58b47cd3e5cb402c8509b64928f0b.scope.
Feb 01 09:58:14 np0005604215.localdomain systemd[1]: tmp-crun.n4HhGo.mount: Deactivated successfully.
Feb 01 09:58:14 np0005604215.localdomain podman[312631]: 2026-02-01 09:58:14.743987846 +0000 UTC m=+0.045220512 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:14 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:14 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/4383ecd4dd19cf8e2c67a94d44ef1d65db9bfee3a02ca20cd31b89d8b95a13e4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:14 np0005604215.localdomain podman[312631]: 2026-02-01 09:58:14.872850779 +0000 UTC m=+0.174083465 container init 6556794d4e5bf704e06bf800eff0a39e5cd58b47cd3e5cb402c8509b64928f0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4da937bf-f66e-48ce-bf66-f7d3d9f7bc52, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 01 09:58:14 np0005604215.localdomain podman[312631]: 2026-02-01 09:58:14.881981572 +0000 UTC m=+0.183214268 container start 6556794d4e5bf704e06bf800eff0a39e5cd58b47cd3e5cb402c8509b64928f0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4da937bf-f66e-48ce-bf66-f7d3d9f7bc52, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:58:14 np0005604215.localdomain dnsmasq[312654]: started, version 2.85 cachesize 150
Feb 01 09:58:14 np0005604215.localdomain dnsmasq[312654]: DNS service limited to local subnets
Feb 01 09:58:14 np0005604215.localdomain dnsmasq[312654]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:14 np0005604215.localdomain dnsmasq[312654]: warning: no upstream servers configured
Feb 01 09:58:14 np0005604215.localdomain dnsmasq-dhcp[312654]: DHCP, static leases only on 10.100.255.240, lease time 1d
Feb 01 09:58:14 np0005604215.localdomain dnsmasq[312654]: read /var/lib/neutron/dhcp/4da937bf-f66e-48ce-bf66-f7d3d9f7bc52/addn_hosts - 0 addresses
Feb 01 09:58:14 np0005604215.localdomain dnsmasq-dhcp[312654]: read /var/lib/neutron/dhcp/4da937bf-f66e-48ce-bf66-f7d3d9f7bc52/host
Feb 01 09:58:14 np0005604215.localdomain dnsmasq-dhcp[312654]: read /var/lib/neutron/dhcp/4da937bf-f66e-48ce-bf66-f7d3d9f7bc52/opts
Feb 01 09:58:14 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:14.948 158655 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 91f0b00b-52ea-4ae1-b321-59487fbf888e with type ""
Feb 01 09:58:14 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:14Z|00201|binding|INFO|Removing iface tapb401a566-f9 ovn-installed in OVS
Feb 01 09:58:14 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:14Z|00202|binding|INFO|Removing lport b401a566-f92c-44aa-86ca-bb673a1a49df ovn-installed in OVS
Feb 01 09:58:14 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:14.950 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-fd0f2c71-aa97-4b39-b751-c91f8ed96a20', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fd0f2c71-aa97-4b39-b751-c91f8ed96a20', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a1a79666-54fd-413d-b574-80dec3e84f3c, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=b401a566-f92c-44aa-86ca-bb673a1a49df) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:14 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:14.951 158655 INFO neutron.agent.ovn.metadata.agent [-] Port b401a566-f92c-44aa-86ca-bb673a1a49df in datapath fd0f2c71-aa97-4b39-b751-c91f8ed96a20 unbound from our chassis
Feb 01 09:58:14 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:14.953 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fd0f2c71-aa97-4b39-b751-c91f8ed96a20, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:58:14 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:14.954 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[7f3b873c-e56b-4324-a22f-d75f9d5b31c4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:14.988 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:14.995 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:15 np0005604215.localdomain ceph-mon[298604]: pgmap v358: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 132 KiB/s rd, 191 KiB/s wr, 99 op/s
Feb 01 09:58:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e180 e180: 6 total, 6 up, 6 in
Feb 01 09:58:15 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:15.052 259225 INFO neutron.agent.dhcp.agent [None req-66213892-68cc-46aa-a125-71ce87b58acf - - - - - -] DHCP configuration for ports {'5935b501-bf91-4a16-bdb0-4b0523f2e8eb'} is completed
Feb 01 09:58:15 np0005604215.localdomain podman[312668]: 2026-02-01 09:58:15.059001726 +0000 UTC m=+0.052808807 container kill 4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd0f2c71-aa97-4b39-b751-c91f8ed96a20, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:58:15 np0005604215.localdomain dnsmasq[312404]: exiting on receipt of SIGTERM
Feb 01 09:58:15 np0005604215.localdomain systemd[1]: libpod-4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf.scope: Deactivated successfully.
Feb 01 09:58:15 np0005604215.localdomain podman[312682]: 2026-02-01 09:58:15.118231491 +0000 UTC m=+0.047777311 container died 4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd0f2c71-aa97-4b39-b751-c91f8ed96a20, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:58:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e180 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:15 np0005604215.localdomain podman[312682]: 2026-02-01 09:58:15.149645166 +0000 UTC m=+0.079190916 container cleanup 4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd0f2c71-aa97-4b39-b751-c91f8ed96a20, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 01 09:58:15 np0005604215.localdomain systemd[1]: libpod-conmon-4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf.scope: Deactivated successfully.
Feb 01 09:58:15 np0005604215.localdomain podman[312686]: 2026-02-01 09:58:15.211560944 +0000 UTC m=+0.130623359 container remove 4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fd0f2c71-aa97-4b39-b751-c91f8ed96a20, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:58:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:15.222 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:15.230 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:15 np0005604215.localdomain kernel: device tapb401a566-f9 left promiscuous mode
Feb 01 09:58:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:15.242 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:15 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:15.258 259225 INFO neutron.agent.dhcp.agent [None req-dc7fbdb6-94ac-4033-8e77-88407fd3bf51 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:15 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:15.259 259225 INFO neutron.agent.dhcp.agent [None req-dc7fbdb6-94ac-4033-8e77-88407fd3bf51 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:15 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:15.310 2 INFO neutron.agent.securitygroups_rpc [None req-dea63ea4-98ed-4833-b1d5-beae69081804 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:15 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-21c7157689accf0627dec7ac41e1a6a7bb79ef190b9ceae4d14af1a2c64b5d83-merged.mount: Deactivated successfully.
Feb 01 09:58:15 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4342662ef3e8d0ee2dc1ebaffa6b34371e14b3d219158381d69898f07ea49adf-userdata-shm.mount: Deactivated successfully.
Feb 01 09:58:15 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2dfd0f2c71\x2daa97\x2d4b39\x2db751\x2dc91f8ed96a20.mount: Deactivated successfully.
Feb 01 09:58:16 np0005604215.localdomain ceph-mon[298604]: osdmap e180: 6 total, 6 up, 6 in
Feb 01 09:58:16 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:16.210 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:16 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:16.236 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v360: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 125 KiB/s rd, 181 KiB/s wr, 94 op/s
Feb 01 09:58:16 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e181 e181: 6 total, 6 up, 6 in
Feb 01 09:58:17 np0005604215.localdomain ceph-mon[298604]: pgmap v360: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 125 KiB/s rd, 181 KiB/s wr, 94 op/s
Feb 01 09:58:17 np0005604215.localdomain ceph-mon[298604]: osdmap e181: 6 total, 6 up, 6 in
Feb 01 09:58:17 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:17.327 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:17 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:17.899 2 INFO neutron.agent.securitygroups_rpc [None req-38946330-c139-4cbe-adf2-c14c00d51ca6 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:58:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v362: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 169 KiB/s rd, 220 KiB/s wr, 152 op/s
Feb 01 09:58:18 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e182 e182: 6 total, 6 up, 6 in
Feb 01 09:58:19 np0005604215.localdomain ceph-mon[298604]: pgmap v362: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 169 KiB/s rd, 220 KiB/s wr, 152 op/s
Feb 01 09:58:19 np0005604215.localdomain ceph-mon[298604]: osdmap e182: 6 total, 6 up, 6 in
Feb 01 09:58:19 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:19.414 2 INFO neutron.agent.securitygroups_rpc [None req-dc033b5b-d88f-40f8-9b27-1242de33844a d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:19 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:19.733 259225 INFO neutron.agent.linux.ip_lib [None req-ddf51fcc-26fa-4652-804e-7b0579ae6d59 - - - - - -] Device tap9a001166-81 cannot be used as it has no MAC address
Feb 01 09:58:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:19.752 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:19 np0005604215.localdomain kernel: device tap9a001166-81 entered promiscuous mode
Feb 01 09:58:19 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939899.7603] manager: (tap9a001166-81): new Generic device (/org/freedesktop/NetworkManager/Devices/39)
Feb 01 09:58:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:19.762 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:19 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:19Z|00203|binding|INFO|Claiming lport 9a001166-8198-4237-92de-4f0266ce26a0 for this chassis.
Feb 01 09:58:19 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:19Z|00204|binding|INFO|9a001166-8198-4237-92de-4f0266ce26a0: Claiming unknown
Feb 01 09:58:19 np0005604215.localdomain systemd-udevd[312722]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:58:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:19.777 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-8f28b580-fb6e-4167-81da-20b98b3e9051', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f28b580-fb6e-4167-81da-20b98b3e9051', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fea4c3ac6fd14aee8b0de1bad5f8673a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=578abda3-3a27-44f6-a802-bbe6cde94e49, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=9a001166-8198-4237-92de-4f0266ce26a0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:19.778 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 9a001166-8198-4237-92de-4f0266ce26a0 in datapath 8f28b580-fb6e-4167-81da-20b98b3e9051 bound to our chassis
Feb 01 09:58:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:19.779 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8f28b580-fb6e-4167-81da-20b98b3e9051 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:58:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:19.781 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[74e114ff-5e50-4375-ab90-84ab6aa872fd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:19 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9a001166-81: No such device
Feb 01 09:58:19 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9a001166-81: No such device
Feb 01 09:58:19 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9a001166-81: No such device
Feb 01 09:58:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:19.796 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:19 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:19Z|00205|binding|INFO|Setting lport 9a001166-8198-4237-92de-4f0266ce26a0 ovn-installed in OVS
Feb 01 09:58:19 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:19Z|00206|binding|INFO|Setting lport 9a001166-8198-4237-92de-4f0266ce26a0 up in Southbound
Feb 01 09:58:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:19.800 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:19 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9a001166-81: No such device
Feb 01 09:58:19 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9a001166-81: No such device
Feb 01 09:58:19 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9a001166-81: No such device
Feb 01 09:58:19 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9a001166-81: No such device
Feb 01 09:58:19 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9a001166-81: No such device
Feb 01 09:58:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:19.831 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:19.855 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:20 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:20.027 2 INFO neutron.agent.securitygroups_rpc [None req-26640e7d-8ddf-4cc3-b0ea-945d98bbd76e d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e182 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v364: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 28 KiB/s wr, 52 op/s
Feb 01 09:58:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e183 e183: 6 total, 6 up, 6 in
Feb 01 09:58:20 np0005604215.localdomain podman[312793]: 
Feb 01 09:58:20 np0005604215.localdomain podman[312793]: 2026-02-01 09:58:20.585889607 +0000 UTC m=+0.064006354 container create fc39355a2956ee74a90ca1008c91234f5246b5b112f8b8dca778b8fd8f1bfee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f28b580-fb6e-4167-81da-20b98b3e9051, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:58:20 np0005604215.localdomain systemd[1]: Started libpod-conmon-fc39355a2956ee74a90ca1008c91234f5246b5b112f8b8dca778b8fd8f1bfee1.scope.
Feb 01 09:58:20 np0005604215.localdomain podman[312793]: 2026-02-01 09:58:20.553991499 +0000 UTC m=+0.032108226 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:20 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:20 np0005604215.localdomain systemd[1]: tmp-crun.TIL960.mount: Deactivated successfully.
Feb 01 09:58:20 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fa36a994553b881feddf38c260e7f34cbb97e21c52634976f025a88a1d4d79cc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:20 np0005604215.localdomain podman[312793]: 2026-02-01 09:58:20.676130073 +0000 UTC m=+0.154246810 container init fc39355a2956ee74a90ca1008c91234f5246b5b112f8b8dca778b8fd8f1bfee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f28b580-fb6e-4167-81da-20b98b3e9051, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 01 09:58:20 np0005604215.localdomain podman[312793]: 2026-02-01 09:58:20.689551629 +0000 UTC m=+0.167668376 container start fc39355a2956ee74a90ca1008c91234f5246b5b112f8b8dca778b8fd8f1bfee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f28b580-fb6e-4167-81da-20b98b3e9051, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 01 09:58:20 np0005604215.localdomain dnsmasq[312811]: started, version 2.85 cachesize 150
Feb 01 09:58:20 np0005604215.localdomain dnsmasq[312811]: DNS service limited to local subnets
Feb 01 09:58:20 np0005604215.localdomain dnsmasq[312811]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:20 np0005604215.localdomain dnsmasq[312811]: warning: no upstream servers configured
Feb 01 09:58:20 np0005604215.localdomain dnsmasq-dhcp[312811]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 01 09:58:20 np0005604215.localdomain dnsmasq[312811]: read /var/lib/neutron/dhcp/8f28b580-fb6e-4167-81da-20b98b3e9051/addn_hosts - 0 addresses
Feb 01 09:58:20 np0005604215.localdomain dnsmasq-dhcp[312811]: read /var/lib/neutron/dhcp/8f28b580-fb6e-4167-81da-20b98b3e9051/host
Feb 01 09:58:20 np0005604215.localdomain dnsmasq-dhcp[312811]: read /var/lib/neutron/dhcp/8f28b580-fb6e-4167-81da-20b98b3e9051/opts
Feb 01 09:58:20 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:20.892 259225 INFO neutron.agent.dhcp.agent [None req-0355b246-ad0c-469f-97c7-28507c1a8e5e - - - - - -] DHCP configuration for ports {'41c64030-4370-4e37-b8a1-94597c9c8e2f'} is completed
Feb 01 09:58:21 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:21Z|00207|binding|INFO|Removing iface tap9a001166-81 ovn-installed in OVS
Feb 01 09:58:21 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:21Z|00208|binding|INFO|Removing lport 9a001166-8198-4237-92de-4f0266ce26a0 ovn-installed in OVS
Feb 01 09:58:21 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:21.013 158655 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port dc7b660d-1162-4d4d-ae2d-e314c0d2e224 with type ""
Feb 01 09:58:21 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:21.015 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-8f28b580-fb6e-4167-81da-20b98b3e9051', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8f28b580-fb6e-4167-81da-20b98b3e9051', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fea4c3ac6fd14aee8b0de1bad5f8673a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=578abda3-3a27-44f6-a802-bbe6cde94e49, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=9a001166-8198-4237-92de-4f0266ce26a0) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:21 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:21.017 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 9a001166-8198-4237-92de-4f0266ce26a0 in datapath 8f28b580-fb6e-4167-81da-20b98b3e9051 unbound from our chassis
Feb 01 09:58:21 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:21.019 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 8f28b580-fb6e-4167-81da-20b98b3e9051 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:58:21 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:21.020 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[5dda09d2-e287-4111-b6e0-91fcae2ca37f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:21.045 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:21 np0005604215.localdomain dnsmasq[312811]: exiting on receipt of SIGTERM
Feb 01 09:58:21 np0005604215.localdomain podman[312827]: 2026-02-01 09:58:21.073985641 +0000 UTC m=+0.080710931 container kill fc39355a2956ee74a90ca1008c91234f5246b5b112f8b8dca778b8fd8f1bfee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f28b580-fb6e-4167-81da-20b98b3e9051, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 09:58:21 np0005604215.localdomain systemd[1]: libpod-fc39355a2956ee74a90ca1008c91234f5246b5b112f8b8dca778b8fd8f1bfee1.scope: Deactivated successfully.
Feb 01 09:58:21 np0005604215.localdomain podman[312841]: 2026-02-01 09:58:21.137325424 +0000 UTC m=+0.053529850 container died fc39355a2956ee74a90ca1008c91234f5246b5b112f8b8dca778b8fd8f1bfee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f28b580-fb6e-4167-81da-20b98b3e9051, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:58:21 np0005604215.localdomain podman[312841]: 2026-02-01 09:58:21.168403597 +0000 UTC m=+0.084608023 container cleanup fc39355a2956ee74a90ca1008c91234f5246b5b112f8b8dca778b8fd8f1bfee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f28b580-fb6e-4167-81da-20b98b3e9051, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:58:21 np0005604215.localdomain systemd[1]: libpod-conmon-fc39355a2956ee74a90ca1008c91234f5246b5b112f8b8dca778b8fd8f1bfee1.scope: Deactivated successfully.
Feb 01 09:58:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:21.212 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:21 np0005604215.localdomain podman[312846]: 2026-02-01 09:58:21.216317521 +0000 UTC m=+0.122388534 container remove fc39355a2956ee74a90ca1008c91234f5246b5b112f8b8dca778b8fd8f1bfee1 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-8f28b580-fb6e-4167-81da-20b98b3e9051, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:58:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:21.226 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:21 np0005604215.localdomain kernel: device tap9a001166-81 left promiscuous mode
Feb 01 09:58:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:21.237 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:21 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:21.252 259225 INFO neutron.agent.dhcp.agent [None req-70d9c5bb-c8f3-42c0-a492-aa912eb246c2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:21 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:21.252 259225 INFO neutron.agent.dhcp.agent [None req-70d9c5bb-c8f3-42c0-a492-aa912eb246c2 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:21 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:21.277 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:58:21
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['manila_metadata', '.mgr', 'manila_data', 'backups', 'vms', 'volumes', 'images']
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 09:58:21 np0005604215.localdomain ceph-mon[298604]: pgmap v364: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 36 KiB/s rd, 28 KiB/s wr, 52 op/s
Feb 01 09:58:21 np0005604215.localdomain ceph-mon[298604]: osdmap e183: 6 total, 6 up, 6 in
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.006580482708682301 of space, bias 1.0, pg target 1.31609654173646 quantized to 32 (current 32)
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 1.635783082077052e-06 of space, bias 1.0, pg target 0.0003255208333333333 quantized to 32 (current 32)
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8555772569444443 quantized to 32 (current 32)
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.4071718546435884e-05 quantized to 32 (current 32)
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 7.088393355667225e-06 of space, bias 4.0, pg target 0.0056234587288293315 quantized to 16 (current 16)
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:58:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:58:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-fa36a994553b881feddf38c260e7f34cbb97e21c52634976f025a88a1d4d79cc-merged.mount: Deactivated successfully.
Feb 01 09:58:21 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fc39355a2956ee74a90ca1008c91234f5246b5b112f8b8dca778b8fd8f1bfee1-userdata-shm.mount: Deactivated successfully.
Feb 01 09:58:21 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d8f28b580\x2dfb6e\x2d4167\x2d81da\x2d20b98b3e9051.mount: Deactivated successfully.
Feb 01 09:58:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e184 e184: 6 total, 6 up, 6 in
Feb 01 09:58:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:21.761 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v367: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 31 KiB/s wr, 56 op/s
Feb 01 09:58:22 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:22.361 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:22 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:22.603 2 INFO neutron.agent.securitygroups_rpc [None req-4dc119ae-7dfd-4794-add0-4c162f41d887 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:58:22 np0005604215.localdomain ceph-mon[298604]: osdmap e184: 6 total, 6 up, 6 in
Feb 01 09:58:22 np0005604215.localdomain ceph-mon[298604]: pgmap v367: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 31 KiB/s wr, 56 op/s
Feb 01 09:58:22 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e185 e185: 6 total, 6 up, 6 in
Feb 01 09:58:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:58:23 np0005604215.localdomain ceph-mon[298604]: osdmap e185: 6 total, 6 up, 6 in
Feb 01 09:58:23 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:58:23 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:23.852 259225 INFO neutron.agent.linux.ip_lib [None req-207058d4-8934-477d-9997-5666c14de9b7 - - - - - -] Device tap9d56394b-75 cannot be used as it has no MAC address
Feb 01 09:58:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:23.876 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:23 np0005604215.localdomain kernel: device tap9d56394b-75 entered promiscuous mode
Feb 01 09:58:23 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939903.8859] manager: (tap9d56394b-75): new Generic device (/org/freedesktop/NetworkManager/Devices/40)
Feb 01 09:58:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:23.887 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:23 np0005604215.localdomain systemd-udevd[312912]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:58:23 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:23Z|00209|binding|INFO|Claiming lport 9d56394b-750d-4167-8e00-5138f0e20ab4 for this chassis.
Feb 01 09:58:23 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:23Z|00210|binding|INFO|9d56394b-750d-4167-8e00-5138f0e20ab4: Claiming unknown
Feb 01 09:58:23 np0005604215.localdomain podman[312877]: 2026-02-01 09:58:23.894398171 +0000 UTC m=+0.098322897 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:58:23 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:23.904 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-ddb7490a-2172-4022-90d8-32c9167c3083', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ddb7490a-2172-4022-90d8-32c9167c3083', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fea4c3ac6fd14aee8b0de1bad5f8673a', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08cdf106-e071-4811-8b04-e1f5131f8f49, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=9d56394b-750d-4167-8e00-5138f0e20ab4) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:23 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:23.906 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 9d56394b-750d-4167-8e00-5138f0e20ab4 in datapath ddb7490a-2172-4022-90d8-32c9167c3083 bound to our chassis
Feb 01 09:58:23 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:23.908 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ddb7490a-2172-4022-90d8-32c9167c3083 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:58:23 np0005604215.localdomain podman[312877]: 2026-02-01 09:58:23.909569102 +0000 UTC m=+0.113493848 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:58:23 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:23.910 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[b0a52351-28df-45ca-8dbc-8916dac014a3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:23 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9d56394b-75: No such device
Feb 01 09:58:23 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:23Z|00211|binding|INFO|Setting lport 9d56394b-750d-4167-8e00-5138f0e20ab4 ovn-installed in OVS
Feb 01 09:58:23 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9d56394b-75: No such device
Feb 01 09:58:23 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:23Z|00212|binding|INFO|Setting lport 9d56394b-750d-4167-8e00-5138f0e20ab4 up in Southbound
Feb 01 09:58:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:23.924 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:23 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:58:23 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9d56394b-75: No such device
Feb 01 09:58:23 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9d56394b-75: No such device
Feb 01 09:58:23 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9d56394b-75: No such device
Feb 01 09:58:23 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9d56394b-75: No such device
Feb 01 09:58:23 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9d56394b-75: No such device
Feb 01 09:58:23 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap9d56394b-75: No such device
Feb 01 09:58:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:23.957 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:23.990 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:23 np0005604215.localdomain podman[312876]: 2026-02-01 09:58:23.994040979 +0000 UTC m=+0.198742369 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 01 09:58:24 np0005604215.localdomain podman[312876]: 2026-02-01 09:58:24.058785195 +0000 UTC m=+0.263486585 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 01 09:58:24 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:58:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:58:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:58:24 np0005604215.localdomain podman[312963]: 2026-02-01 09:58:24.207824403 +0000 UTC m=+0.107148251 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.7, vendor=Red Hat, Inc., distribution-scope=public, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, architecture=x86_64, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, release=1769056855, io.openshift.expose-services=)
Feb 01 09:58:24 np0005604215.localdomain podman[312963]: 2026-02-01 09:58:24.248522835 +0000 UTC m=+0.147846663 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, build-date=2026-01-22T05:09:47Z, release=1769056855, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, vcs-type=git, managed_by=edpm_ansible, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 09:58:24 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:58:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v369: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 9.1 KiB/s wr, 44 op/s
Feb 01 09:58:24 np0005604215.localdomain podman[312986]: 2026-02-01 09:58:24.317705848 +0000 UTC m=+0.097223453 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:58:24 np0005604215.localdomain podman[312986]: 2026-02-01 09:58:24.350638698 +0000 UTC m=+0.130156273 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 01 09:58:24 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:58:24 np0005604215.localdomain podman[313040]: 
Feb 01 09:58:24 np0005604215.localdomain podman[313040]: 2026-02-01 09:58:24.749342212 +0000 UTC m=+0.087112810 container create 835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddb7490a-2172-4022-90d8-32c9167c3083, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:58:24 np0005604215.localdomain systemd[1]: tmp-crun.tdg12E.mount: Deactivated successfully.
Feb 01 09:58:24 np0005604215.localdomain systemd[1]: Started libpod-conmon-835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c.scope.
Feb 01 09:58:24 np0005604215.localdomain ceph-mon[298604]: pgmap v369: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 9.1 KiB/s wr, 44 op/s
Feb 01 09:58:24 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e186 e186: 6 total, 6 up, 6 in
Feb 01 09:58:24 np0005604215.localdomain podman[313040]: 2026-02-01 09:58:24.706541816 +0000 UTC m=+0.044312474 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:24 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6e196fdec99f1e2f5b5d327d1b1bde3b88b31de40ec25a82cda6557ff2bf9c71/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:24 np0005604215.localdomain podman[313040]: 2026-02-01 09:58:24.820359143 +0000 UTC m=+0.158129771 container init 835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddb7490a-2172-4022-90d8-32c9167c3083, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:58:24 np0005604215.localdomain podman[313040]: 2026-02-01 09:58:24.830876318 +0000 UTC m=+0.168646916 container start 835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddb7490a-2172-4022-90d8-32c9167c3083, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:58:24 np0005604215.localdomain dnsmasq[313058]: started, version 2.85 cachesize 150
Feb 01 09:58:24 np0005604215.localdomain dnsmasq[313058]: DNS service limited to local subnets
Feb 01 09:58:24 np0005604215.localdomain dnsmasq[313058]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:24 np0005604215.localdomain dnsmasq[313058]: warning: no upstream servers configured
Feb 01 09:58:24 np0005604215.localdomain dnsmasq-dhcp[313058]: DHCPv6, static leases only on 2001:db8::, lease time 1d
Feb 01 09:58:24 np0005604215.localdomain dnsmasq[313058]: read /var/lib/neutron/dhcp/ddb7490a-2172-4022-90d8-32c9167c3083/addn_hosts - 0 addresses
Feb 01 09:58:24 np0005604215.localdomain dnsmasq-dhcp[313058]: read /var/lib/neutron/dhcp/ddb7490a-2172-4022-90d8-32c9167c3083/host
Feb 01 09:58:24 np0005604215.localdomain dnsmasq-dhcp[313058]: read /var/lib/neutron/dhcp/ddb7490a-2172-4022-90d8-32c9167c3083/opts
Feb 01 09:58:24 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:24.892 259225 INFO neutron.agent.dhcp.agent [None req-207058d4-8934-477d-9997-5666c14de9b7 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:23Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00323350d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032335a60>], id=d1696f58-ce55-430b-aa17-07e03a47e863, ip_allocation=immediate, mac_address=fa:16:3e:1a:26:a2, name=tempest-PortsIpV6TestJSON-760066674, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:21Z, description=, dns_domain=, id=ddb7490a-2172-4022-90d8-32c9167c3083, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsIpV6TestJSON-1383902506, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=14606, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2478, status=ACTIVE, subnets=['2f006930-26ca-4f51-9fb8-930c6e961e50'], tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:58:23Z, vlan_transparent=None, network_id=ddb7490a-2172-4022-90d8-32c9167c3083, port_security_enabled=True, project_id=fea4c3ac6fd14aee8b0de1bad5f8673a, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2486, status=DOWN, tags=[], tenant_id=fea4c3ac6fd14aee8b0de1bad5f8673a, updated_at=2026-02-01T09:58:23Z on network ddb7490a-2172-4022-90d8-32c9167c3083
Feb 01 09:58:25 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:25.043 259225 INFO neutron.agent.dhcp.agent [None req-e62f7f83-c786-4ac0-9dcb-0985735b3227 - - - - - -] DHCP configuration for ports {'691e6e5a-5b53-4cda-aa63-ca7824ce5dca'} is completed
Feb 01 09:58:25 np0005604215.localdomain dnsmasq[313058]: read /var/lib/neutron/dhcp/ddb7490a-2172-4022-90d8-32c9167c3083/addn_hosts - 1 addresses
Feb 01 09:58:25 np0005604215.localdomain dnsmasq-dhcp[313058]: read /var/lib/neutron/dhcp/ddb7490a-2172-4022-90d8-32c9167c3083/host
Feb 01 09:58:25 np0005604215.localdomain dnsmasq-dhcp[313058]: read /var/lib/neutron/dhcp/ddb7490a-2172-4022-90d8-32c9167c3083/opts
Feb 01 09:58:25 np0005604215.localdomain podman[313077]: 2026-02-01 09:58:25.085825208 +0000 UTC m=+0.061966491 container kill 835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddb7490a-2172-4022-90d8-32c9167c3083, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:58:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e186 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:25 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:25.265 259225 INFO neutron.agent.dhcp.agent [None req-24c82a0f-4fe7-4088-8621-1cd89bbb5b2a - - - - - -] DHCP configuration for ports {'d1696f58-ce55-430b-aa17-07e03a47e863'} is completed
Feb 01 09:58:25 np0005604215.localdomain dnsmasq[313058]: exiting on receipt of SIGTERM
Feb 01 09:58:25 np0005604215.localdomain podman[313114]: 2026-02-01 09:58:25.461485038 +0000 UTC m=+0.057111081 container kill 835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddb7490a-2172-4022-90d8-32c9167c3083, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 01 09:58:25 np0005604215.localdomain systemd[1]: libpod-835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c.scope: Deactivated successfully.
Feb 01 09:58:25 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:25Z|00213|binding|INFO|Removing iface tap9d56394b-75 ovn-installed in OVS
Feb 01 09:58:25 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:25Z|00214|binding|INFO|Removing lport 9d56394b-750d-4167-8e00-5138f0e20ab4 ovn-installed in OVS
Feb 01 09:58:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:25.482 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:25 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:25.483 158655 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 34732b3b-eb6e-4f20-9a97-5a558283bc2d with type ""
Feb 01 09:58:25 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:25.484 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::2/64', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-ddb7490a-2172-4022-90d8-32c9167c3083', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-ddb7490a-2172-4022-90d8-32c9167c3083', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fea4c3ac6fd14aee8b0de1bad5f8673a', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=08cdf106-e071-4811-8b04-e1f5131f8f49, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=9d56394b-750d-4167-8e00-5138f0e20ab4) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:25 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:25.485 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 9d56394b-750d-4167-8e00-5138f0e20ab4 in datapath ddb7490a-2172-4022-90d8-32c9167c3083 unbound from our chassis
Feb 01 09:58:25 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:25.486 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network ddb7490a-2172-4022-90d8-32c9167c3083 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:58:25 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:25.487 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ba3cf813-d8df-408d-971c-64c1f0c8ed2a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:25.489 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:25 np0005604215.localdomain podman[313127]: 2026-02-01 09:58:25.531857979 +0000 UTC m=+0.058993809 container died 835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddb7490a-2172-4022-90d8-32c9167c3083, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 01 09:58:25 np0005604215.localdomain podman[313127]: 2026-02-01 09:58:25.559910448 +0000 UTC m=+0.087046268 container cleanup 835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddb7490a-2172-4022-90d8-32c9167c3083, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:58:25 np0005604215.localdomain systemd[1]: libpod-conmon-835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c.scope: Deactivated successfully.
Feb 01 09:58:25 np0005604215.localdomain podman[313129]: 2026-02-01 09:58:25.613187719 +0000 UTC m=+0.135068347 container remove 835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-ddb7490a-2172-4022-90d8-32c9167c3083, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:58:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:25.625 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:25 np0005604215.localdomain kernel: device tap9d56394b-75 left promiscuous mode
Feb 01 09:58:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:25.638 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:25 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:25.656 259225 INFO neutron.agent.dhcp.agent [None req-4fcb9f02-55d7-41dd-bc48-a2ed016ae948 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:25 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:25.657 259225 INFO neutron.agent.dhcp.agent [None req-4fcb9f02-55d7-41dd-bc48-a2ed016ae948 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:25 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:25.708 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:25 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-6e196fdec99f1e2f5b5d327d1b1bde3b88b31de40ec25a82cda6557ff2bf9c71-merged.mount: Deactivated successfully.
Feb 01 09:58:25 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-835b8ad501100b60648f14b459b7768157807b06b432edf5c549cfac1776687c-userdata-shm.mount: Deactivated successfully.
Feb 01 09:58:25 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2dddb7490a\x2d2172\x2d4022\x2d90d8\x2d32c9167c3083.mount: Deactivated successfully.
Feb 01 09:58:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e187 e187: 6 total, 6 up, 6 in
Feb 01 09:58:25 np0005604215.localdomain ceph-mon[298604]: osdmap e186: 6 total, 6 up, 6 in
Feb 01 09:58:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:26.108 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:26.215 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v372: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 12 KiB/s wr, 57 op/s
Feb 01 09:58:26 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:26.404 2 INFO neutron.agent.securitygroups_rpc [None req-b01da263-4ebd-4a0d-81ec-4ece56b6a941 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:26 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e188 e188: 6 total, 6 up, 6 in
Feb 01 09:58:26 np0005604215.localdomain ceph-mon[298604]: osdmap e187: 6 total, 6 up, 6 in
Feb 01 09:58:26 np0005604215.localdomain ceph-mon[298604]: pgmap v372: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 12 KiB/s wr, 57 op/s
Feb 01 09:58:26 np0005604215.localdomain ceph-mon[298604]: osdmap e188: 6 total, 6 up, 6 in
Feb 01 09:58:27 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:27.361 2 INFO neutron.agent.securitygroups_rpc [None req-6b687509-4ccb-4206-8762-0407780a338c d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:27.395 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:27 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:27.459 259225 INFO neutron.agent.linux.ip_lib [None req-3fa6789c-9961-4c6f-b8eb-8c15dedeee1f - - - - - -] Device tapbfd160dd-fa cannot be used as it has no MAC address
Feb 01 09:58:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:27.481 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:27 np0005604215.localdomain kernel: device tapbfd160dd-fa entered promiscuous mode
Feb 01 09:58:27 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939907.4910] manager: (tapbfd160dd-fa): new Generic device (/org/freedesktop/NetworkManager/Devices/41)
Feb 01 09:58:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:27.491 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:27 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:27Z|00215|binding|INFO|Claiming lport bfd160dd-fa98-4ea9-815c-a97263cc82ea for this chassis.
Feb 01 09:58:27 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:27Z|00216|binding|INFO|bfd160dd-fa98-4ea9-815c-a97263cc82ea: Claiming unknown
Feb 01 09:58:27 np0005604215.localdomain systemd-udevd[313168]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:58:27 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:27.505 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eae16595-a7d2-468e-9eb0-cb266a7101cb, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=bfd160dd-fa98-4ea9-815c-a97263cc82ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:27 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:27.508 158655 INFO neutron.agent.ovn.metadata.agent [-] Port bfd160dd-fa98-4ea9-815c-a97263cc82ea in datapath 42a0a17b-be28-4b0f-b80f-055ba2c3d245 bound to our chassis
Feb 01 09:58:27 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:27.512 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 42a0a17b-be28-4b0f-b80f-055ba2c3d245 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:58:27 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:27.513 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[55ca846c-1be3-4baf-9bf2-63f0b3684971]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:27.537 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:27 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:27Z|00217|binding|INFO|Setting lport bfd160dd-fa98-4ea9-815c-a97263cc82ea ovn-installed in OVS
Feb 01 09:58:27 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:27Z|00218|binding|INFO|Setting lport bfd160dd-fa98-4ea9-815c-a97263cc82ea up in Southbound
Feb 01 09:58:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:27.545 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:27.574 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:27.601 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:27 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:27.858 2 INFO neutron.agent.securitygroups_rpc [None req-8578e386-0bfd-4f36-a9ea-ecec11376e1e d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v374: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 6.3 KiB/s wr, 71 op/s
Feb 01 09:58:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e189 e189: 6 total, 6 up, 6 in
Feb 01 09:58:28 np0005604215.localdomain podman[313223]: 
Feb 01 09:58:28 np0005604215.localdomain podman[313223]: 2026-02-01 09:58:28.463180875 +0000 UTC m=+0.100549627 container create e831fbd361ac11f8dfda1a2661c1ad8d5a36f458a1996a9a622e64e3526fa103 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 01 09:58:28 np0005604215.localdomain systemd[1]: Started libpod-conmon-e831fbd361ac11f8dfda1a2661c1ad8d5a36f458a1996a9a622e64e3526fa103.scope.
Feb 01 09:58:28 np0005604215.localdomain podman[313223]: 2026-02-01 09:58:28.410325558 +0000 UTC m=+0.047694339 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:28 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:28 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/785448961ff9a090dea5bf4a331b82550a367213dfd8e1351fa4cefd516f8cf1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:28 np0005604215.localdomain podman[313223]: 2026-02-01 09:58:28.537791568 +0000 UTC m=+0.175160319 container init e831fbd361ac11f8dfda1a2661c1ad8d5a36f458a1996a9a622e64e3526fa103 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 01 09:58:28 np0005604215.localdomain podman[313223]: 2026-02-01 09:58:28.549253993 +0000 UTC m=+0.186622744 container start e831fbd361ac11f8dfda1a2661c1ad8d5a36f458a1996a9a622e64e3526fa103 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:58:28 np0005604215.localdomain dnsmasq[313241]: started, version 2.85 cachesize 150
Feb 01 09:58:28 np0005604215.localdomain dnsmasq[313241]: DNS service limited to local subnets
Feb 01 09:58:28 np0005604215.localdomain dnsmasq[313241]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:28 np0005604215.localdomain dnsmasq[313241]: warning: no upstream servers configured
Feb 01 09:58:28 np0005604215.localdomain dnsmasq-dhcp[313241]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:58:28 np0005604215.localdomain dnsmasq[313241]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 0 addresses
Feb 01 09:58:28 np0005604215.localdomain dnsmasq-dhcp[313241]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:28 np0005604215.localdomain dnsmasq-dhcp[313241]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:28 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:28.755 259225 INFO neutron.agent.dhcp.agent [None req-84d5578d-da31-443a-afd5-02dffc3dd952 - - - - - -] DHCP configuration for ports {'ff54b909-b3b9-4669-8851-459606a86b19', 'f5db53be-fb30-4c27-aabf-1c052ca12256'} is completed
Feb 01 09:58:28 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:28.793 2 INFO neutron.agent.securitygroups_rpc [None req-c10ec1b0-3019-415a-9281-06c26de3609b d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:29 np0005604215.localdomain ceph-mon[298604]: pgmap v374: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 6.3 KiB/s wr, 71 op/s
Feb 01 09:58:29 np0005604215.localdomain ceph-mon[298604]: osdmap e189: 6 total, 6 up, 6 in
Feb 01 09:58:29 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:29.642 2 INFO neutron.agent.securitygroups_rpc [None req-53541160-40c7-461f-a788-c0d63f1c152b d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:58:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:58:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:58:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e, vol_name:cephfs) < ""
Feb 01 09:58:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:58:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160832 "" "Go-http-client/1.1"
Feb 01 09:58:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e/.meta.tmp'
Feb 01 09:58:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e/.meta.tmp' to config b'/volumes/_nogroup/a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e/.meta'
Feb 01 09:58:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e, vol_name:cephfs) < ""
Feb 01 09:58:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:58:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19710 "" "Go-http-client/1.1"
Feb 01 09:58:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e", "format": "json"}]: dispatch
Feb 01 09:58:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e, vol_name:cephfs) < ""
Feb 01 09:58:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e, vol_name:cephfs) < ""
Feb 01 09:58:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e189 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v376: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 6.4 KiB/s wr, 72 op/s
Feb 01 09:58:30 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2430941797' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:58:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:58:30 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2430941797' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:58:30 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:30.847 2 INFO neutron.agent.securitygroups_rpc [None req-7e2c05f6-614a-4970-b06a-02bbc809b5c5 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:31.216 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:31 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:58:31 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e", "format": "json"}]: dispatch
Feb 01 09:58:31 np0005604215.localdomain ceph-mon[298604]: pgmap v376: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 53 KiB/s rd, 6.4 KiB/s wr, 72 op/s
Feb 01 09:58:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:58:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:58:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:58:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:58:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:58:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:58:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e190 e190: 6 total, 6 up, 6 in
Feb 01 09:58:32 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:32.250 2 INFO neutron.agent.securitygroups_rpc [None req-29619353-dbda-49ae-a09a-12a5be7ce5b3 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['179c1cf2-2f2b-4c28-9577-447c415ef292']
Feb 01 09:58:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v378: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 5.8 KiB/s wr, 66 op/s
Feb 01 09:58:32 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:32.313 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:30Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322a60a0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322a6040>], id=06e8676e-e136-4f95-a1cd-29d3bab497ba, ip_allocation=immediate, mac_address=fa:16:3e:91:24:70, name=tempest-PortsTestJSON-495577314, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:57:25Z, description=, dns_domain=, id=42a0a17b-be28-4b0f-b80f-055ba2c3d245, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-1335101638, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39008, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2183, status=ACTIVE, subnets=['50402cd1-8e08-4101-9563-d54c0a29610f'], tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:26Z, vlan_transparent=None, network_id=42a0a17b-be28-4b0f-b80f-055ba2c3d245, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['179c1cf2-2f2b-4c28-9577-447c415ef292'], standard_attr_id=2516, status=DOWN, tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:31Z on network 42a0a17b-be28-4b0f-b80f-055ba2c3d245
Feb 01 09:58:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:32.441 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:32 np0005604215.localdomain dnsmasq[313241]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 1 addresses
Feb 01 09:58:32 np0005604215.localdomain podman[313259]: 2026-02-01 09:58:32.624412142 +0000 UTC m=+0.059899037 container kill e831fbd361ac11f8dfda1a2661c1ad8d5a36f458a1996a9a622e64e3526fa103 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 09:58:32 np0005604215.localdomain dnsmasq-dhcp[313241]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:32 np0005604215.localdomain dnsmasq-dhcp[313241]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:32 np0005604215.localdomain ceph-mon[298604]: osdmap e190: 6 total, 6 up, 6 in
Feb 01 09:58:32 np0005604215.localdomain ceph-mon[298604]: pgmap v378: 177 pgs: 177 active+clean; 225 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 5.8 KiB/s wr, 66 op/s
Feb 01 09:58:32 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:32.964 259225 INFO neutron.agent.dhcp.agent [None req-2ba2f90c-3129-4257-91e0-1db79c5c1b4c - - - - - -] DHCP configuration for ports {'06e8676e-e136-4f95-a1cd-29d3bab497ba'} is completed
Feb 01 09:58:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 01 09:58:33 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1904111334' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:58:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:58:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:33 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1904111334' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:58:33 np0005604215.localdomain podman[313294]: 2026-02-01 09:58:33.781277137 +0000 UTC m=+0.065576772 container kill e831fbd361ac11f8dfda1a2661c1ad8d5a36f458a1996a9a622e64e3526fa103 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:58:33 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:58:33 np0005604215.localdomain dnsmasq[313241]: exiting on receipt of SIGTERM
Feb 01 09:58:33 np0005604215.localdomain systemd[1]: libpod-e831fbd361ac11f8dfda1a2661c1ad8d5a36f458a1996a9a622e64e3526fa103.scope: Deactivated successfully.
Feb 01 09:58:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/48a610c1-90ab-40fc-9c1e-287cd3c7e703/.meta.tmp'
Feb 01 09:58:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/48a610c1-90ab-40fc-9c1e-287cd3c7e703/.meta.tmp' to config b'/volumes/_nogroup/48a610c1-90ab-40fc-9c1e-287cd3c7e703/.meta'
Feb 01 09:58:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "format": "json"}]: dispatch
Feb 01 09:58:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:33 np0005604215.localdomain podman[313313]: 2026-02-01 09:58:33.878441228 +0000 UTC m=+0.067831692 container died e831fbd361ac11f8dfda1a2661c1ad8d5a36f458a1996a9a622e64e3526fa103 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:58:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-785448961ff9a090dea5bf4a331b82550a367213dfd8e1351fa4cefd516f8cf1-merged.mount: Deactivated successfully.
Feb 01 09:58:33 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e831fbd361ac11f8dfda1a2661c1ad8d5a36f458a1996a9a622e64e3526fa103-userdata-shm.mount: Deactivated successfully.
Feb 01 09:58:33 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:33.928 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:33.928 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:33 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:33.931 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:58:33 np0005604215.localdomain podman[313313]: 2026-02-01 09:58:33.979931963 +0000 UTC m=+0.169322377 container remove e831fbd361ac11f8dfda1a2661c1ad8d5a36f458a1996a9a622e64e3526fa103 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 01 09:58:33 np0005604215.localdomain systemd[1]: libpod-conmon-e831fbd361ac11f8dfda1a2661c1ad8d5a36f458a1996a9a622e64e3526fa103.scope: Deactivated successfully.
Feb 01 09:58:33 np0005604215.localdomain podman[313307]: 2026-02-01 09:58:33.993416461 +0000 UTC m=+0.197217191 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute)
Feb 01 09:58:34 np0005604215.localdomain podman[313307]: 2026-02-01 09:58:34.031560273 +0000 UTC m=+0.235361003 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 01 09:58:34 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:58:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v379: 177 pgs: 177 active+clean; 225 MiB data, 1013 MiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 8.5 KiB/s wr, 89 op/s
Feb 01 09:58:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:58:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2894475016' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:58:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:58:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2894475016' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:58:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:34.778 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.18 10.100.0.3'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eae16595-a7d2-468e-9eb0-cb266a7101cb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ff54b909-b3b9-4669-8851-459606a86b19) old=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:34.781 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ff54b909-b3b9-4669-8851-459606a86b19 in datapath 42a0a17b-be28-4b0f-b80f-055ba2c3d245 updated
Feb 01 09:58:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:34.785 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port 428ddfe1-cc5b-46ff-b105-7d44dab1a252 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:58:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:34.785 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42a0a17b-be28-4b0f-b80f-055ba2c3d245, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:58:34 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:34.786 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[447eab43-bbf6-4582-902e-d11c322e86bf]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:58:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "format": "json"}]: dispatch
Feb 01 09:58:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:58:34 np0005604215.localdomain ceph-mon[298604]: pgmap v379: 177 pgs: 177 active+clean; 225 MiB data, 1013 MiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 8.5 KiB/s wr, 89 op/s
Feb 01 09:58:34 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2894475016' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:58:34 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2894475016' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:58:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 01 09:58:35 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/741059174' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:58:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e190 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:35 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:35.409 2 INFO neutron.agent.securitygroups_rpc [None req-dcb1a11b-934a-4ada-b696-4f65471e82a7 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e191 e191: 6 total, 6 up, 6 in
Feb 01 09:58:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/741059174' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:58:35 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:35.866 2 INFO neutron.agent.securitygroups_rpc [None req-76985e45-05b5-4e3d-9a6f-ec3725c3f7a9 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['179c1cf2-2f2b-4c28-9577-447c415ef292', 'c3da8dd5-026e-4efc-b2ae-7f08a7679dbe']
Feb 01 09:58:35 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:35.874 2 INFO neutron.agent.securitygroups_rpc [None req-f5224dd1-77ce-42ca-881a-3d11d7bd93a9 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:36.219 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:36 np0005604215.localdomain podman[313398]: 
Feb 01 09:58:36 np0005604215.localdomain podman[313398]: 2026-02-01 09:58:36.234138348 +0000 UTC m=+0.117751500 container create 2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127)
Feb 01 09:58:36 np0005604215.localdomain systemd[1]: Started libpod-conmon-2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565.scope.
Feb 01 09:58:36 np0005604215.localdomain podman[313398]: 2026-02-01 09:58:36.18798453 +0000 UTC m=+0.071597682 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:36 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:36 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/37e3dd6208f5f8f9cad46217d918b863e3df3d372eef86fd21e9c103ee3ff3eb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v381: 177 pgs: 177 active+clean; 225 MiB data, 1013 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 3.6 KiB/s wr, 35 op/s
Feb 01 09:58:36 np0005604215.localdomain podman[313398]: 2026-02-01 09:58:36.311223221 +0000 UTC m=+0.194836373 container init 2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:58:36 np0005604215.localdomain podman[313398]: 2026-02-01 09:58:36.320908873 +0000 UTC m=+0.204522025 container start 2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:58:36 np0005604215.localdomain dnsmasq[313416]: started, version 2.85 cachesize 150
Feb 01 09:58:36 np0005604215.localdomain dnsmasq[313416]: DNS service limited to local subnets
Feb 01 09:58:36 np0005604215.localdomain dnsmasq[313416]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:36 np0005604215.localdomain dnsmasq[313416]: warning: no upstream servers configured
Feb 01 09:58:36 np0005604215.localdomain dnsmasq-dhcp[313416]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:58:36 np0005604215.localdomain dnsmasq-dhcp[313416]: DHCP, static leases only on 10.100.0.16, lease time 1d
Feb 01 09:58:36 np0005604215.localdomain dnsmasq[313416]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 1 addresses
Feb 01 09:58:36 np0005604215.localdomain dnsmasq-dhcp[313416]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:36 np0005604215.localdomain dnsmasq-dhcp[313416]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:36 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:36.384 259225 INFO neutron.agent.dhcp.agent [None req-2ebfb342-f83f-470f-b395-6ea245cc524a - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:30Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032310f10>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032310be0>], id=06e8676e-e136-4f95-a1cd-29d3bab497ba, ip_allocation=immediate, mac_address=fa:16:3e:91:24:70, name=tempest-PortsTestJSON-1566818112, network_id=42a0a17b-be28-4b0f-b80f-055ba2c3d245, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['c3da8dd5-026e-4efc-b2ae-7f08a7679dbe'], standard_attr_id=2516, status=DOWN, tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:35Z on network 42a0a17b-be28-4b0f-b80f-055ba2c3d245
Feb 01 09:58:36 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:36.387 259225 INFO oslo.privsep.daemon [None req-2ebfb342-f83f-470f-b395-6ea245cc524a - - - - - -] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.dhcp_release_cmd', '--privsep_sock_path', '/tmp/tmpzfw4hr0f/privsep.sock']
Feb 01 09:58:36 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:36.703 259225 INFO neutron.agent.dhcp.agent [None req-3d7f0e2f-d2b7-403f-a8da-cc41e0e59b4b - - - - - -] DHCP configuration for ports {'f5db53be-fb30-4c27-aabf-1c052ca12256', 'ff54b909-b3b9-4669-8851-459606a86b19', '06e8676e-e136-4f95-a1cd-29d3bab497ba', 'bfd160dd-fa98-4ea9-815c-a97263cc82ea'} is completed
Feb 01 09:58:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e192 e192: 6 total, 6 up, 6 in
Feb 01 09:58:36 np0005604215.localdomain ceph-mon[298604]: osdmap e191: 6 total, 6 up, 6 in
Feb 01 09:58:36 np0005604215.localdomain ceph-mon[298604]: pgmap v381: 177 pgs: 177 active+clean; 225 MiB data, 1013 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 3.6 KiB/s wr, 35 op/s
Feb 01 09:58:36 np0005604215.localdomain ceph-mon[298604]: osdmap e192: 6 total, 6 up, 6 in
Feb 01 09:58:36 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:36.940 2 INFO neutron.agent.securitygroups_rpc [None req-1e2fb4b2-ba87-4ef5-8fe2-a6ebe885c08a 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['c3da8dd5-026e-4efc-b2ae-7f08a7679dbe']
Feb 01 09:58:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:58:37 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:37.058 259225 INFO oslo.privsep.daemon [None req-2ebfb342-f83f-470f-b395-6ea245cc524a - - - - - -] Spawned new privsep daemon via rootwrap
Feb 01 09:58:37 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:36.937 313421 INFO oslo.privsep.daemon [-] privsep daemon starting
Feb 01 09:58:37 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:36.942 313421 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0
Feb 01 09:58:37 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:36.945 313421 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none
Feb 01 09:58:37 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:36.945 313421 INFO oslo.privsep.daemon [-] privsep daemon running as pid 313421
Feb 01 09:58:37 np0005604215.localdomain podman[313422]: 2026-02-01 09:58:37.122564036 +0000 UTC m=+0.085075322 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:58:37 np0005604215.localdomain podman[313422]: 2026-02-01 09:58:37.156017069 +0000 UTC m=+0.118528315 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 09:58:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "snap_name": "c067672e-1db0-4df2-a2d8-4c39c3271393", "format": "json"}]: dispatch
Feb 01 09:58:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:c067672e-1db0-4df2-a2d8-4c39c3271393, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:37 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:58:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:c067672e-1db0-4df2-a2d8-4c39c3271393, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:37 np0005604215.localdomain dnsmasq-dhcp[313416]: DHCPRELEASE(tapbfd160dd-fa) 10.100.0.8 fa:16:3e:91:24:70
Feb 01 09:58:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:37.473 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:37 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:37.595 2 INFO neutron.agent.securitygroups_rpc [None req-9cffbc9d-d1cb-4c4b-934e-47effaf28296 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:37 np0005604215.localdomain dnsmasq[313416]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 1 addresses
Feb 01 09:58:37 np0005604215.localdomain dnsmasq-dhcp[313416]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:37 np0005604215.localdomain dnsmasq-dhcp[313416]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:37 np0005604215.localdomain podman[313465]: 2026-02-01 09:58:37.809099573 +0000 UTC m=+0.061332573 container kill 2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:58:37 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "snap_name": "c067672e-1db0-4df2-a2d8-4c39c3271393", "format": "json"}]: dispatch
Feb 01 09:58:37 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:37.933 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 09:58:38 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:38.026 259225 INFO neutron.agent.dhcp.agent [None req-7dd87d10-4d78-49fa-8ace-7f351efa19b1 - - - - - -] DHCP configuration for ports {'06e8676e-e136-4f95-a1cd-29d3bab497ba'} is completed
Feb 01 09:58:38 np0005604215.localdomain dnsmasq[313416]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 0 addresses
Feb 01 09:58:38 np0005604215.localdomain dnsmasq-dhcp[313416]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:38 np0005604215.localdomain dnsmasq-dhcp[313416]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:38 np0005604215.localdomain podman[313502]: 2026-02-01 09:58:38.186478864 +0000 UTC m=+0.056003536 container kill 2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:58:38 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:38.217 2 INFO neutron.agent.securitygroups_rpc [None req-901cd980-7a05-470c-b831-e021c0b0d3ee d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v383: 177 pgs: 177 active+clean; 225 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 15 KiB/s wr, 75 op/s
Feb 01 09:58:38 np0005604215.localdomain ceph-mon[298604]: pgmap v383: 177 pgs: 177 active+clean; 225 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 15 KiB/s wr, 75 op/s
Feb 01 09:58:38 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2113394554' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:58:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e193 e193: 6 total, 6 up, 6 in
Feb 01 09:58:38 np0005604215.localdomain sudo[313523]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:58:38 np0005604215.localdomain sudo[313523]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:58:38 np0005604215.localdomain sudo[313523]: pam_unix(sudo:session): session closed for user root
Feb 01 09:58:39 np0005604215.localdomain sudo[313551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:58:39 np0005604215.localdomain sudo[313551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:58:39 np0005604215.localdomain dnsmasq[313416]: exiting on receipt of SIGTERM
Feb 01 09:58:39 np0005604215.localdomain systemd[1]: libpod-2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565.scope: Deactivated successfully.
Feb 01 09:58:39 np0005604215.localdomain podman[313575]: 2026-02-01 09:58:39.150354123 +0000 UTC m=+0.081261103 container kill 2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true)
Feb 01 09:58:39 np0005604215.localdomain podman[313590]: 2026-02-01 09:58:39.216189835 +0000 UTC m=+0.054393486 container died 2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 01 09:58:39 np0005604215.localdomain podman[313590]: 2026-02-01 09:58:39.299313486 +0000 UTC m=+0.137517137 container cleanup 2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:58:39 np0005604215.localdomain systemd[1]: libpod-conmon-2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565.scope: Deactivated successfully.
Feb 01 09:58:39 np0005604215.localdomain podman[313597]: 2026-02-01 09:58:39.327661499 +0000 UTC m=+0.150244313 container remove 2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 01 09:58:39 np0005604215.localdomain sudo[313551]: pam_unix(sudo:session): session closed for user root
Feb 01 09:58:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:58:39 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:58:39 np0005604215.localdomain ceph-mon[298604]: osdmap e193: 6 total, 6 up, 6 in
Feb 01 09:58:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:58:39 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:58:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e194 e194: 6 total, 6 up, 6 in
Feb 01 09:58:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:58:39 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 8462a3c3-cad9-48f1-8d30-4b069df0223a (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:58:39 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 8462a3c3-cad9-48f1-8d30-4b069df0223a (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:58:39 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 8462a3c3-cad9-48f1-8d30-4b069df0223a (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:58:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:58:39 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:58:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:40.020 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.18 10.100.0.2'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eae16595-a7d2-468e-9eb0-cb266a7101cb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ff54b909-b3b9-4669-8851-459606a86b19) old=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.18 10.100.0.3'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.3/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:40.024 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ff54b909-b3b9-4669-8851-459606a86b19 in datapath 42a0a17b-be28-4b0f-b80f-055ba2c3d245 updated
Feb 01 09:58:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:40.027 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port 428ddfe1-cc5b-46ff-b105-7d44dab1a252 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:58:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:40.027 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42a0a17b-be28-4b0f-b80f-055ba2c3d245, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:58:40 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:40.029 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[9a4cd69c-e7e5-4874-8091-e54252e324ee]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e194 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-37e3dd6208f5f8f9cad46217d918b863e3df3d372eef86fd21e9c103ee3ff3eb-merged.mount: Deactivated successfully.
Feb 01 09:58:40 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2ad4b9b0d0bcfb570f406f533c36cc22a71695f544a8daf1ff663d07e93fe565-userdata-shm.mount: Deactivated successfully.
Feb 01 09:58:40 np0005604215.localdomain sudo[313682]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:58:40 np0005604215.localdomain sudo[313682]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:58:40 np0005604215.localdomain sudo[313682]: pam_unix(sudo:session): session closed for user root
Feb 01 09:58:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v386: 177 pgs: 177 active+clean; 225 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 16 KiB/s wr, 48 op/s
Feb 01 09:58:40 np0005604215.localdomain podman[313716]: 
Feb 01 09:58:40 np0005604215.localdomain podman[313716]: 2026-02-01 09:58:40.349395562 +0000 UTC m=+0.095041223 container create c5a4095967a1c724908b8e5409271cf4420c247bf97d1d8a345d1279dd9805fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 01 09:58:40 np0005604215.localdomain systemd[1]: Started libpod-conmon-c5a4095967a1c724908b8e5409271cf4420c247bf97d1d8a345d1279dd9805fc.scope.
Feb 01 09:58:40 np0005604215.localdomain podman[313716]: 2026-02-01 09:58:40.304335088 +0000 UTC m=+0.049980779 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:40 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:40 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/925dea6d8969dbc30afe3d38ec9b67cf20165bdd875a10c16a2ec9974bdd03b4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:40 np0005604215.localdomain podman[313716]: 2026-02-01 09:58:40.424383149 +0000 UTC m=+0.170028810 container init c5a4095967a1c724908b8e5409271cf4420c247bf97d1d8a345d1279dd9805fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Feb 01 09:58:40 np0005604215.localdomain podman[313716]: 2026-02-01 09:58:40.434566636 +0000 UTC m=+0.180212297 container start c5a4095967a1c724908b8e5409271cf4420c247bf97d1d8a345d1279dd9805fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Feb 01 09:58:40 np0005604215.localdomain dnsmasq[313734]: started, version 2.85 cachesize 150
Feb 01 09:58:40 np0005604215.localdomain dnsmasq[313734]: DNS service limited to local subnets
Feb 01 09:58:40 np0005604215.localdomain dnsmasq[313734]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:40 np0005604215.localdomain dnsmasq[313734]: warning: no upstream servers configured
Feb 01 09:58:40 np0005604215.localdomain dnsmasq-dhcp[313734]: DHCP, static leases only on 10.100.0.16, lease time 1d
Feb 01 09:58:40 np0005604215.localdomain dnsmasq[313734]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 0 addresses
Feb 01 09:58:40 np0005604215.localdomain dnsmasq-dhcp[313734]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:40 np0005604215.localdomain dnsmasq-dhcp[313734]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:40 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:40.741 259225 INFO neutron.agent.dhcp.agent [None req-998dab09-fea8-4fa3-8c43-7d40e440673a - - - - - -] DHCP configuration for ports {'bfd160dd-fa98-4ea9-815c-a97263cc82ea', 'ff54b909-b3b9-4669-8851-459606a86b19', 'f5db53be-fb30-4c27-aabf-1c052ca12256'} is completed
Feb 01 09:58:40 np0005604215.localdomain dnsmasq[313734]: exiting on receipt of SIGTERM
Feb 01 09:58:40 np0005604215.localdomain podman[313752]: 2026-02-01 09:58:40.808525512 +0000 UTC m=+0.060928521 container kill c5a4095967a1c724908b8e5409271cf4420c247bf97d1d8a345d1279dd9805fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 01 09:58:40 np0005604215.localdomain systemd[1]: libpod-c5a4095967a1c724908b8e5409271cf4420c247bf97d1d8a345d1279dd9805fc.scope: Deactivated successfully.
Feb 01 09:58:40 np0005604215.localdomain podman[313765]: 2026-02-01 09:58:40.881437104 +0000 UTC m=+0.054879082 container died c5a4095967a1c724908b8e5409271cf4420c247bf97d1d8a345d1279dd9805fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:58:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:58:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:58:40 np0005604215.localdomain ceph-mon[298604]: osdmap e194: 6 total, 6 up, 6 in
Feb 01 09:58:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:58:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:58:40 np0005604215.localdomain ceph-mon[298604]: pgmap v386: 177 pgs: 177 active+clean; 225 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 16 KiB/s wr, 48 op/s
Feb 01 09:58:40 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1489678481' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:58:40 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1489678481' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:58:40 np0005604215.localdomain podman[313765]: 2026-02-01 09:58:40.91276736 +0000 UTC m=+0.086209298 container cleanup c5a4095967a1c724908b8e5409271cf4420c247bf97d1d8a345d1279dd9805fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:58:40 np0005604215.localdomain systemd[1]: libpod-conmon-c5a4095967a1c724908b8e5409271cf4420c247bf97d1d8a345d1279dd9805fc.scope: Deactivated successfully.
Feb 01 09:58:40 np0005604215.localdomain podman[313766]: 2026-02-01 09:58:40.961152758 +0000 UTC m=+0.128870208 container remove c5a4095967a1c724908b8e5409271cf4420c247bf97d1d8a345d1279dd9805fc (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 01 09:58:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-925dea6d8969dbc30afe3d38ec9b67cf20165bdd875a10c16a2ec9974bdd03b4-merged.mount: Deactivated successfully.
Feb 01 09:58:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c5a4095967a1c724908b8e5409271cf4420c247bf97d1d8a345d1279dd9805fc-userdata-shm.mount: Deactivated successfully.
Feb 01 09:58:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:41.221 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:41 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:41.275 2 INFO neutron.agent.securitygroups_rpc [None req-74812456-6c53-4459-bad0-4aac2b18e077 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:41 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:58:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:58:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:41.775 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:58:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:41.776 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:58:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:41.776 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:58:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e195 e195: 6 total, 6 up, 6 in
Feb 01 09:58:42 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:42.032 2 INFO neutron.agent.securitygroups_rpc [None req-c40bbf53-5b24-4f24-a7f5-e7d11bb1e253 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['2fbf2c39-17e9-4e72-bbbb-e5125197536a']
Feb 01 09:58:42 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:42.269 2 INFO neutron.agent.securitygroups_rpc [None req-5822cc8d-7dc4-44de-b45f-4af6413189cc d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v388: 177 pgs: 177 active+clean; 225 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 13 KiB/s wr, 39 op/s
Feb 01 09:58:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:42.514 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:58:42 np0005604215.localdomain ceph-mon[298604]: osdmap e195: 6 total, 6 up, 6 in
Feb 01 09:58:42 np0005604215.localdomain ceph-mon[298604]: pgmap v388: 177 pgs: 177 active+clean; 225 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 27 KiB/s rd, 13 KiB/s wr, 39 op/s
Feb 01 09:58:43 np0005604215.localdomain podman[313845]: 
Feb 01 09:58:43 np0005604215.localdomain podman[313845]: 2026-02-01 09:58:43.010684832 +0000 UTC m=+0.091566725 container create c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Feb 01 09:58:43 np0005604215.localdomain systemd[1]: Started libpod-conmon-c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd.scope.
Feb 01 09:58:43 np0005604215.localdomain podman[313845]: 2026-02-01 09:58:42.963752189 +0000 UTC m=+0.044634062 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:43 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:43 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e36507fc607162f25bf21bac759aebb58f629cfa9b69496d7bbd7d8684964b06/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:43 np0005604215.localdomain podman[313845]: 2026-02-01 09:58:43.079239458 +0000 UTC m=+0.160121341 container init c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127)
Feb 01 09:58:43 np0005604215.localdomain podman[313845]: 2026-02-01 09:58:43.088640532 +0000 UTC m=+0.169522445 container start c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0)
Feb 01 09:58:43 np0005604215.localdomain dnsmasq[313861]: started, version 2.85 cachesize 150
Feb 01 09:58:43 np0005604215.localdomain dnsmasq[313861]: DNS service limited to local subnets
Feb 01 09:58:43 np0005604215.localdomain dnsmasq[313861]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:43 np0005604215.localdomain dnsmasq[313861]: warning: no upstream servers configured
Feb 01 09:58:43 np0005604215.localdomain dnsmasq-dhcp[313861]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:58:43 np0005604215.localdomain dnsmasq-dhcp[313861]: DHCP, static leases only on 10.100.0.16, lease time 1d
Feb 01 09:58:43 np0005604215.localdomain dnsmasq[313861]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 0 addresses
Feb 01 09:58:43 np0005604215.localdomain dnsmasq-dhcp[313861]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:43 np0005604215.localdomain dnsmasq-dhcp[313861]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:43 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:43.147 259225 INFO neutron.agent.dhcp.agent [None req-6edf3a3b-2b93-4b5c-a396-6c031b435828 - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:41Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032303e50>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032310f10>], id=d96f800e-edb5-41eb-98bd-c4af07c61ec1, ip_allocation=immediate, mac_address=fa:16:3e:05:18:80, name=tempest-PortsTestJSON-440048988, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:57:25Z, description=, dns_domain=, id=42a0a17b-be28-4b0f-b80f-055ba2c3d245, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-PortsTestJSON-test-network-1335101638, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=39008, qos_policy_id=None, revision_number=5, router:external=False, shared=False, standard_attr_id=2183, status=ACTIVE, subnets=['0862afad-3da9-478d-89ae-76118bd953d3', '4a542e30-5baa-4eb1-8b2a-1d247e546755'], tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:39Z, vlan_transparent=None, network_id=42a0a17b-be28-4b0f-b80f-055ba2c3d245, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=['2fbf2c39-17e9-4e72-bbbb-e5125197536a'], standard_attr_id=2588, status=DOWN, tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:41Z on network 42a0a17b-be28-4b0f-b80f-055ba2c3d245
Feb 01 09:58:43 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:43.303 2 INFO neutron.agent.securitygroups_rpc [None req-850b62dc-2adf-4cc9-b2ae-b1b62bc1910e d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:43 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:43.374 259225 INFO neutron.agent.dhcp.agent [None req-a432e735-9a53-4ae2-9db0-e08c38d94ace - - - - - -] DHCP configuration for ports {'f5db53be-fb30-4c27-aabf-1c052ca12256', 'ff54b909-b3b9-4669-8851-459606a86b19', 'bfd160dd-fa98-4ea9-815c-a97263cc82ea'} is completed
Feb 01 09:58:43 np0005604215.localdomain dnsmasq[313861]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 1 addresses
Feb 01 09:58:43 np0005604215.localdomain dnsmasq-dhcp[313861]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:43 np0005604215.localdomain dnsmasq-dhcp[313861]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:43 np0005604215.localdomain podman[313879]: 2026-02-01 09:58:43.391390707 +0000 UTC m=+0.043123274 container kill c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 01 09:58:43 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:43.613 259225 INFO neutron.agent.dhcp.agent [None req-737b90dc-ecb4-47dc-976c-0b070bc84d65 - - - - - -] DHCP configuration for ports {'d96f800e-edb5-41eb-98bd-c4af07c61ec1'} is completed
Feb 01 09:58:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "snap_name": "c067672e-1db0-4df2-a2d8-4c39c3271393_4eb61cba-a4e1-44c0-96fc-ccd97cf15833", "force": true, "format": "json"}]: dispatch
Feb 01 09:58:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:c067672e-1db0-4df2-a2d8-4c39c3271393_4eb61cba-a4e1-44c0-96fc-ccd97cf15833, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/48a610c1-90ab-40fc-9c1e-287cd3c7e703/.meta.tmp'
Feb 01 09:58:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/48a610c1-90ab-40fc-9c1e-287cd3c7e703/.meta.tmp' to config b'/volumes/_nogroup/48a610c1-90ab-40fc-9c1e-287cd3c7e703/.meta'
Feb 01 09:58:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:c067672e-1db0-4df2-a2d8-4c39c3271393_4eb61cba-a4e1-44c0-96fc-ccd97cf15833, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "snap_name": "c067672e-1db0-4df2-a2d8-4c39c3271393", "force": true, "format": "json"}]: dispatch
Feb 01 09:58:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:c067672e-1db0-4df2-a2d8-4c39c3271393, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/48a610c1-90ab-40fc-9c1e-287cd3c7e703/.meta.tmp'
Feb 01 09:58:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/48a610c1-90ab-40fc-9c1e-287cd3c7e703/.meta.tmp' to config b'/volumes/_nogroup/48a610c1-90ab-40fc-9c1e-287cd3c7e703/.meta'
Feb 01 09:58:43 np0005604215.localdomain dnsmasq[313861]: exiting on receipt of SIGTERM
Feb 01 09:58:43 np0005604215.localdomain podman[313915]: 2026-02-01 09:58:43.873336087 +0000 UTC m=+0.074245475 container kill c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 01 09:58:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:c067672e-1db0-4df2-a2d8-4c39c3271393, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:43 np0005604215.localdomain systemd[1]: libpod-c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd.scope: Deactivated successfully.
Feb 01 09:58:43 np0005604215.localdomain podman[313929]: 2026-02-01 09:58:43.939305083 +0000 UTC m=+0.056639176 container died c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Feb 01 09:58:43 np0005604215.localdomain podman[313929]: 2026-02-01 09:58:43.970091992 +0000 UTC m=+0.087426055 container cleanup c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:58:43 np0005604215.localdomain systemd[1]: libpod-conmon-c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd.scope: Deactivated successfully.
Feb 01 09:58:43 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2215809037' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:58:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-e36507fc607162f25bf21bac759aebb58f629cfa9b69496d7bbd7d8684964b06-merged.mount: Deactivated successfully.
Feb 01 09:58:44 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd-userdata-shm.mount: Deactivated successfully.
Feb 01 09:58:44 np0005604215.localdomain podman[313935]: 2026-02-01 09:58:44.03067009 +0000 UTC m=+0.136311449 container remove c4576f575873cc1f76777736374bbf588300bee42d59e599086b9776acbf62dd (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:58:44 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:44.184 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.18 10.100.0.2 10.100.0.34'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28 10.100.0.34/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eae16595-a7d2-468e-9eb0-cb266a7101cb, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=ff54b909-b3b9-4669-8851-459606a86b19) old=Port_Binding(mac=['fa:16:3e:60:f7:71 10.100.0.18 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.18/28 10.100.0.2/28', 'neutron:device_id': 'ovnmeta-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '5', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:44 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:44.188 158655 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port ff54b909-b3b9-4669-8851-459606a86b19 in datapath 42a0a17b-be28-4b0f-b80f-055ba2c3d245 updated
Feb 01 09:58:44 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:44.192 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port 428ddfe1-cc5b-46ff-b105-7d44dab1a252 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:58:44 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:44.192 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42a0a17b-be28-4b0f-b80f-055ba2c3d245, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:58:44 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:44.194 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[84bc535e-a273-4730-9676-db309a4e70b3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v389: 177 pgs: 177 active+clean; 225 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 10 KiB/s wr, 86 op/s
Feb 01 09:58:44 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:44.428 2 INFO neutron.agent.securitygroups_rpc [None req-1da9737c-8fe2-425a-864f-b783c535a95b d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:44 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "snap_name": "c067672e-1db0-4df2-a2d8-4c39c3271393_4eb61cba-a4e1-44c0-96fc-ccd97cf15833", "force": true, "format": "json"}]: dispatch
Feb 01 09:58:44 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "snap_name": "c067672e-1db0-4df2-a2d8-4c39c3271393", "force": true, "format": "json"}]: dispatch
Feb 01 09:58:44 np0005604215.localdomain ceph-mon[298604]: pgmap v389: 177 pgs: 177 active+clean; 225 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 10 KiB/s wr, 86 op/s
Feb 01 09:58:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e196 e196: 6 total, 6 up, 6 in
Feb 01 09:58:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e196 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:45 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:45.536 2 INFO neutron.agent.securitygroups_rpc [None req-16f44daa-a956-4040-8610-4ccc01164dad 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['57bcc40e-59d1-4274-96b5-b777fde58e85', '436843c7-c8bd-4657-a6cb-df8e0dddd33a', '2fbf2c39-17e9-4e72-bbbb-e5125197536a']
Feb 01 09:58:45 np0005604215.localdomain podman[314008]: 
Feb 01 09:58:45 np0005604215.localdomain podman[314008]: 2026-02-01 09:58:45.566594158 +0000 UTC m=+0.088231840 container create 6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true)
Feb 01 09:58:45 np0005604215.localdomain systemd[1]: Started libpod-conmon-6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22.scope.
Feb 01 09:58:45 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:45 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/645b45bd538c7154c5e174dfa275410a59a91bbc0b2a58b48dfb47eec9e95ab1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:45 np0005604215.localdomain podman[314008]: 2026-02-01 09:58:45.524324521 +0000 UTC m=+0.045962243 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:45 np0005604215.localdomain podman[314008]: 2026-02-01 09:58:45.629013294 +0000 UTC m=+0.150650966 container init 6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 01 09:58:45 np0005604215.localdomain podman[314008]: 2026-02-01 09:58:45.648264154 +0000 UTC m=+0.169901826 container start 6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Feb 01 09:58:45 np0005604215.localdomain dnsmasq[314026]: started, version 2.85 cachesize 150
Feb 01 09:58:45 np0005604215.localdomain dnsmasq[314026]: DNS service limited to local subnets
Feb 01 09:58:45 np0005604215.localdomain dnsmasq[314026]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:45 np0005604215.localdomain dnsmasq[314026]: warning: no upstream servers configured
Feb 01 09:58:45 np0005604215.localdomain dnsmasq-dhcp[314026]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:58:45 np0005604215.localdomain dnsmasq-dhcp[314026]: DHCP, static leases only on 10.100.0.16, lease time 1d
Feb 01 09:58:45 np0005604215.localdomain dnsmasq-dhcp[314026]: DHCP, static leases only on 10.100.0.32, lease time 1d
Feb 01 09:58:45 np0005604215.localdomain dnsmasq[314026]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 1 addresses
Feb 01 09:58:45 np0005604215.localdomain dnsmasq-dhcp[314026]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:45 np0005604215.localdomain dnsmasq-dhcp[314026]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:45 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:45.725 259225 INFO neutron.agent.dhcp.agent [None req-a582b95a-0f98-4318-b6de-3aa83d0df205 - - - - - -] Trigger reload_allocations for port admin_state_up=False, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:41Z, description=, device_id=, device_owner=, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032242970>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032242e80>], id=d96f800e-edb5-41eb-98bd-c4af07c61ec1, ip_allocation=immediate, mac_address=fa:16:3e:05:18:80, name=tempest-PortsTestJSON-1307719028, network_id=42a0a17b-be28-4b0f-b80f-055ba2c3d245, port_security_enabled=True, project_id=fc33978cc1c94009a152ec3cacbfe0e5, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=2, security_groups=['436843c7-c8bd-4657-a6cb-df8e0dddd33a', '57bcc40e-59d1-4274-96b5-b777fde58e85'], standard_attr_id=2588, status=DOWN, tags=[], tenant_id=fc33978cc1c94009a152ec3cacbfe0e5, updated_at=2026-02-01T09:58:45Z on network 42a0a17b-be28-4b0f-b80f-055ba2c3d245
Feb 01 09:58:45 np0005604215.localdomain dnsmasq-dhcp[314026]: DHCPRELEASE(tapbfd160dd-fa) 10.100.0.14 fa:16:3e:05:18:80
Feb 01 09:58:45 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:45.930 259225 INFO neutron.agent.dhcp.agent [None req-2d1e2e7a-d2ad-402c-aeae-fa7d4398fffd - - - - - -] DHCP configuration for ports {'d96f800e-edb5-41eb-98bd-c4af07c61ec1', 'ff54b909-b3b9-4669-8851-459606a86b19', 'bfd160dd-fa98-4ea9-815c-a97263cc82ea', 'f5db53be-fb30-4c27-aabf-1c052ca12256'} is completed
Feb 01 09:58:46 np0005604215.localdomain ceph-mon[298604]: osdmap e196: 6 total, 6 up, 6 in
Feb 01 09:58:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e197 e197: 6 total, 6 up, 6 in
Feb 01 09:58:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:46.264 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v392: 177 pgs: 177 active+clean; 225 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 10 KiB/s wr, 86 op/s
Feb 01 09:58:46 np0005604215.localdomain dnsmasq[314026]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 1 addresses
Feb 01 09:58:46 np0005604215.localdomain dnsmasq-dhcp[314026]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:46 np0005604215.localdomain dnsmasq-dhcp[314026]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:46 np0005604215.localdomain podman[314046]: 2026-02-01 09:58:46.377281365 +0000 UTC m=+0.058712831 container kill 6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:58:46 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:46.565 259225 INFO neutron.agent.dhcp.agent [None req-b9eb3ad3-2aff-427a-87fc-b34af5af4f64 - - - - - -] DHCP configuration for ports {'d96f800e-edb5-41eb-98bd-c4af07c61ec1'} is completed
Feb 01 09:58:46 np0005604215.localdomain systemd[1]: tmp-crun.zFj9xY.mount: Deactivated successfully.
Feb 01 09:58:46 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:46.622 2 INFO neutron.agent.securitygroups_rpc [None req-5515d09c-f3a8-4bc8-bfd1-9ff7fafccb05 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['57bcc40e-59d1-4274-96b5-b777fde58e85', '436843c7-c8bd-4657-a6cb-df8e0dddd33a']
Feb 01 09:58:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e198 e198: 6 total, 6 up, 6 in
Feb 01 09:58:46 np0005604215.localdomain dnsmasq[314026]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 0 addresses
Feb 01 09:58:46 np0005604215.localdomain dnsmasq-dhcp[314026]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:46 np0005604215.localdomain podman[314083]: 2026-02-01 09:58:46.949383284 +0000 UTC m=+0.051582528 container kill 6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:58:46 np0005604215.localdomain dnsmasq-dhcp[314026]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:47 np0005604215.localdomain ceph-mon[298604]: osdmap e197: 6 total, 6 up, 6 in
Feb 01 09:58:47 np0005604215.localdomain ceph-mon[298604]: pgmap v392: 177 pgs: 177 active+clean; 225 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 63 KiB/s rd, 10 KiB/s wr, 86 op/s
Feb 01 09:58:47 np0005604215.localdomain ceph-mon[298604]: osdmap e198: 6 total, 6 up, 6 in
Feb 01 09:58:47 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "format": "json"}]: dispatch
Feb 01 09:58:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:58:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:58:47 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:47.175+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '48a610c1-90ab-40fc-9c1e-287cd3c7e703' of type subvolume
Feb 01 09:58:47 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '48a610c1-90ab-40fc-9c1e-287cd3c7e703' of type subvolume
Feb 01 09:58:47 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "force": true, "format": "json"}]: dispatch
Feb 01 09:58:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/48a610c1-90ab-40fc-9c1e-287cd3c7e703'' moved to trashcan
Feb 01 09:58:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 09:58:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:48a610c1-90ab-40fc-9c1e-287cd3c7e703, vol_name:cephfs) < ""
Feb 01 09:58:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:47.517 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e199 e199: 6 total, 6 up, 6 in
Feb 01 09:58:48 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "format": "json"}]: dispatch
Feb 01 09:58:48 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "48a610c1-90ab-40fc-9c1e-287cd3c7e703", "force": true, "format": "json"}]: dispatch
Feb 01 09:58:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v395: 177 pgs: 177 active+clean; 225 MiB data, 1015 MiB used, 41 GiB / 42 GiB avail; 51 KiB/s rd, 20 KiB/s wr, 71 op/s
Feb 01 09:58:48 np0005604215.localdomain dnsmasq[314026]: exiting on receipt of SIGTERM
Feb 01 09:58:48 np0005604215.localdomain podman[314120]: 2026-02-01 09:58:48.425604931 +0000 UTC m=+0.058985509 container kill 6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Feb 01 09:58:48 np0005604215.localdomain systemd[1]: libpod-6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22.scope: Deactivated successfully.
Feb 01 09:58:48 np0005604215.localdomain podman[314132]: 2026-02-01 09:58:48.499944408 +0000 UTC m=+0.058280977 container died 6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Feb 01 09:58:48 np0005604215.localdomain podman[314132]: 2026-02-01 09:58:48.530328225 +0000 UTC m=+0.088664734 container cleanup 6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 01 09:58:48 np0005604215.localdomain systemd[1]: libpod-conmon-6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22.scope: Deactivated successfully.
Feb 01 09:58:48 np0005604215.localdomain podman[314135]: 2026-02-01 09:58:48.570740455 +0000 UTC m=+0.122958943 container remove 6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 01 09:58:48 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:48.656 2 INFO neutron.agent.securitygroups_rpc [None req-ecc18215-6c26-46d4-bb1e-4f9b7e08ab87 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:48 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:48.800 259225 INFO neutron.agent.linux.ip_lib [None req-76446668-2ac3-4c38-a599-bbabe26e3095 - - - - - -] Device tap5b4ba3e8-5f cannot be used as it has no MAC address
Feb 01 09:58:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:48.869 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:48 np0005604215.localdomain kernel: device tap5b4ba3e8-5f entered promiscuous mode
Feb 01 09:58:48 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939928.8779] manager: (tap5b4ba3e8-5f): new Generic device (/org/freedesktop/NetworkManager/Devices/42)
Feb 01 09:58:48 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:48Z|00219|binding|INFO|Claiming lport 5b4ba3e8-5f00-4316-a86d-1c06057933fc for this chassis.
Feb 01 09:58:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:48.878 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:48 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:48Z|00220|binding|INFO|5b4ba3e8-5f00-4316-a86d-1c06057933fc: Claiming unknown
Feb 01 09:58:48 np0005604215.localdomain systemd-udevd[314189]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:58:48 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:48.898 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-efcb439d-5008-49b3-8d74-fc95eb1e0a3c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-efcb439d-5008-49b3-8d74-fc95eb1e0a3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33e912d1-6794-400a-b37c-704b8b53759d, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=5b4ba3e8-5f00-4316-a86d-1c06057933fc) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:48 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:48.905 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 5b4ba3e8-5f00-4316-a86d-1c06057933fc in datapath efcb439d-5008-49b3-8d74-fc95eb1e0a3c bound to our chassis
Feb 01 09:58:48 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:48.909 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port ce3bc52f-aef5-49c7-bb45-7d0cc76cd721 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 09:58:48 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:48.910 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network efcb439d-5008-49b3-8d74-fc95eb1e0a3c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:58:48 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap5b4ba3e8-5f: No such device
Feb 01 09:58:48 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:48.911 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[16d0a980-ac30-4af0-b1db-bca7758d809e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:48 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:48Z|00221|binding|INFO|Setting lport 5b4ba3e8-5f00-4316-a86d-1c06057933fc ovn-installed in OVS
Feb 01 09:58:48 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:48Z|00222|binding|INFO|Setting lport 5b4ba3e8-5f00-4316-a86d-1c06057933fc up in Southbound
Feb 01 09:58:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:48.916 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:48 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap5b4ba3e8-5f: No such device
Feb 01 09:58:48 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap5b4ba3e8-5f: No such device
Feb 01 09:58:48 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap5b4ba3e8-5f: No such device
Feb 01 09:58:48 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap5b4ba3e8-5f: No such device
Feb 01 09:58:48 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap5b4ba3e8-5f: No such device
Feb 01 09:58:48 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap5b4ba3e8-5f: No such device
Feb 01 09:58:48 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap5b4ba3e8-5f: No such device
Feb 01 09:58:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:48.956 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:48.985 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:49 np0005604215.localdomain ceph-mon[298604]: osdmap e199: 6 total, 6 up, 6 in
Feb 01 09:58:49 np0005604215.localdomain ceph-mon[298604]: pgmap v395: 177 pgs: 177 active+clean; 225 MiB data, 1015 MiB used, 41 GiB / 42 GiB avail; 51 KiB/s rd, 20 KiB/s wr, 71 op/s
Feb 01 09:58:49 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e200 e200: 6 total, 6 up, 6 in
Feb 01 09:58:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-645b45bd538c7154c5e174dfa275410a59a91bbc0b2a58b48dfb47eec9e95ab1-merged.mount: Deactivated successfully.
Feb 01 09:58:49 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6d36c0b9e0b2c41599e6947f933975a9f4b52579fce13dde334b1efc9e87ba22-userdata-shm.mount: Deactivated successfully.
Feb 01 09:58:49 np0005604215.localdomain podman[314267]: 
Feb 01 09:58:49 np0005604215.localdomain podman[314267]: 2026-02-01 09:58:49.531516678 +0000 UTC m=+0.085382322 container create 342afcda5f7aeabf1c3f9435a3d9a5e21d4af63074651e4b683430042b8a0893 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 01 09:58:49 np0005604215.localdomain systemd[1]: Started libpod-conmon-342afcda5f7aeabf1c3f9435a3d9a5e21d4af63074651e4b683430042b8a0893.scope.
Feb 01 09:58:49 np0005604215.localdomain podman[314267]: 2026-02-01 09:58:49.490576092 +0000 UTC m=+0.044441726 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:49 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:49 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5557d9afffbbda1966ecc891e6b582c678fb403d0ebb1224f5064084d81706b3/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:49 np0005604215.localdomain podman[314267]: 2026-02-01 09:58:49.610472329 +0000 UTC m=+0.164338003 container init 342afcda5f7aeabf1c3f9435a3d9a5e21d4af63074651e4b683430042b8a0893 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Feb 01 09:58:49 np0005604215.localdomain podman[314267]: 2026-02-01 09:58:49.620395177 +0000 UTC m=+0.174260821 container start 342afcda5f7aeabf1c3f9435a3d9a5e21d4af63074651e4b683430042b8a0893 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 01 09:58:49 np0005604215.localdomain dnsmasq[314287]: started, version 2.85 cachesize 150
Feb 01 09:58:49 np0005604215.localdomain dnsmasq[314287]: DNS service limited to local subnets
Feb 01 09:58:49 np0005604215.localdomain dnsmasq[314287]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:49 np0005604215.localdomain dnsmasq[314287]: warning: no upstream servers configured
Feb 01 09:58:49 np0005604215.localdomain dnsmasq-dhcp[314287]: DHCP, static leases only on 10.100.0.16, lease time 1d
Feb 01 09:58:49 np0005604215.localdomain dnsmasq-dhcp[314287]: DHCP, static leases only on 10.100.0.32, lease time 1d
Feb 01 09:58:49 np0005604215.localdomain dnsmasq[314287]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/addn_hosts - 0 addresses
Feb 01 09:58:49 np0005604215.localdomain dnsmasq-dhcp[314287]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/host
Feb 01 09:58:49 np0005604215.localdomain dnsmasq-dhcp[314287]: read /var/lib/neutron/dhcp/42a0a17b-be28-4b0f-b80f-055ba2c3d245/opts
Feb 01 09:58:49 np0005604215.localdomain podman[314310]: 
Feb 01 09:58:49 np0005604215.localdomain podman[314310]: 2026-02-01 09:58:49.857101955 +0000 UTC m=+0.092302848 container create b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-efcb439d-5008-49b3-8d74-fc95eb1e0a3c, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 01 09:58:49 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:49.863 259225 INFO neutron.agent.dhcp.agent [None req-3c3fc470-2c5a-469e-a56b-8b233ecc51dd - - - - - -] DHCP configuration for ports {'f5db53be-fb30-4c27-aabf-1c052ca12256', 'ff54b909-b3b9-4669-8851-459606a86b19', 'bfd160dd-fa98-4ea9-815c-a97263cc82ea'} is completed
Feb 01 09:58:49 np0005604215.localdomain systemd[1]: Started libpod-conmon-b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8.scope.
Feb 01 09:58:49 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:49 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3d531bacba67f6f62eddd0d069e5b217cc14d9a3f2a39da03970900ebf0f5bfb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:49 np0005604215.localdomain podman[314310]: 2026-02-01 09:58:49.813252008 +0000 UTC m=+0.048452931 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:49 np0005604215.localdomain podman[314310]: 2026-02-01 09:58:49.920496221 +0000 UTC m=+0.155697114 container init b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-efcb439d-5008-49b3-8d74-fc95eb1e0a3c, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:58:49 np0005604215.localdomain podman[314310]: 2026-02-01 09:58:49.929277444 +0000 UTC m=+0.164478337 container start b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-efcb439d-5008-49b3-8d74-fc95eb1e0a3c, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 01 09:58:49 np0005604215.localdomain dnsmasq[314345]: started, version 2.85 cachesize 150
Feb 01 09:58:49 np0005604215.localdomain dnsmasq[314345]: DNS service limited to local subnets
Feb 01 09:58:49 np0005604215.localdomain dnsmasq[314345]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:49 np0005604215.localdomain dnsmasq[314345]: warning: no upstream servers configured
Feb 01 09:58:49 np0005604215.localdomain dnsmasq-dhcp[314345]: DHCP, static leases only on 10.102.0.0, lease time 1d
Feb 01 09:58:49 np0005604215.localdomain dnsmasq[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/addn_hosts - 0 addresses
Feb 01 09:58:49 np0005604215.localdomain dnsmasq-dhcp[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/host
Feb 01 09:58:49 np0005604215.localdomain dnsmasq-dhcp[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/opts
Feb 01 09:58:49 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:49.988 259225 INFO neutron.agent.dhcp.agent [None req-978517d7-3ec8-4b39-a6ce-32b689987fac - - - - - -] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:48Z, description=, device_id=0d3b238e-083b-4f7c-8e29-650b41019987, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00321ec8b0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00321ecbb0>], id=cd504d75-8802-4bc4-9b4d-ae05b1c87cde, ip_allocation=immediate, mac_address=fa:16:3e:44:ce:ea, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:45Z, description=, dns_domain=, id=efcb439d-5008-49b3-8d74-fc95eb1e0a3c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-544404190, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18623, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2617, status=ACTIVE, subnets=['ffc0767c-8f10-49f5-8670-ad4918ca881f'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:46Z, vlan_transparent=None, network_id=efcb439d-5008-49b3-8d74-fc95eb1e0a3c, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2637, status=DOWN, tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:48Z on network efcb439d-5008-49b3-8d74-fc95eb1e0a3c
Feb 01 09:58:49 np0005604215.localdomain dnsmasq[314287]: exiting on receipt of SIGTERM
Feb 01 09:58:49 np0005604215.localdomain podman[314344]: 2026-02-01 09:58:49.992010889 +0000 UTC m=+0.055237623 container kill 342afcda5f7aeabf1c3f9435a3d9a5e21d4af63074651e4b683430042b8a0893 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:58:49 np0005604215.localdomain systemd[1]: libpod-342afcda5f7aeabf1c3f9435a3d9a5e21d4af63074651e4b683430042b8a0893.scope: Deactivated successfully.
Feb 01 09:58:50 np0005604215.localdomain podman[314357]: 2026-02-01 09:58:50.04595553 +0000 UTC m=+0.040326267 container died 342afcda5f7aeabf1c3f9435a3d9a5e21d4af63074651e4b683430042b8a0893 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 09:58:50 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:50.104 259225 INFO neutron.agent.dhcp.agent [None req-798dcf21-10f7-4903-9e17-f73e973d673c - - - - - -] DHCP configuration for ports {'b3a9245d-7b09-4ca7-945e-fb5d33a48fc0'} is completed
Feb 01 09:58:50 np0005604215.localdomain ceph-mon[298604]: osdmap e200: 6 total, 6 up, 6 in
Feb 01 09:58:50 np0005604215.localdomain podman[314357]: 2026-02-01 09:58:50.134258583 +0000 UTC m=+0.128629310 container cleanup 342afcda5f7aeabf1c3f9435a3d9a5e21d4af63074651e4b683430042b8a0893 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 01 09:58:50 np0005604215.localdomain systemd[1]: libpod-conmon-342afcda5f7aeabf1c3f9435a3d9a5e21d4af63074651e4b683430042b8a0893.scope: Deactivated successfully.
Feb 01 09:58:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e200 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:50 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:50.152 2 INFO neutron.agent.securitygroups_rpc [None req-421b8c9c-f64a-4a6d-8e56-2aad4effb91c 80bbd13fa0544ff98e6c38448e01c054 fc33978cc1c94009a152ec3cacbfe0e5 - - default default] Security group member updated ['277d73b7-d267-437d-b5df-bd560d180a7a']
Feb 01 09:58:50 np0005604215.localdomain podman[314359]: 2026-02-01 09:58:50.161798101 +0000 UTC m=+0.148744306 container remove 342afcda5f7aeabf1c3f9435a3d9a5e21d4af63074651e4b683430042b8a0893 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-42a0a17b-be28-4b0f-b80f-055ba2c3d245, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127)
Feb 01 09:58:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:50.171 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:50 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:50Z|00223|binding|INFO|Releasing lport bfd160dd-fa98-4ea9-815c-a97263cc82ea from this chassis (sb_readonly=0)
Feb 01 09:58:50 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:50Z|00224|binding|INFO|Setting lport bfd160dd-fa98-4ea9-815c-a97263cc82ea down in Southbound
Feb 01 09:58:50 np0005604215.localdomain kernel: device tapbfd160dd-fa left promiscuous mode
Feb 01 09:58:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:50.189 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:50.192 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.19/28 10.100.0.3/28 10.100.0.35/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-42a0a17b-be28-4b0f-b80f-055ba2c3d245', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fc33978cc1c94009a152ec3cacbfe0e5', 'neutron:revision_number': '7', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eae16595-a7d2-468e-9eb0-cb266a7101cb, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=bfd160dd-fa98-4ea9-815c-a97263cc82ea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:50.194 158655 INFO neutron.agent.ovn.metadata.agent [-] Port bfd160dd-fa98-4ea9-815c-a97263cc82ea in datapath 42a0a17b-be28-4b0f-b80f-055ba2c3d245 unbound from our chassis
Feb 01 09:58:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:50.197 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 42a0a17b-be28-4b0f-b80f-055ba2c3d245, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:58:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:50.198 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ec733fe7-a9da-4f2d-a719-2c1dc2d9c25f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:50 np0005604215.localdomain dnsmasq[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/addn_hosts - 1 addresses
Feb 01 09:58:50 np0005604215.localdomain dnsmasq-dhcp[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/host
Feb 01 09:58:50 np0005604215.localdomain podman[314403]: 2026-02-01 09:58:50.200073563 +0000 UTC m=+0.065943045 container kill b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-efcb439d-5008-49b3-8d74-fc95eb1e0a3c, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 01 09:58:50 np0005604215.localdomain dnsmasq-dhcp[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/opts
Feb 01 09:58:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v397: 177 pgs: 177 active+clean; 225 MiB data, 1015 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 19 KiB/s wr, 67 op/s
Feb 01 09:58:50 np0005604215.localdomain systemd[1]: tmp-crun.idLqLe.mount: Deactivated successfully.
Feb 01 09:58:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-5557d9afffbbda1966ecc891e6b582c678fb403d0ebb1224f5064084d81706b3-merged.mount: Deactivated successfully.
Feb 01 09:58:50 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-342afcda5f7aeabf1c3f9435a3d9a5e21d4af63074651e4b683430042b8a0893-userdata-shm.mount: Deactivated successfully.
Feb 01 09:58:50 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:50.494 259225 INFO neutron.agent.dhcp.agent [None req-23abf485-3fef-423e-93c0-14affb3f57db - - - - - -] DHCP configuration for ports {'cd504d75-8802-4bc4-9b4d-ae05b1c87cde'} is completed
Feb 01 09:58:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e", "format": "json"}]: dispatch
Feb 01 09:58:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:58:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:58:50 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e' of type subvolume
Feb 01 09:58:50 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:58:50.518+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e' of type subvolume
Feb 01 09:58:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e", "force": true, "format": "json"}]: dispatch
Feb 01 09:58:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e, vol_name:cephfs) < ""
Feb 01 09:58:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e'' moved to trashcan
Feb 01 09:58:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 09:58:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e, vol_name:cephfs) < ""
Feb 01 09:58:50 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d42a0a17b\x2dbe28\x2d4b0f\x2db80f\x2d055ba2c3d245.mount: Deactivated successfully.
Feb 01 09:58:50 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:50.693 259225 INFO neutron.agent.dhcp.agent [None req-13d0cffd-6500-4d98-b5d3-732cfd6a9d8f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:50 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:50.694 259225 INFO neutron.agent.dhcp.agent [None req-13d0cffd-6500-4d98-b5d3-732cfd6a9d8f - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:50 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:50.818 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:51 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:51.088 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:51.117 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:58:51 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:51.132 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T09:58:48Z, description=, device_id=0d3b238e-083b-4f7c-8e29-650b41019987, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032b499d0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322735e0>], id=cd504d75-8802-4bc4-9b4d-ae05b1c87cde, ip_allocation=immediate, mac_address=fa:16:3e:44:ce:ea, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T09:58:45Z, description=, dns_domain=, id=efcb439d-5008-49b3-8d74-fc95eb1e0a3c, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-RoutersTest-544404190, port_security_enabled=True, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=18623, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=2617, status=ACTIVE, subnets=['ffc0767c-8f10-49f5-8670-ad4918ca881f'], tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:46Z, vlan_transparent=None, network_id=efcb439d-5008-49b3-8d74-fc95eb1e0a3c, port_security_enabled=False, project_id=3e1ea1a33e554968ba8ebaf6753c9c5d, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=2637, status=DOWN, tags=[], tenant_id=3e1ea1a33e554968ba8ebaf6753c9c5d, updated_at=2026-02-01T09:58:48Z on network efcb439d-5008-49b3-8d74-fc95eb1e0a3c
Feb 01 09:58:51 np0005604215.localdomain ceph-mon[298604]: pgmap v397: 177 pgs: 177 active+clean; 225 MiB data, 1015 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 19 KiB/s wr, 67 op/s
Feb 01 09:58:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:51.265 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:51 np0005604215.localdomain dnsmasq[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/addn_hosts - 1 addresses
Feb 01 09:58:51 np0005604215.localdomain dnsmasq-dhcp[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/host
Feb 01 09:58:51 np0005604215.localdomain podman[314442]: 2026-02-01 09:58:51.371070308 +0000 UTC m=+0.074187983 container kill b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-efcb439d-5008-49b3-8d74-fc95eb1e0a3c, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 01 09:58:51 np0005604215.localdomain dnsmasq-dhcp[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/opts
Feb 01 09:58:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:51.371 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:58:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:58:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:58:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:58:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:58:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:58:51 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:51.704 259225 INFO neutron.agent.dhcp.agent [None req-051c8c14-9127-4216-a252-4b018ec392ac - - - - - -] DHCP configuration for ports {'cd504d75-8802-4bc4-9b4d-ae05b1c87cde'} is completed
Feb 01 09:58:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:51.733 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e201 e201: 6 total, 6 up, 6 in
Feb 01 09:58:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:52.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:58:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:52.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:58:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:52.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:58:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:52.114 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:58:52 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e", "format": "json"}]: dispatch
Feb 01 09:58:52 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a00bb56c-4e4f-4bbc-b0ef-d7713d639d3e", "force": true, "format": "json"}]: dispatch
Feb 01 09:58:52 np0005604215.localdomain ceph-mon[298604]: osdmap e201: 6 total, 6 up, 6 in
Feb 01 09:58:52 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:52.182 259225 INFO neutron.agent.linux.ip_lib [None req-f5ca7c42-845e-41dc-9662-6fca0cc63260 - - - - - -] Device tap99a3af7a-2f cannot be used as it has no MAC address
Feb 01 09:58:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:52.205 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:52 np0005604215.localdomain kernel: device tap99a3af7a-2f entered promiscuous mode
Feb 01 09:58:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:52.214 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:52 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939932.2145] manager: (tap99a3af7a-2f): new Generic device (/org/freedesktop/NetworkManager/Devices/43)
Feb 01 09:58:52 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:52Z|00225|binding|INFO|Claiming lport 99a3af7a-2f31-45d3-a12b-e57ea71be76c for this chassis.
Feb 01 09:58:52 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:52Z|00226|binding|INFO|99a3af7a-2f31-45d3-a12b-e57ea71be76c: Claiming unknown
Feb 01 09:58:52 np0005604215.localdomain systemd-udevd[314474]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:58:52 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:52.229 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-fcb235a7-3377-4ca7-8f52-37430165d9d4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fcb235a7-3377-4ca7-8f52-37430165d9d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc4e7855-d30d-41d2-9b5d-873555255c0d, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=99a3af7a-2f31-45d3-a12b-e57ea71be76c) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:52 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:52.231 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 99a3af7a-2f31-45d3-a12b-e57ea71be76c in datapath fcb235a7-3377-4ca7-8f52-37430165d9d4 bound to our chassis
Feb 01 09:58:52 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:52.233 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fcb235a7-3377-4ca7-8f52-37430165d9d4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:58:52 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:52.235 2 INFO neutron.agent.securitygroups_rpc [None req-1cc83620-5d6e-42d2-912e-0b7270f21e87 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:58:52 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:52.234 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[3f28edb0-1c9e-4e3b-84e5-0fd338b966fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap99a3af7a-2f: No such device
Feb 01 09:58:52 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:52Z|00227|binding|INFO|Setting lport 99a3af7a-2f31-45d3-a12b-e57ea71be76c ovn-installed in OVS
Feb 01 09:58:52 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:52Z|00228|binding|INFO|Setting lport 99a3af7a-2f31-45d3-a12b-e57ea71be76c up in Southbound
Feb 01 09:58:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:52.247 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap99a3af7a-2f: No such device
Feb 01 09:58:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap99a3af7a-2f: No such device
Feb 01 09:58:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap99a3af7a-2f: No such device
Feb 01 09:58:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap99a3af7a-2f: No such device
Feb 01 09:58:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap99a3af7a-2f: No such device
Feb 01 09:58:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap99a3af7a-2f: No such device
Feb 01 09:58:52 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap99a3af7a-2f: No such device
Feb 01 09:58:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:52.289 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v399: 177 pgs: 177 active+clean; 225 MiB data, 1015 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 15 KiB/s wr, 51 op/s
Feb 01 09:58:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:52.318 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:52.522 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:58:53 np0005604215.localdomain podman[314543]: 
Feb 01 09:58:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:53.116 158655 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 2968b9f2-d43d-41e6-908a-fdd86fd98b2c with type ""
Feb 01 09:58:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:53Z|00229|binding|INFO|Removing iface tap99a3af7a-2f ovn-installed in OVS
Feb 01 09:58:53 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:53Z|00230|binding|INFO|Removing lport 99a3af7a-2f31-45d3-a12b-e57ea71be76c ovn-installed in OVS
Feb 01 09:58:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:53.118 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-fcb235a7-3377-4ca7-8f52-37430165d9d4', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fcb235a7-3377-4ca7-8f52-37430165d9d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cc4e7855-d30d-41d2-9b5d-873555255c0d, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=99a3af7a-2f31-45d3-a12b-e57ea71be76c) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.119 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.119 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.120 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:58:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:53.120 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 99a3af7a-2f31-45d3-a12b-e57ea71be76c in datapath fcb235a7-3377-4ca7-8f52-37430165d9d4 unbound from our chassis
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.120 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.121 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:58:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:53.121 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fcb235a7-3377-4ca7-8f52-37430165d9d4 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:58:53 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:58:53.122 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[9a607e6f-c1ed-415d-bce8-e305aa9c0f74]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:58:53 np0005604215.localdomain podman[314543]: 2026-02-01 09:58:53.12660156 +0000 UTC m=+0.090573133 container create 8afd2e261a286fa4c5971f6f71f6c9a1bf8f069459ff7a3b28c27cfbff2ff1e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fcb235a7-3377-4ca7-8f52-37430165d9d4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.150 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:53 np0005604215.localdomain systemd[1]: Started libpod-conmon-8afd2e261a286fa4c5971f6f71f6c9a1bf8f069459ff7a3b28c27cfbff2ff1e9.scope.
Feb 01 09:58:53 np0005604215.localdomain ceph-mon[298604]: pgmap v399: 177 pgs: 177 active+clean; 225 MiB data, 1015 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 15 KiB/s wr, 51 op/s
Feb 01 09:58:53 np0005604215.localdomain podman[314543]: 2026-02-01 09:58:53.081815384 +0000 UTC m=+0.045786997 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:58:53 np0005604215.localdomain systemd[1]: tmp-crun.zbMqq4.mount: Deactivated successfully.
Feb 01 09:58:53 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:58:53 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7b79e32ec6f1286ebc23bc523a5ce9ad71e572897b6cdcd2a4d0a356ebe608b9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:58:53 np0005604215.localdomain podman[314543]: 2026-02-01 09:58:53.200831444 +0000 UTC m=+0.164803037 container init 8afd2e261a286fa4c5971f6f71f6c9a1bf8f069459ff7a3b28c27cfbff2ff1e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fcb235a7-3377-4ca7-8f52-37430165d9d4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 01 09:58:53 np0005604215.localdomain podman[314543]: 2026-02-01 09:58:53.209788603 +0000 UTC m=+0.173760186 container start 8afd2e261a286fa4c5971f6f71f6c9a1bf8f069459ff7a3b28c27cfbff2ff1e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fcb235a7-3377-4ca7-8f52-37430165d9d4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:58:53 np0005604215.localdomain dnsmasq[314563]: started, version 2.85 cachesize 150
Feb 01 09:58:53 np0005604215.localdomain dnsmasq[314563]: DNS service limited to local subnets
Feb 01 09:58:53 np0005604215.localdomain dnsmasq[314563]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:58:53 np0005604215.localdomain dnsmasq[314563]: warning: no upstream servers configured
Feb 01 09:58:53 np0005604215.localdomain dnsmasq-dhcp[314563]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:58:53 np0005604215.localdomain dnsmasq[314563]: read /var/lib/neutron/dhcp/fcb235a7-3377-4ca7-8f52-37430165d9d4/addn_hosts - 0 addresses
Feb 01 09:58:53 np0005604215.localdomain dnsmasq-dhcp[314563]: read /var/lib/neutron/dhcp/fcb235a7-3377-4ca7-8f52-37430165d9d4/host
Feb 01 09:58:53 np0005604215.localdomain dnsmasq-dhcp[314563]: read /var/lib/neutron/dhcp/fcb235a7-3377-4ca7-8f52-37430165d9d4/opts
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.344 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:53.351 259225 INFO neutron.agent.dhcp.agent [None req-07c7ae8a-ff69-49b5-84a1-29eb4b24381b - - - - - -] DHCP configuration for ports {'cd20fa64-7f72-4e68-9041-518ea56bf3ef'} is completed
Feb 01 09:58:53 np0005604215.localdomain dnsmasq[314563]: exiting on receipt of SIGTERM
Feb 01 09:58:53 np0005604215.localdomain podman[314598]: 2026-02-01 09:58:53.433892367 +0000 UTC m=+0.066083781 container kill 8afd2e261a286fa4c5971f6f71f6c9a1bf8f069459ff7a3b28c27cfbff2ff1e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fcb235a7-3377-4ca7-8f52-37430165d9d4, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 01 09:58:53 np0005604215.localdomain systemd[1]: libpod-8afd2e261a286fa4c5971f6f71f6c9a1bf8f069459ff7a3b28c27cfbff2ff1e9.scope: Deactivated successfully.
Feb 01 09:58:53 np0005604215.localdomain podman[314612]: 2026-02-01 09:58:53.506804099 +0000 UTC m=+0.058287437 container died 8afd2e261a286fa4c5971f6f71f6c9a1bf8f069459ff7a3b28c27cfbff2ff1e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fcb235a7-3377-4ca7-8f52-37430165d9d4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3)
Feb 01 09:58:53 np0005604215.localdomain podman[314612]: 2026-02-01 09:58:53.545481705 +0000 UTC m=+0.096965023 container cleanup 8afd2e261a286fa4c5971f6f71f6c9a1bf8f069459ff7a3b28c27cfbff2ff1e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fcb235a7-3377-4ca7-8f52-37430165d9d4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 09:58:53 np0005604215.localdomain systemd[1]: libpod-conmon-8afd2e261a286fa4c5971f6f71f6c9a1bf8f069459ff7a3b28c27cfbff2ff1e9.scope: Deactivated successfully.
Feb 01 09:58:53 np0005604215.localdomain podman[314614]: 2026-02-01 09:58:53.580458305 +0000 UTC m=+0.126086231 container remove 8afd2e261a286fa4c5971f6f71f6c9a1bf8f069459ff7a3b28c27cfbff2ff1e9 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fcb235a7-3377-4ca7-8f52-37430165d9d4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127)
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.595 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:53 np0005604215.localdomain kernel: device tap99a3af7a-2f left promiscuous mode
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.609 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:53.626 259225 INFO neutron.agent.dhcp.agent [None req-945c7240-0ff5-49d0-9867-cd36ec058158 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:58:53.627 259225 INFO neutron.agent.dhcp.agent [None req-945c7240-0ff5-49d0-9867-cd36ec058158 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:58:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:58:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2759869622' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.678 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.557s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.876 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.878 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11556MB free_disk=41.70010757446289GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.878 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.879 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.934 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.935 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:58:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:53.964 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:58:54 np0005604215.localdomain podman[314643]: 2026-02-01 09:58:54.113206748 +0000 UTC m=+0.074825303 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 09:58:54 np0005604215.localdomain podman[314643]: 2026-02-01 09:58:54.124653625 +0000 UTC m=+0.086272180 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-7b79e32ec6f1286ebc23bc523a5ce9ad71e572897b6cdcd2a4d0a356ebe608b9-merged.mount: Deactivated successfully.
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8afd2e261a286fa4c5971f6f71f6c9a1bf8f069459ff7a3b28c27cfbff2ff1e9-userdata-shm.mount: Deactivated successfully.
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2dfcb235a7\x2d3377\x2d4ca7\x2d8f52\x2d37430165d9d4.mount: Deactivated successfully.
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:58:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2759869622' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:58:54 np0005604215.localdomain podman[314685]: 2026-02-01 09:58:54.235158819 +0000 UTC m=+0.086993222 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:58:54 np0005604215.localdomain podman[314685]: 2026-02-01 09:58:54.273628577 +0000 UTC m=+0.125462930 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:58:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v400: 177 pgs: 177 active+clean; 225 MiB data, 1015 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 10 KiB/s wr, 66 op/s
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: tmp-crun.N8CpoP.mount: Deactivated successfully.
Feb 01 09:58:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:58:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2177280335' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:58:54 np0005604215.localdomain podman[314710]: 2026-02-01 09:58:54.393674719 +0000 UTC m=+0.095666063 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., distribution-scope=public, config_id=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, io.buildah.version=1.33.7, version=9.7, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container)
Feb 01 09:58:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:54.401 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.437s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:58:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:54.408 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:58:54 np0005604215.localdomain podman[314710]: 2026-02-01 09:58:54.417657056 +0000 UTC m=+0.119648380 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, distribution-scope=public, release=1769056855, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, version=9.7)
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:58:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:54.430 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:58:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:54.484 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:58:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:54.485 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.606s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:58:54 np0005604215.localdomain podman[314728]: 2026-02-01 09:58:54.491117226 +0000 UTC m=+0.088225740 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true)
Feb 01 09:58:54 np0005604215.localdomain podman[314728]: 2026-02-01 09:58:54.497566026 +0000 UTC m=+0.094674580 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:58:54 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:58:54 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:54.946 2 INFO neutron.agent.securitygroups_rpc [req-6a252779-d066-42d3-937d-d19d3be50ea7 req-0e4e2316-508a-44f2-8edc-582240eab0d7 afad352e9d664799bf5de0cadcf3c7cd ff200d66c230435098f5a0489bf1e8f7 - - default default] Security group member updated ['95400daf-a74d-4007-ac5f-e79aa8e5c1cd']
Feb 01 09:58:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e201 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:58:55 np0005604215.localdomain dnsmasq[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/addn_hosts - 1 addresses
Feb 01 09:58:55 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/host
Feb 01 09:58:55 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/opts
Feb 01 09:58:55 np0005604215.localdomain podman[314766]: 2026-02-01 09:58:55.195152367 +0000 UTC m=+0.060039082 container kill 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 01 09:58:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:55.486 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:58:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:55.486 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:58:55 np0005604215.localdomain ceph-mon[298604]: pgmap v400: 177 pgs: 177 active+clean; 225 MiB data, 1015 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 10 KiB/s wr, 66 op/s
Feb 01 09:58:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2177280335' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:58:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:56.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:58:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:56.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:58:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:56.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:58:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:56.268 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v401: 177 pgs: 177 active+clean; 225 MiB data, 1015 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 8.0 KiB/s wr, 51 op/s
Feb 01 09:58:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:56.332 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3410674559' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:58:56 np0005604215.localdomain ceph-mon[298604]: pgmap v401: 177 pgs: 177 active+clean; 225 MiB data, 1015 MiB used, 41 GiB / 42 GiB avail; 37 KiB/s rd, 8.0 KiB/s wr, 51 op/s
Feb 01 09:58:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e202 e202: 6 total, 6 up, 6 in
Feb 01 09:58:56 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:56Z|00231|memory|INFO|peak resident set size grew 52% in last 2644.0 seconds, from 14972 kB to 22796 kB
Feb 01 09:58:56 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:58:56Z|00232|memory|INFO|idl-cells-OVN_Southbound:8429 idl-cells-Open_vSwitch:1155 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:221 lflow-cache-entries-cache-matches:271 lflow-cache-size-KB:903 local_datapath_usage-KB:2 ofctrl_desired_flow_usage-KB:474 ofctrl_installed_flow_usage-KB:346 ofctrl_sb_flow_ref_usage-KB:180
Feb 01 09:58:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:57.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:58:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:58:57.525 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:58:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3748050909' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:58:57 np0005604215.localdomain ceph-mon[298604]: osdmap e202: 6 total, 6 up, 6 in
Feb 01 09:58:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1558661522' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:58:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v403: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 12 KiB/s wr, 84 op/s
Feb 01 09:58:58 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:58.561 2 INFO neutron.agent.securitygroups_rpc [None req-fa4d9652-c836-4282-9ccf-0ce3333cfc7d 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['9475ea4c-43e5-4601-aa09-56b92b5b1098']
Feb 01 09:58:58 np0005604215.localdomain ceph-mon[298604]: pgmap v403: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 12 KiB/s wr, 84 op/s
Feb 01 09:58:58 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:58.895 2 INFO neutron.agent.securitygroups_rpc [None req-30e16878-3a93-4ab5-adf2-36d2809551ae 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['9475ea4c-43e5-4601-aa09-56b92b5b1098']
Feb 01 09:58:59 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:58:59.218 2 INFO neutron.agent.securitygroups_rpc [None req-da96df46-f62f-42b1-9cd1-d7e6a5376fca d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['5471bfa5-0ba1-439c-b208-7f1eef47ebe2']
Feb 01 09:58:59 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e203 e203: 6 total, 6 up, 6 in
Feb 01 09:59:00 np0005604215.localdomain podman[236852]: time="2026-02-01T09:59:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:59:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:59:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 160831 "" "Go-http-client/1.1"
Feb 01 09:59:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:59:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 19719 "" "Go-http-client/1.1"
Feb 01 09:59:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:00.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:59:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e203 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v405: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 12 KiB/s wr, 84 op/s
Feb 01 09:59:00 np0005604215.localdomain ceph-mon[298604]: osdmap e203: 6 total, 6 up, 6 in
Feb 01 09:59:00 np0005604215.localdomain ceph-mon[298604]: pgmap v405: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 12 KiB/s wr, 84 op/s
Feb 01 09:59:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e204 e204: 6 total, 6 up, 6 in
Feb 01 09:59:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:01.272 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:01 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:01.349 2 INFO neutron.agent.securitygroups_rpc [None req-c763a1d5-029b-4398-9f3b-b445d5b844aa d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['5471bfa5-0ba1-439c-b208-7f1eef47ebe2', '4d2012b8-f333-4b7a-9cf4-a971a1fa768f']
Feb 01 09:59:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:01.421 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:59:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:59:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:59:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:59:01 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:01.782 2 INFO neutron.agent.securitygroups_rpc [None req-1a8757de-ce31-4291-bd6c-41d255499299 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 01 09:59:02 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e205 e205: 6 total, 6 up, 6 in
Feb 01 09:59:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:59:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:02 np0005604215.localdomain ceph-mon[298604]: osdmap e204: 6 total, 6 up, 6 in
Feb 01 09:59:02 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3304763550' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:59:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta.tmp'
Feb 01 09:59:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta.tmp' to config b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta'
Feb 01 09:59:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "format": "json"}]: dispatch
Feb 01 09:59:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v408: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 5.4 KiB/s wr, 46 op/s
Feb 01 09:59:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:02.530 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:02 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:02.555 2 INFO neutron.agent.securitygroups_rpc [None req-c9b67f75-2c82-4fb2-8eff-26d6d15e8cf1 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['4d2012b8-f333-4b7a-9cf4-a971a1fa768f']
Feb 01 09:59:02 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:02.631 2 INFO neutron.agent.securitygroups_rpc [None req-77c4ed81-7ddc-4083-9aca-2d68314f54e3 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 01 09:59:02 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:02.861 2 INFO neutron.agent.securitygroups_rpc [None req-648bce3d-1b69-4a9f-8926-09639fc82cde 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 01 09:59:03 np0005604215.localdomain ceph-mon[298604]: osdmap e205: 6 total, 6 up, 6 in
Feb 01 09:59:03 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:59:03 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "format": "json"}]: dispatch
Feb 01 09:59:03 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:59:03 np0005604215.localdomain ceph-mon[298604]: pgmap v408: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 5.4 KiB/s wr, 46 op/s
Feb 01 09:59:03 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3141389870' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:03 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3141389870' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:03 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/565340341' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 09:59:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 09:59:04 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:04.068 2 INFO neutron.agent.securitygroups_rpc [None req-497c1bff-4722-4a35-9304-716d637751a5 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 01 09:59:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:04.289 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v409: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 7.7 KiB/s wr, 55 op/s
Feb 01 09:59:04 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:04.488 2 INFO neutron.agent.securitygroups_rpc [None req-91776191-3760-464d-b953-295169a6f779 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 01 09:59:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:04Z|00233|ovn_bfd|INFO|Disabled BFD on interface ovn-2186fb-0
Feb 01 09:59:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:04Z|00234|ovn_bfd|INFO|Disabled BFD on interface ovn-e1cc33-0
Feb 01 09:59:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:04Z|00235|ovn_bfd|INFO|Disabled BFD on interface ovn-45aa31-0
Feb 01 09:59:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:04.612 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:04.615 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:04.628 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:04.735 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:04 np0005604215.localdomain dnsmasq[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/addn_hosts - 0 addresses
Feb 01 09:59:04 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/host
Feb 01 09:59:04 np0005604215.localdomain dnsmasq-dhcp[310765]: read /var/lib/neutron/dhcp/c3e71f40-156c-4217-bedf-836f04a8f728/opts
Feb 01 09:59:04 np0005604215.localdomain systemd[1]: tmp-crun.tlDg4x.mount: Deactivated successfully.
Feb 01 09:59:04 np0005604215.localdomain podman[314803]: 2026-02-01 09:59:04.761673403 +0000 UTC m=+0.083600446 container kill 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 09:59:04 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:04.765 2 INFO neutron.agent.securitygroups_rpc [None req-95923de5-5aba-4d8b-a050-6896df644f34 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 01 09:59:04 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:59:04 np0005604215.localdomain podman[314816]: 2026-02-01 09:59:04.88157865 +0000 UTC m=+0.094069662 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, config_id=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Feb 01 09:59:04 np0005604215.localdomain podman[314816]: 2026-02-01 09:59:04.891426128 +0000 UTC m=+0.103917140 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:59:04 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:59:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:04.938 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:04 np0005604215.localdomain kernel: device tap0ff05a29-3c left promiscuous mode
Feb 01 09:59:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:04Z|00236|binding|INFO|Releasing lport 0ff05a29-3cc7-4c1a-a005-225d700300ca from this chassis (sb_readonly=0)
Feb 01 09:59:04 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:04Z|00237|binding|INFO|Setting lport 0ff05a29-3cc7-4c1a-a005-225d700300ca down in Southbound
Feb 01 09:59:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:04.950 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-c3e71f40-156c-4217-bedf-836f04a8f728', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c3e71f40-156c-4217-bedf-836f04a8f728', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'ff200d66c230435098f5a0489bf1e8f7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e4bd8115-ffb2-4415-a799-f41a6c9021b2, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=0ff05a29-3cc7-4c1a-a005-225d700300ca) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:59:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:04.951 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 0ff05a29-3cc7-4c1a-a005-225d700300ca in datapath c3e71f40-156c-4217-bedf-836f04a8f728 unbound from our chassis
Feb 01 09:59:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:04.955 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c3e71f40-156c-4217-bedf-836f04a8f728, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:59:04 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:04.956 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[3b846c1f-4163-4205-951d-5a11f2fb6a28]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:59:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:04.963 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:04 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:04.990 2 INFO neutron.agent.securitygroups_rpc [None req-99fb79a2-24fb-465b-80f5-1d805114aafe 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 01 09:59:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e205 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1", "format": "json"}]: dispatch
Feb 01 09:59:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:05 np0005604215.localdomain ceph-mon[298604]: pgmap v409: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 7.7 KiB/s wr, 55 op/s
Feb 01 09:59:05 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1", "format": "json"}]: dispatch
Feb 01 09:59:05 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:05.940 2 INFO neutron.agent.securitygroups_rpc [None req-47fc05ae-3df2-436b-85c1-a738a10459e4 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 01 09:59:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:06.309 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v410: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 7.2 KiB/s wr, 52 op/s
Feb 01 09:59:06 np0005604215.localdomain ceph-mon[298604]: pgmap v410: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 7.2 KiB/s wr, 52 op/s
Feb 01 09:59:06 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:06.686 2 INFO neutron.agent.securitygroups_rpc [None req-a0d41d31-bc4d-4dff-9a1f-c5fa8673a648 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 01 09:59:06 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e206 e206: 6 total, 6 up, 6 in
Feb 01 09:59:06 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:06.996 2 INFO neutron.agent.securitygroups_rpc [None req-1e2106aa-eb56-4244-87ab-21e381223ca0 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['1156e221-0c08-4d6b-9b97-5a6e91528188']
Feb 01 09:59:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:07.530 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:59:07 np0005604215.localdomain ceph-mon[298604]: osdmap e206: 6 total, 6 up, 6 in
Feb 01 09:59:07 np0005604215.localdomain podman[314845]: 2026-02-01 09:59:07.870065577 +0000 UTC m=+0.083032950 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:59:07 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:07.894 2 INFO neutron.agent.securitygroups_rpc [None req-29be5c8b-b62f-485c-998f-043fe218176b d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['4c513797-4919-4e80-9d08-c2b88dcc61a1']
Feb 01 09:59:07 np0005604215.localdomain podman[314845]: 2026-02-01 09:59:07.903277782 +0000 UTC m=+0.116245205 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 09:59:07 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:59:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v412: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 8.3 KiB/s wr, 46 op/s
Feb 01 09:59:08 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:08.395 2 INFO neutron.agent.securitygroups_rpc [None req-70d5b65b-a947-49dc-a1b2-2f95b637bb85 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['a0337e8b-7eb2-4444-b9bf-a19f28129233']
Feb 01 09:59:08 np0005604215.localdomain dnsmasq[310765]: exiting on receipt of SIGTERM
Feb 01 09:59:08 np0005604215.localdomain podman[314885]: 2026-02-01 09:59:08.519788876 +0000 UTC m=+0.059235747 container kill 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 09:59:08 np0005604215.localdomain systemd[1]: libpod-8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51.scope: Deactivated successfully.
Feb 01 09:59:08 np0005604215.localdomain podman[314897]: 2026-02-01 09:59:08.587762574 +0000 UTC m=+0.053229840 container died 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS)
Feb 01 09:59:08 np0005604215.localdomain podman[314897]: 2026-02-01 09:59:08.619368428 +0000 UTC m=+0.084835654 container cleanup 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:59:08 np0005604215.localdomain systemd[1]: libpod-conmon-8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51.scope: Deactivated successfully.
Feb 01 09:59:08 np0005604215.localdomain podman[314899]: 2026-02-01 09:59:08.671333818 +0000 UTC m=+0.131218250 container remove 8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-c3e71f40-156c-4217-bedf-836f04a8f728, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 09:59:08 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:08.702 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:59:08 np0005604215.localdomain ceph-mon[298604]: pgmap v412: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 33 KiB/s rd, 8.3 KiB/s wr, 46 op/s
Feb 01 09:59:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-6c22aff628e6bc84ba432acd1a0ec47a0f890d608bcc6d6b65fb5e1bf052ca32-merged.mount: Deactivated successfully.
Feb 01 09:59:08 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8e4559d8b52638c151427a0bb67f56a8e455aa18a53f27cb954ab69f4989ce51-userdata-shm.mount: Deactivated successfully.
Feb 01 09:59:08 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2dc3e71f40\x2d156c\x2d4217\x2dbedf\x2d836f04a8f728.mount: Deactivated successfully.
Feb 01 09:59:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "768c0cef-f962-4eff-a24c-02f86df937ba", "format": "json"}]: dispatch
Feb 01 09:59:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:768c0cef-f962-4eff-a24c-02f86df937ba, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:768c0cef-f962-4eff-a24c-02f86df937ba, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:09 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:09.388 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:59:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "768c0cef-f962-4eff-a24c-02f86df937ba", "format": "json"}]: dispatch
Feb 01 09:59:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:09.955 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v413: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 7.3 KiB/s wr, 40 op/s
Feb 01 09:59:10 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:10.774 2 INFO neutron.agent.securitygroups_rpc [None req-db14a021-48cb-407b-8edd-e94cee0b2d02 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['882cde13-f256-402b-ab30-a0fc50e38425', 'd59fe500-82e7-40fc-885e-589a886bd9ec', '4c513797-4919-4e80-9d08-c2b88dcc61a1']
Feb 01 09:59:10 np0005604215.localdomain ceph-mon[298604]: pgmap v413: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 29 KiB/s rd, 7.3 KiB/s wr, 40 op/s
Feb 01 09:59:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:11.356 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:11 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:11.389 2 INFO neutron.agent.securitygroups_rpc [None req-3849eb70-2e22-4d07-ad6e-472e2f67b4ca 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['3d036e8e-d2c8-4e3a-9dbf-e906123b5f25']
Feb 01 09:59:11 np0005604215.localdomain dnsmasq[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/addn_hosts - 0 addresses
Feb 01 09:59:11 np0005604215.localdomain dnsmasq-dhcp[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/host
Feb 01 09:59:11 np0005604215.localdomain dnsmasq-dhcp[314345]: read /var/lib/neutron/dhcp/efcb439d-5008-49b3-8d74-fc95eb1e0a3c/opts
Feb 01 09:59:11 np0005604215.localdomain podman[314940]: 2026-02-01 09:59:11.605337188 +0000 UTC m=+0.059446933 container kill b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-efcb439d-5008-49b3-8d74-fc95eb1e0a3c, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 01 09:59:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:12.068 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:12 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:12Z|00238|binding|INFO|Releasing lport 5b4ba3e8-5f00-4316-a86d-1c06057933fc from this chassis (sb_readonly=0)
Feb 01 09:59:12 np0005604215.localdomain kernel: device tap5b4ba3e8-5f left promiscuous mode
Feb 01 09:59:12 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:12Z|00239|binding|INFO|Setting lport 5b4ba3e8-5f00-4316-a86d-1c06057933fc down in Southbound
Feb 01 09:59:12 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:12.080 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.102.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-efcb439d-5008-49b3-8d74-fc95eb1e0a3c', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-efcb439d-5008-49b3-8d74-fc95eb1e0a3c', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3e1ea1a33e554968ba8ebaf6753c9c5d', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=33e912d1-6794-400a-b37c-704b8b53759d, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=5b4ba3e8-5f00-4316-a86d-1c06057933fc) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:59:12 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:12.082 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 5b4ba3e8-5f00-4316-a86d-1c06057933fc in datapath efcb439d-5008-49b3-8d74-fc95eb1e0a3c unbound from our chassis
Feb 01 09:59:12 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:12.085 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network efcb439d-5008-49b3-8d74-fc95eb1e0a3c, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:59:12 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:12.086 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[29a3706a-06e7-47ca-9577-d8e360013d1b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:59:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:12.090 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:12 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:12.168 2 INFO neutron.agent.securitygroups_rpc [None req-5ba6c6ad-3063-4fac-a109-b3e261c4ab28 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['3d036e8e-d2c8-4e3a-9dbf-e906123b5f25']
Feb 01 09:59:12 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:12.195 2 INFO neutron.agent.securitygroups_rpc [None req-b3757230-46f4-432e-b101-9a3e15d9fc63 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['882cde13-f256-402b-ab30-a0fc50e38425', 'd59fe500-82e7-40fc-885e-589a886bd9ec']
Feb 01 09:59:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v414: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 6.1 KiB/s wr, 33 op/s
Feb 01 09:59:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "768c0cef-f962-4eff-a24c-02f86df937ba_da6bfb97-c826-4894-a29f-6e5051fe3817", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:768c0cef-f962-4eff-a24c-02f86df937ba_da6bfb97-c826-4894-a29f-6e5051fe3817, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta.tmp'
Feb 01 09:59:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta.tmp' to config b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta'
Feb 01 09:59:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:768c0cef-f962-4eff-a24c-02f86df937ba_da6bfb97-c826-4894-a29f-6e5051fe3817, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "768c0cef-f962-4eff-a24c-02f86df937ba", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:768c0cef-f962-4eff-a24c-02f86df937ba, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta.tmp'
Feb 01 09:59:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta.tmp' to config b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta'
Feb 01 09:59:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:12.534 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:768c0cef-f962-4eff-a24c-02f86df937ba, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:12 np0005604215.localdomain ceph-mon[298604]: pgmap v414: 177 pgs: 177 active+clean; 146 MiB data, 875 MiB used, 41 GiB / 42 GiB avail; 24 KiB/s rd, 6.1 KiB/s wr, 33 op/s
Feb 01 09:59:13 np0005604215.localdomain dnsmasq[314345]: exiting on receipt of SIGTERM
Feb 01 09:59:13 np0005604215.localdomain podman[314978]: 2026-02-01 09:59:13.119978663 +0000 UTC m=+0.063757858 container kill b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-efcb439d-5008-49b3-8d74-fc95eb1e0a3c, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3)
Feb 01 09:59:13 np0005604215.localdomain systemd[1]: libpod-b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8.scope: Deactivated successfully.
Feb 01 09:59:13 np0005604215.localdomain podman[314992]: 2026-02-01 09:59:13.196920961 +0000 UTC m=+0.060865277 container died b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-efcb439d-5008-49b3-8d74-fc95eb1e0a3c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Feb 01 09:59:13 np0005604215.localdomain systemd[1]: tmp-crun.iU4DMF.mount: Deactivated successfully.
Feb 01 09:59:13 np0005604215.localdomain podman[314992]: 2026-02-01 09:59:13.240642294 +0000 UTC m=+0.104586580 container cleanup b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-efcb439d-5008-49b3-8d74-fc95eb1e0a3c, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 01 09:59:13 np0005604215.localdomain systemd[1]: libpod-conmon-b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8.scope: Deactivated successfully.
Feb 01 09:59:13 np0005604215.localdomain podman[314994]: 2026-02-01 09:59:13.328037287 +0000 UTC m=+0.186154002 container remove b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-efcb439d-5008-49b3-8d74-fc95eb1e0a3c, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:59:13 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:13.389 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:59:13 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "768c0cef-f962-4eff-a24c-02f86df937ba_da6bfb97-c826-4894-a29f-6e5051fe3817", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:13 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "768c0cef-f962-4eff-a24c-02f86df937ba", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:13 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:13.861 259225 INFO neutron.agent.linux.ip_lib [None req-a957dd9d-aaeb-4fbf-bd60-2d01c8d33015 - - - - - -] Device tap8e0745c9-47 cannot be used as it has no MAC address
Feb 01 09:59:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:13.909 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:13 np0005604215.localdomain kernel: device tap8e0745c9-47 entered promiscuous mode
Feb 01 09:59:13 np0005604215.localdomain NetworkManager[5972]: <info>  [1769939953.9192] manager: (tap8e0745c9-47): new Generic device (/org/freedesktop/NetworkManager/Devices/44)
Feb 01 09:59:13 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:13Z|00240|binding|INFO|Claiming lport 8e0745c9-4755-4917-844d-acaa5ec19a3f for this chassis.
Feb 01 09:59:13 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:13Z|00241|binding|INFO|8e0745c9-4755-4917-844d-acaa5ec19a3f: Claiming unknown
Feb 01 09:59:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:13.921 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:13 np0005604215.localdomain systemd-udevd[315031]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 09:59:13 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:13.930 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-db4796fe-8da5-42e5-beb8-ef32cfa5ba89', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db4796fe-8da5-42e5-beb8-ef32cfa5ba89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6f947c1-544d-485e-899d-5026404fa905, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=8e0745c9-4755-4917-844d-acaa5ec19a3f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:59:13 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:13.933 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 8e0745c9-4755-4917-844d-acaa5ec19a3f in datapath db4796fe-8da5-42e5-beb8-ef32cfa5ba89 bound to our chassis
Feb 01 09:59:13 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:13.936 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network db4796fe-8da5-42e5-beb8-ef32cfa5ba89 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 09:59:13 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:13.938 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[95ec30c6-f68d-4a46-b169-a13a6cf507f3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:59:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap8e0745c9-47: No such device
Feb 01 09:59:13 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:13Z|00242|binding|INFO|Setting lport 8e0745c9-4755-4917-844d-acaa5ec19a3f ovn-installed in OVS
Feb 01 09:59:13 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:13Z|00243|binding|INFO|Setting lport 8e0745c9-4755-4917-844d-acaa5ec19a3f up in Southbound
Feb 01 09:59:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap8e0745c9-47: No such device
Feb 01 09:59:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:13.967 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap8e0745c9-47: No such device
Feb 01 09:59:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap8e0745c9-47: No such device
Feb 01 09:59:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap8e0745c9-47: No such device
Feb 01 09:59:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap8e0745c9-47: No such device
Feb 01 09:59:13 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap8e0745c9-47: No such device
Feb 01 09:59:14 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap8e0745c9-47: No such device
Feb 01 09:59:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:14.017 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:14.042 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:14 np0005604215.localdomain systemd[1]: tmp-crun.6duVYc.mount: Deactivated successfully.
Feb 01 09:59:14 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-3d531bacba67f6f62eddd0d069e5b217cc14d9a3f2a39da03970900ebf0f5bfb-merged.mount: Deactivated successfully.
Feb 01 09:59:14 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b120198883d37bfddc45936513c726a48c3fad611c725f95d1e4900378b085d8-userdata-shm.mount: Deactivated successfully.
Feb 01 09:59:14 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2defcb439d\x2d5008\x2d49b3\x2d8d74\x2dfc95eb1e0a3c.mount: Deactivated successfully.
Feb 01 09:59:14 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:14.177 259225 INFO neutron.agent.dhcp.agent [-] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:59:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v415: 177 pgs: 177 active+clean; 146 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 4.7 KiB/s wr, 1 op/s
Feb 01 09:59:14 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:14.500 2 INFO neutron.agent.securitygroups_rpc [None req-5e11ff1d-8674-48ea-81ee-b80b81afee00 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['7c2cef09-1439-45eb-af5a-e316fd7a5ca9']
Feb 01 09:59:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:14.726 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:14 np0005604215.localdomain ceph-mon[298604]: pgmap v415: 177 pgs: 177 active+clean; 146 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 4.7 KiB/s wr, 1 op/s
Feb 01 09:59:14 np0005604215.localdomain podman[315102]: 
Feb 01 09:59:14 np0005604215.localdomain podman[315102]: 2026-02-01 09:59:14.983957884 +0000 UTC m=+0.093605107 container create 0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db4796fe-8da5-42e5-beb8-ef32cfa5ba89, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2)
Feb 01 09:59:15 np0005604215.localdomain systemd[1]: Started libpod-conmon-0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21.scope.
Feb 01 09:59:15 np0005604215.localdomain podman[315102]: 2026-02-01 09:59:14.938921411 +0000 UTC m=+0.048568694 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 09:59:15 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 09:59:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon dump", "format": "json"} v 0)
Feb 01 09:59:15 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2915970543' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:59:15 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2346ecdc71b972c7c9380ff6b9d8627db59885136140e3209e42239b23da8a64/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 09:59:15 np0005604215.localdomain podman[315102]: 2026-02-01 09:59:15.065343771 +0000 UTC m=+0.174991024 container init 0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db4796fe-8da5-42e5-beb8-ef32cfa5ba89, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 01 09:59:15 np0005604215.localdomain podman[315102]: 2026-02-01 09:59:15.074575419 +0000 UTC m=+0.184222642 container start 0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db4796fe-8da5-42e5-beb8-ef32cfa5ba89, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 01 09:59:15 np0005604215.localdomain dnsmasq[315121]: started, version 2.85 cachesize 150
Feb 01 09:59:15 np0005604215.localdomain dnsmasq[315121]: DNS service limited to local subnets
Feb 01 09:59:15 np0005604215.localdomain dnsmasq[315121]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 09:59:15 np0005604215.localdomain dnsmasq[315121]: warning: no upstream servers configured
Feb 01 09:59:15 np0005604215.localdomain dnsmasq-dhcp[315121]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 09:59:15 np0005604215.localdomain dnsmasq[315121]: read /var/lib/neutron/dhcp/db4796fe-8da5-42e5-beb8-ef32cfa5ba89/addn_hosts - 0 addresses
Feb 01 09:59:15 np0005604215.localdomain dnsmasq-dhcp[315121]: read /var/lib/neutron/dhcp/db4796fe-8da5-42e5-beb8-ef32cfa5ba89/host
Feb 01 09:59:15 np0005604215.localdomain dnsmasq-dhcp[315121]: read /var/lib/neutron/dhcp/db4796fe-8da5-42e5-beb8-ef32cfa5ba89/opts
Feb 01 09:59:15 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:15.120 2 INFO neutron.agent.securitygroups_rpc [None req-d9b77fb6-ebc7-41c7-a5b0-8d63738e2d77 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['7c2cef09-1439-45eb-af5a-e316fd7a5ca9']
Feb 01 09:59:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e206 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:15 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:15.234 259225 INFO neutron.agent.dhcp.agent [None req-c4afd2b4-de26-4382-a373-7a98d98ac865 - - - - - -] DHCP configuration for ports {'42220e7b-b7fd-49ee-9ded-1abeca61bab9'} is completed
Feb 01 09:59:15 np0005604215.localdomain sshd[315122]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 09:59:15 np0005604215.localdomain sshd[315122]: error: kex_exchange_identification: banner line contains invalid characters
Feb 01 09:59:15 np0005604215.localdomain sshd[315122]: banner exchange: Connection from 64.62.156.182 port 37132: invalid format
Feb 01 09:59:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1_2507b9ff-e305-4644-a480-f9c1a78c9d40", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1_2507b9ff-e305-4644-a480-f9c1a78c9d40, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta.tmp'
Feb 01 09:59:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta.tmp' to config b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta'
Feb 01 09:59:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1_2507b9ff-e305-4644-a480-f9c1a78c9d40, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta.tmp'
Feb 01 09:59:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta.tmp' to config b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840/.meta'
Feb 01 09:59:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:15 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2915970543' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:59:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e207 e207: 6 total, 6 up, 6 in
Feb 01 09:59:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v417: 177 pgs: 177 active+clean; 146 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 4.9 KiB/s wr, 1 op/s
Feb 01 09:59:16 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:16.359 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1_2507b9ff-e305-4644-a480-f9c1a78c9d40", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "snap_name": "b0497c4b-d9c3-43be-bbb4-ed8008c4b7c1", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:16 np0005604215.localdomain ceph-mon[298604]: osdmap e207: 6 total, 6 up, 6 in
Feb 01 09:59:16 np0005604215.localdomain ceph-mon[298604]: pgmap v417: 177 pgs: 177 active+clean; 146 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 4.9 KiB/s wr, 1 op/s
Feb 01 09:59:16 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e208 e208: 6 total, 6 up, 6 in
Feb 01 09:59:17 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:17.001 2 INFO neutron.agent.securitygroups_rpc [None req-7f0248f7-eb48-44f1-afa4-6c21da99fef7 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']
Feb 01 09:59:17 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:17.113 2 INFO neutron.agent.securitygroups_rpc [None req-8aeaa6d8-5ca7-4c3a-b8a2-e5b99f7af336 d74a270228ef43bb9eebd5b8b203e133 fea4c3ac6fd14aee8b0de1bad5f8673a - - default default] Security group member updated ['090b75a3-5cce-4012-8bbe-4b851ef442c2']
Feb 01 09:59:17 np0005604215.localdomain systemd[1]: tmp-crun.VXCC8K.mount: Deactivated successfully.
Feb 01 09:59:17 np0005604215.localdomain dnsmasq[315121]: read /var/lib/neutron/dhcp/db4796fe-8da5-42e5-beb8-ef32cfa5ba89/addn_hosts - 0 addresses
Feb 01 09:59:17 np0005604215.localdomain dnsmasq-dhcp[315121]: read /var/lib/neutron/dhcp/db4796fe-8da5-42e5-beb8-ef32cfa5ba89/host
Feb 01 09:59:17 np0005604215.localdomain dnsmasq-dhcp[315121]: read /var/lib/neutron/dhcp/db4796fe-8da5-42e5-beb8-ef32cfa5ba89/opts
Feb 01 09:59:17 np0005604215.localdomain podman[315141]: 2026-02-01 09:59:17.236248839 +0000 UTC m=+0.077377902 container kill 0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db4796fe-8da5-42e5-beb8-ef32cfa5ba89, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 09:59:17 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:17.319 2 INFO neutron.agent.securitygroups_rpc [None req-bfce6195-4572-4858-b958-8ae0eaa1dc23 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']
Feb 01 09:59:17 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:17Z|00244|binding|INFO|Removing iface tap8e0745c9-47 ovn-installed in OVS
Feb 01 09:59:17 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:17Z|00245|binding|INFO|Removing lport 8e0745c9-4755-4917-844d-acaa5ec19a3f ovn-installed in OVS
Feb 01 09:59:17 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:17.448 158655 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 6fb0cad9-71f8-424f-b704-4ac7422d6ca8 with type ""
Feb 01 09:59:17 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:17.451 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:17 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:17.452 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-db4796fe-8da5-42e5-beb8-ef32cfa5ba89', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-db4796fe-8da5-42e5-beb8-ef32cfa5ba89', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a6f947c1-544d-485e-899d-5026404fa905, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=8e0745c9-4755-4917-844d-acaa5ec19a3f) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:59:17 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:17.455 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 8e0745c9-4755-4917-844d-acaa5ec19a3f in datapath db4796fe-8da5-42e5-beb8-ef32cfa5ba89 unbound from our chassis
Feb 01 09:59:17 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:17.459 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network db4796fe-8da5-42e5-beb8-ef32cfa5ba89, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:59:17 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:17.459 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:17 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:17.460 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[b41210c7-2255-4c1b-b341-a2968e7e9a1e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:59:17 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:17.501 259225 INFO neutron.agent.dhcp.agent [None req-50ebb1dc-6507-4b64-b277-cd3328a48a1c - - - - - -] DHCP configuration for ports {'42220e7b-b7fd-49ee-9ded-1abeca61bab9', '8e0745c9-4755-4917-844d-acaa5ec19a3f'} is completed
Feb 01 09:59:17 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:17.536 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:17 np0005604215.localdomain podman[315177]: 2026-02-01 09:59:17.609684888 +0000 UTC m=+0.062534821 container kill 0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db4796fe-8da5-42e5-beb8-ef32cfa5ba89, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 01 09:59:17 np0005604215.localdomain dnsmasq[315121]: exiting on receipt of SIGTERM
Feb 01 09:59:17 np0005604215.localdomain systemd[1]: libpod-0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21.scope: Deactivated successfully.
Feb 01 09:59:17 np0005604215.localdomain podman[315189]: 2026-02-01 09:59:17.689084382 +0000 UTC m=+0.064447300 container died 0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db4796fe-8da5-42e5-beb8-ef32cfa5ba89, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Feb 01 09:59:17 np0005604215.localdomain podman[315189]: 2026-02-01 09:59:17.717766146 +0000 UTC m=+0.093128994 container cleanup 0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db4796fe-8da5-42e5-beb8-ef32cfa5ba89, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:59:17 np0005604215.localdomain systemd[1]: libpod-conmon-0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21.scope: Deactivated successfully.
Feb 01 09:59:17 np0005604215.localdomain podman[315191]: 2026-02-01 09:59:17.76796124 +0000 UTC m=+0.135416361 container remove 0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-db4796fe-8da5-42e5-beb8-ef32cfa5ba89, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 09:59:17 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:17.779 2 INFO neutron.agent.securitygroups_rpc [None req-a584f8be-9a76-404e-9854-afe457f9fc2e 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']
Feb 01 09:59:17 np0005604215.localdomain kernel: device tap8e0745c9-47 left promiscuous mode
Feb 01 09:59:17 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:17.818 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:17 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:17.831 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:17 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:17.855 259225 INFO neutron.agent.dhcp.agent [-] Synchronizing state
Feb 01 09:59:17 np0005604215.localdomain ceph-mon[298604]: osdmap e208: 6 total, 6 up, 6 in
Feb 01 09:59:18 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:18.012 259225 INFO neutron.agent.dhcp.agent [None req-7dd38096-b9ec-4a6c-acff-8934fe3456bc - - - - - -] All active networks have been fetched through RPC.
Feb 01 09:59:18 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:18.014 259225 INFO neutron.agent.dhcp.agent [-] Starting network db4796fe-8da5-42e5-beb8-ef32cfa5ba89 dhcp configuration
Feb 01 09:59:18 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:18.014 259225 INFO neutron.agent.dhcp.agent [-] Finished network db4796fe-8da5-42e5-beb8-ef32cfa5ba89 dhcp configuration
Feb 01 09:59:18 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:18.014 259225 INFO neutron.agent.dhcp.agent [None req-7dd38096-b9ec-4a6c-acff-8934fe3456bc - - - - - -] Synchronizing state complete
Feb 01 09:59:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:18.086 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:18 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:18.120 259225 INFO neutron.agent.dhcp.agent [None req-c89990d5-78d9-464e-a59a-c794cde743d7 - - - - - -] DHCP configuration for ports {'42220e7b-b7fd-49ee-9ded-1abeca61bab9'} is completed
Feb 01 09:59:18 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:18.201 2 INFO neutron.agent.securitygroups_rpc [None req-e84e7596-d223-4f47-8254-074f89f65fdc 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']
Feb 01 09:59:18 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-2346ecdc71b972c7c9380ff6b9d8627db59885136140e3209e42239b23da8a64-merged.mount: Deactivated successfully.
Feb 01 09:59:18 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0b43c1664215a7a3f7c6da320082b787ce24961f23469394a2a91fdd1c2d7e21-userdata-shm.mount: Deactivated successfully.
Feb 01 09:59:18 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2ddb4796fe\x2d8da5\x2d42e5\x2dbeb8\x2def32cfa5ba89.mount: Deactivated successfully.
Feb 01 09:59:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v419: 177 pgs: 177 active+clean; 146 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 23 KiB/s wr, 29 op/s
Feb 01 09:59:18 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:18.532 2 INFO neutron.agent.securitygroups_rpc [None req-e742ba15-30cb-436e-a4c2-2391ce2e0670 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']
Feb 01 09:59:18 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:18.888 2 INFO neutron.agent.securitygroups_rpc [None req-46338e73-87bf-491f-8ed0-9b7015f713f3 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['95f04fd7-2a7a-45df-b586-d0771f0c51c2']
Feb 01 09:59:18 np0005604215.localdomain ceph-mon[298604]: pgmap v419: 177 pgs: 177 active+clean; 146 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 17 KiB/s rd, 23 KiB/s wr, 29 op/s
Feb 01 09:59:18 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2510115252' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:18 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2510115252' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:18 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e209 e209: 6 total, 6 up, 6 in
Feb 01 09:59:19 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "17417769-f267-43fd-88c0-c1785d065840", "format": "json"}]: dispatch
Feb 01 09:59:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:17417769-f267-43fd-88c0-c1785d065840, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:59:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:17417769-f267-43fd-88c0-c1785d065840, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:59:19 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '17417769-f267-43fd-88c0-c1785d065840' of type subvolume
Feb 01 09:59:19 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:59:19.045+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '17417769-f267-43fd-88c0-c1785d065840' of type subvolume
Feb 01 09:59:19 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/17417769-f267-43fd-88c0-c1785d065840'' moved to trashcan
Feb 01 09:59:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 09:59:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:17417769-f267-43fd-88c0-c1785d065840, vol_name:cephfs) < ""
Feb 01 09:59:19 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:19.742 2 INFO neutron.agent.securitygroups_rpc [None req-7124cdef-fc68-40e6-a617-ec7afce96cf9 3e3e53f98794468b9dd11f09fac77776 0721038c814c404d9f2aa1ec859c5601 - - default default] Security group rule updated ['05a59877-b29c-4804-965e-2274924179d2']
Feb 01 09:59:19 np0005604215.localdomain podman[315239]: 2026-02-01 09:59:19.824608446 +0000 UTC m=+0.058082320 container kill 6556794d4e5bf704e06bf800eff0a39e5cd58b47cd3e5cb402c8509b64928f0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4da937bf-f66e-48ce-bf66-f7d3d9f7bc52, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127)
Feb 01 09:59:19 np0005604215.localdomain dnsmasq[312654]: exiting on receipt of SIGTERM
Feb 01 09:59:19 np0005604215.localdomain systemd[1]: libpod-6556794d4e5bf704e06bf800eff0a39e5cd58b47cd3e5cb402c8509b64928f0b.scope: Deactivated successfully.
Feb 01 09:59:19 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:19Z|00246|binding|INFO|Removing iface tap3a91aa3a-fb ovn-installed in OVS
Feb 01 09:59:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:19.840 158655 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 6b9acee3-bb0b-4711-8e27-ab74e9f04414 with type ""
Feb 01 09:59:19 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T09:59:19Z|00247|binding|INFO|Removing lport 3a91aa3a-fb9f-4945-91e3-85f0d278b0b5 ovn-installed in OVS
Feb 01 09:59:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:19.842 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.255.242/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-4da937bf-f66e-48ce-bf66-f7d3d9f7bc52', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4da937bf-f66e-48ce-bf66-f7d3d9f7bc52', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'c23ed2a9641444eeac6ffb9689135326', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=28f2370b-4aa7-434f-90cb-05cc01bed2bb, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=3a91aa3a-fb9f-4945-91e3-85f0d278b0b5) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:59:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:19.846 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 3a91aa3a-fb9f-4945-91e3-85f0d278b0b5 in datapath 4da937bf-f66e-48ce-bf66-f7d3d9f7bc52 unbound from our chassis
Feb 01 09:59:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:19.849 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 4da937bf-f66e-48ce-bf66-f7d3d9f7bc52, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 09:59:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:19.850 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:19.851 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:19 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:19.850 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[79ddea18-c0e6-45eb-90fd-69e77e9cba00]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 09:59:19 np0005604215.localdomain podman[315252]: 2026-02-01 09:59:19.885204055 +0000 UTC m=+0.047491191 container died 6556794d4e5bf704e06bf800eff0a39e5cd58b47cd3e5cb402c8509b64928f0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4da937bf-f66e-48ce-bf66-f7d3d9f7bc52, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:59:19 np0005604215.localdomain podman[315252]: 2026-02-01 09:59:19.968688087 +0000 UTC m=+0.130975143 container cleanup 6556794d4e5bf704e06bf800eff0a39e5cd58b47cd3e5cb402c8509b64928f0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4da937bf-f66e-48ce-bf66-f7d3d9f7bc52, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 01 09:59:19 np0005604215.localdomain systemd[1]: libpod-conmon-6556794d4e5bf704e06bf800eff0a39e5cd58b47cd3e5cb402c8509b64928f0b.scope: Deactivated successfully.
Feb 01 09:59:19 np0005604215.localdomain podman[315254]: 2026-02-01 09:59:19.992148738 +0000 UTC m=+0.146075483 container remove 6556794d4e5bf704e06bf800eff0a39e5cd58b47cd3e5cb402c8509b64928f0b (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-4da937bf-f66e-48ce-bf66-f7d3d9f7bc52, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:59:20 np0005604215.localdomain ceph-mon[298604]: osdmap e209: 6 total, 6 up, 6 in
Feb 01 09:59:20 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "17417769-f267-43fd-88c0-c1785d065840", "format": "json"}]: dispatch
Feb 01 09:59:20 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "17417769-f267-43fd-88c0-c1785d065840", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:20 np0005604215.localdomain kernel: device tap3a91aa3a-fb left promiscuous mode
Feb 01 09:59:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:20.006 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e210 e210: 6 total, 6 up, 6 in
Feb 01 09:59:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:20.023 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:20 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:20.054 259225 INFO neutron.agent.dhcp.agent [None req-5dcb9013-3153-4a15-b221-e281c05a548d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:59:20 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 09:59:20.055 259225 INFO neutron.agent.dhcp.agent [None req-5dcb9013-3153-4a15-b221-e281c05a548d - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 09:59:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e210 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:20.242 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v422: 177 pgs: 177 active+clean; 146 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 34 KiB/s wr, 52 op/s
Feb 01 09:59:20 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-4383ecd4dd19cf8e2c67a94d44ef1d65db9bfee3a02ca20cd31b89d8b95a13e4-merged.mount: Deactivated successfully.
Feb 01 09:59:20 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6556794d4e5bf704e06bf800eff0a39e5cd58b47cd3e5cb402c8509b64928f0b-userdata-shm.mount: Deactivated successfully.
Feb 01 09:59:20 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d4da937bf\x2df66e\x2d48ce\x2dbf66\x2df7d3d9f7bc52.mount: Deactivated successfully.
Feb 01 09:59:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e211 e211: 6 total, 6 up, 6 in
Feb 01 09:59:21 np0005604215.localdomain ceph-mon[298604]: osdmap e210: 6 total, 6 up, 6 in
Feb 01 09:59:21 np0005604215.localdomain ceph-mon[298604]: pgmap v422: 177 pgs: 177 active+clean; 146 MiB data, 876 MiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 34 KiB/s wr, 52 op/s
Feb 01 09:59:21 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2523053405' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:21 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2523053405' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_09:59:21
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['backups', 'volumes', 'manila_data', 'vms', '.mgr', 'manila_metadata', 'images']
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 09:59:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:21.410 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 1.635783082077052e-06 of space, bias 1.0, pg target 0.0003266113553880514 quantized to 32 (current 32)
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 1.3631525683975433e-06 of space, bias 1.0, pg target 0.0002712673611111111 quantized to 32 (current 32)
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 1.0905220547180346e-06 of space, bias 1.0, pg target 0.00021701388888888888 quantized to 32 (current 32)
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 3.107987855946399e-05 of space, bias 4.0, pg target 0.024739583333333332 quantized to 16 (current 16)
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 09:59:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e212 e212: 6 total, 6 up, 6 in
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/.meta.tmp'
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/.meta.tmp' to config b'/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/.meta'
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "format": "json"}]: dispatch
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:22 np0005604215.localdomain ceph-mon[298604]: osdmap e211: 6 total, 6 up, 6 in
Feb 01 09:59:22 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/257626994' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:22 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/257626994' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:22 np0005604215.localdomain ceph-mon[298604]: osdmap e212: 6 total, 6 up, 6 in
Feb 01 09:59:22 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:59:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v425: 177 pgs: 177 active+clean; 146 MiB data, 876 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:59:22 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:22.538 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:59:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "format": "json"}]: dispatch
Feb 01 09:59:23 np0005604215.localdomain ceph-mon[298604]: pgmap v425: 177 pgs: 177 active+clean; 146 MiB data, 876 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:59:23 np0005604215.localdomain ceph-mgr[278126]: [devicehealth INFO root] Check health
Feb 01 09:59:24 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2418254129' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:24 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2418254129' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v426: 177 pgs: 177 active+clean; 146 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 109 KiB/s rd, 25 KiB/s wr, 152 op/s
Feb 01 09:59:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:59:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:59:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:59:24 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:59:24 np0005604215.localdomain podman[315284]: 2026-02-01 09:59:24.86069793 +0000 UTC m=+0.073231653 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 01 09:59:24 np0005604215.localdomain systemd[1]: tmp-crun.yBFP0U.mount: Deactivated successfully.
Feb 01 09:59:24 np0005604215.localdomain podman[315282]: 2026-02-01 09:59:24.918699598 +0000 UTC m=+0.132226823 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., distribution-scope=public, build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7)
Feb 01 09:59:24 np0005604215.localdomain podman[315282]: 2026-02-01 09:59:24.934617993 +0000 UTC m=+0.148145188 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9/ubi-minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 01 09:59:24 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:59:24 np0005604215.localdomain podman[315284]: 2026-02-01 09:59:24.972059191 +0000 UTC m=+0.184592934 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 09:59:24 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:59:25 np0005604215.localdomain podman[315283]: 2026-02-01 09:59:25.022027218 +0000 UTC m=+0.232046313 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:59:25 np0005604215.localdomain podman[315283]: 2026-02-01 09:59:25.027268091 +0000 UTC m=+0.237287176 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Feb 01 09:59:25 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:59:25 np0005604215.localdomain podman[315285]: 2026-02-01 09:59:25.075086042 +0000 UTC m=+0.280812053 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 09:59:25 np0005604215.localdomain podman[315285]: 2026-02-01 09:59:25.083275326 +0000 UTC m=+0.289001267 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:59:25 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:59:25 np0005604215.localdomain ceph-mon[298604]: pgmap v426: 177 pgs: 177 active+clean; 146 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 109 KiB/s rd, 25 KiB/s wr, 152 op/s
Feb 01 09:59:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e212 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve49", "tenant_id": "9d23e4ae23d44fac9f67906e518759ed", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 09:59:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve49, format:json, prefix:fs subvolume authorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, tenant_id:9d23e4ae23d44fac9f67906e518759ed, vol_name:cephfs) < ""
Feb 01 09:59:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve49", "format": "json"} v 0)
Feb 01 09:59:25 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Feb 01 09:59:25 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID eve49 with tenant 9d23e4ae23d44fac9f67906e518759ed
Feb 01 09:59:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} v 0)
Feb 01 09:59:25 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 09:59:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve49, format:json, prefix:fs subvolume authorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, tenant_id:9d23e4ae23d44fac9f67906e518759ed, vol_name:cephfs) < ""
Feb 01 09:59:26 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve49", "tenant_id": "9d23e4ae23d44fac9f67906e518759ed", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 09:59:26 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Feb 01 09:59:26 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 09:59:26 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 09:59:26 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve49", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"}]': finished
Feb 01 09:59:26 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3639167572' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:26 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3639167572' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v427: 177 pgs: 177 active+clean; 146 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 93 KiB/s rd, 22 KiB/s wr, 129 op/s
Feb 01 09:59:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:26.448 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "96e32855-572c-434b-9f41-cf83f652dd08", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:59:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:96e32855-572c-434b-9f41-cf83f652dd08, vol_name:cephfs) < ""
Feb 01 09:59:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/96e32855-572c-434b-9f41-cf83f652dd08/.meta.tmp'
Feb 01 09:59:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/96e32855-572c-434b-9f41-cf83f652dd08/.meta.tmp' to config b'/volumes/_nogroup/96e32855-572c-434b-9f41-cf83f652dd08/.meta'
Feb 01 09:59:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:96e32855-572c-434b-9f41-cf83f652dd08, vol_name:cephfs) < ""
Feb 01 09:59:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "96e32855-572c-434b-9f41-cf83f652dd08", "format": "json"}]: dispatch
Feb 01 09:59:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:96e32855-572c-434b-9f41-cf83f652dd08, vol_name:cephfs) < ""
Feb 01 09:59:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:96e32855-572c-434b-9f41-cf83f652dd08, vol_name:cephfs) < ""
Feb 01 09:59:26 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e213 e213: 6 total, 6 up, 6 in
Feb 01 09:59:27 np0005604215.localdomain ceph-mon[298604]: pgmap v427: 177 pgs: 177 active+clean; 146 MiB data, 881 MiB used, 41 GiB / 42 GiB avail; 93 KiB/s rd, 22 KiB/s wr, 129 op/s
Feb 01 09:59:27 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:59:27 np0005604215.localdomain ceph-mon[298604]: osdmap e213: 6 total, 6 up, 6 in
Feb 01 09:59:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:27.542 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:27 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:27.938 2 INFO neutron.agent.securitygroups_rpc [None req-1f1f07c2-7f26-4f08-b340-54cc58b1fa0f ce67f2e1bfb142d8acccf95caf1fd7af ab1e856df66342919053583b6afafe11 - - default default] Security group member updated ['0924a62e-9fd4-48bb-ad08-68af324d32a1']
Feb 01 09:59:28 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "96e32855-572c-434b-9f41-cf83f652dd08", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:59:28 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "96e32855-572c-434b-9f41-cf83f652dd08", "format": "json"}]: dispatch
Feb 01 09:59:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e214 e214: 6 total, 6 up, 6 in
Feb 01 09:59:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v430: 177 pgs: 177 active+clean; 146 MiB data, 885 MiB used, 41 GiB / 42 GiB avail; 129 KiB/s rd, 41 KiB/s wr, 185 op/s
Feb 01 09:59:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve48", "tenant_id": "9d23e4ae23d44fac9f67906e518759ed", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 09:59:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve48, format:json, prefix:fs subvolume authorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, tenant_id:9d23e4ae23d44fac9f67906e518759ed, vol_name:cephfs) < ""
Feb 01 09:59:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve48", "format": "json"} v 0)
Feb 01 09:59:28 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Feb 01 09:59:28 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID eve48 with tenant 9d23e4ae23d44fac9f67906e518759ed
Feb 01 09:59:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} v 0)
Feb 01 09:59:28 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 09:59:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve48, format:json, prefix:fs subvolume authorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, tenant_id:9d23e4ae23d44fac9f67906e518759ed, vol_name:cephfs) < ""
Feb 01 09:59:29 np0005604215.localdomain ceph-mon[298604]: osdmap e214: 6 total, 6 up, 6 in
Feb 01 09:59:29 np0005604215.localdomain ceph-mon[298604]: pgmap v430: 177 pgs: 177 active+clean; 146 MiB data, 885 MiB used, 41 GiB / 42 GiB avail; 129 KiB/s rd, 41 KiB/s wr, 185 op/s
Feb 01 09:59:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Feb 01 09:59:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 09:59:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 09:59:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve48", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"}]': finished
Feb 01 09:59:29 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/104439940' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:29 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/104439940' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e215 e215: 6 total, 6 up, 6 in
Feb 01 09:59:29 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 01 09:59:30 np0005604215.localdomain podman[236852]: time="2026-02-01T09:59:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 09:59:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:59:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 09:59:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:09:59:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18302 "" "Go-http-client/1.1"
Feb 01 09:59:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e215 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve48", "tenant_id": "9d23e4ae23d44fac9f67906e518759ed", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 09:59:30 np0005604215.localdomain ceph-mon[298604]: osdmap e215: 6 total, 6 up, 6 in
Feb 01 09:59:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e216 e216: 6 total, 6 up, 6 in
Feb 01 09:59:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v433: 177 pgs: 177 active+clean; 146 MiB data, 885 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 32 KiB/s wr, 98 op/s
Feb 01 09:59:31 np0005604215.localdomain ceph-mon[298604]: osdmap e216: 6 total, 6 up, 6 in
Feb 01 09:59:31 np0005604215.localdomain ceph-mon[298604]: pgmap v433: 177 pgs: 177 active+clean; 146 MiB data, 885 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 32 KiB/s wr, 98 op/s
Feb 01 09:59:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e217 e217: 6 total, 6 up, 6 in
Feb 01 09:59:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:31.452 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:59:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 09:59:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:59:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   09:59:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 09:59:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: osdmap e217: 6 total, 6 up, 6 in
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e218 e218: 6 total, 6 up, 6 in
Feb 01 09:59:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v436: 177 pgs: 177 active+clean; 146 MiB data, 885 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2555453464' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2555453464' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:32.543 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2037329309' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2037329309' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve48", "format": "json"}]: dispatch
Feb 01 09:59:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:eve48, format:json, prefix:fs subvolume deauthorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve48", "format": "json"} v 0)
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve48"} v 0)
Feb 01 09:59:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Feb 01 09:59:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:eve48, format:json, prefix:fs subvolume deauthorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve48", "format": "json"}]: dispatch
Feb 01 09:59:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:eve48, format:json, prefix:fs subvolume evict, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=eve48, client_metadata.root=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b
Feb 01 09:59:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 09:59:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:eve48, format:json, prefix:fs subvolume evict, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:33 np0005604215.localdomain neutron_sriov_agent[252054]: 2026-02-01 09:59:33.151 2 INFO neutron.agent.securitygroups_rpc [None req-842f7c09-bc9b-4f83-bb5e-50b631488f24 ce67f2e1bfb142d8acccf95caf1fd7af ab1e856df66342919053583b6afafe11 - - default default] Security group member updated ['0924a62e-9fd4-48bb-ad08-68af324d32a1']
Feb 01 09:59:33 np0005604215.localdomain ceph-mon[298604]: osdmap e218: 6 total, 6 up, 6 in
Feb 01 09:59:33 np0005604215.localdomain ceph-mon[298604]: pgmap v436: 177 pgs: 177 active+clean; 146 MiB data, 885 MiB used, 41 GiB / 42 GiB avail
Feb 01 09:59:33 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2555453464' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:33 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2555453464' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:33 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2037329309' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:33 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2037329309' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Feb 01 09:59:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve48", "format": "json"} : dispatch
Feb 01 09:59:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve48"} : dispatch
Feb 01 09:59:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.eve48"}]': finished
Feb 01 09:59:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "96e32855-572c-434b-9f41-cf83f652dd08", "format": "json"}]: dispatch
Feb 01 09:59:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:96e32855-572c-434b-9f41-cf83f652dd08, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:59:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:96e32855-572c-434b-9f41-cf83f652dd08, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:59:34 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:59:34.111+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '96e32855-572c-434b-9f41-cf83f652dd08' of type subvolume
Feb 01 09:59:34 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '96e32855-572c-434b-9f41-cf83f652dd08' of type subvolume
Feb 01 09:59:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "96e32855-572c-434b-9f41-cf83f652dd08", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:96e32855-572c-434b-9f41-cf83f652dd08, vol_name:cephfs) < ""
Feb 01 09:59:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/96e32855-572c-434b-9f41-cf83f652dd08'' moved to trashcan
Feb 01 09:59:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 09:59:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:96e32855-572c-434b-9f41-cf83f652dd08, vol_name:cephfs) < ""
Feb 01 09:59:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e219 e219: 6 total, 6 up, 6 in
Feb 01 09:59:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v438: 177 pgs: 177 active+clean; 146 MiB data, 908 MiB used, 41 GiB / 42 GiB avail; 267 KiB/s rd, 40 KiB/s wr, 366 op/s
Feb 01 09:59:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve48", "format": "json"}]: dispatch
Feb 01 09:59:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve48", "format": "json"}]: dispatch
Feb 01 09:59:34 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1270358567' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:34 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1270358567' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e219 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "96e32855-572c-434b-9f41-cf83f652dd08", "format": "json"}]: dispatch
Feb 01 09:59:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "96e32855-572c-434b-9f41-cf83f652dd08", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:35 np0005604215.localdomain ceph-mon[298604]: osdmap e219: 6 total, 6 up, 6 in
Feb 01 09:59:35 np0005604215.localdomain ceph-mon[298604]: pgmap v438: 177 pgs: 177 active+clean; 146 MiB data, 908 MiB used, 41 GiB / 42 GiB avail; 267 KiB/s rd, 40 KiB/s wr, 366 op/s
Feb 01 09:59:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3616609875' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3616609875' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:35 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 09:59:35 np0005604215.localdomain systemd[1]: tmp-crun.1ohAsJ.mount: Deactivated successfully.
Feb 01 09:59:35 np0005604215.localdomain podman[315369]: 2026-02-01 09:59:35.866634026 +0000 UTC m=+0.078159647 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Feb 01 09:59:35 np0005604215.localdomain podman[315369]: 2026-02-01 09:59:35.879734844 +0000 UTC m=+0.091260515 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 09:59:35 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 09:59:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve47", "tenant_id": "9d23e4ae23d44fac9f67906e518759ed", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 09:59:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve47, format:json, prefix:fs subvolume authorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, tenant_id:9d23e4ae23d44fac9f67906e518759ed, vol_name:cephfs) < ""
Feb 01 09:59:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve47", "format": "json"} v 0)
Feb 01 09:59:36 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Feb 01 09:59:36 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID eve47 with tenant 9d23e4ae23d44fac9f67906e518759ed
Feb 01 09:59:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} v 0)
Feb 01 09:59:36 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 09:59:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:eve47, format:json, prefix:fs subvolume authorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, tenant_id:9d23e4ae23d44fac9f67906e518759ed, vol_name:cephfs) < ""
Feb 01 09:59:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v439: 177 pgs: 177 active+clean; 146 MiB data, 908 MiB used, 41 GiB / 42 GiB avail; 183 KiB/s rd, 28 KiB/s wr, 250 op/s
Feb 01 09:59:36 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/371474734' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:36 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/371474734' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Feb 01 09:59:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 09:59:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 09:59:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.eve47", "caps": ["mds", "allow rw path=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b", "osd", "allow rw pool=manila_data namespace=fsvolumens_9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "mon", "allow r"], "format": "json"}]': finished
Feb 01 09:59:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:36.459 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e220 e220: 6 total, 6 up, 6 in
Feb 01 09:59:36 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 01 09:59:37 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve47", "tenant_id": "9d23e4ae23d44fac9f67906e518759ed", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 09:59:37 np0005604215.localdomain ceph-mon[298604]: pgmap v439: 177 pgs: 177 active+clean; 146 MiB data, 908 MiB used, 41 GiB / 42 GiB avail; 183 KiB/s rd, 28 KiB/s wr, 250 op/s
Feb 01 09:59:37 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2908926970' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:37 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2908926970' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:37 np0005604215.localdomain ceph-mon[298604]: osdmap e220: 6 total, 6 up, 6 in
Feb 01 09:59:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:37.584 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v441: 177 pgs: 177 active+clean; 146 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 267 KiB/s rd, 65 KiB/s wr, 374 op/s
Feb 01 09:59:38 np0005604215.localdomain ceph-mon[298604]: pgmap v441: 177 pgs: 177 active+clean; 146 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 267 KiB/s rd, 65 KiB/s wr, 374 op/s
Feb 01 09:59:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 09:59:38 np0005604215.localdomain podman[315388]: 2026-02-01 09:59:38.867990695 +0000 UTC m=+0.079753226 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 09:59:38 np0005604215.localdomain podman[315388]: 2026-02-01 09:59:38.880392012 +0000 UTC m=+0.092154583 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 09:59:38 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 09:59:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve47", "format": "json"}]: dispatch
Feb 01 09:59:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:eve47, format:json, prefix:fs subvolume deauthorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve47", "format": "json"} v 0)
Feb 01 09:59:39 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Feb 01 09:59:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve47"} v 0)
Feb 01 09:59:39 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Feb 01 09:59:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:eve47, format:json, prefix:fs subvolume deauthorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve47", "format": "json"}]: dispatch
Feb 01 09:59:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:eve47, format:json, prefix:fs subvolume evict, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:39 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Feb 01 09:59:39 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve47", "format": "json"} : dispatch
Feb 01 09:59:39 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve47"} : dispatch
Feb 01 09:59:39 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.eve47"}]': finished
Feb 01 09:59:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=eve47, client_metadata.root=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b
Feb 01 09:59:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 09:59:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:eve47, format:json, prefix:fs subvolume evict, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e220 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:40 np0005604215.localdomain sudo[315414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:59:40 np0005604215.localdomain sudo[315414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:59:40 np0005604215.localdomain sudo[315414]: pam_unix(sudo:session): session closed for user root
Feb 01 09:59:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v442: 177 pgs: 177 active+clean; 146 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 202 KiB/s rd, 49 KiB/s wr, 283 op/s
Feb 01 09:59:40 np0005604215.localdomain sudo[315432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --image registry.redhat.io/rhceph/rhceph-7-rhel9:latest --timeout 895 ls
Feb 01 09:59:40 np0005604215.localdomain sudo[315432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve47", "format": "json"}]: dispatch
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve47", "format": "json"}]: dispatch
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: pgmap v442: 177 pgs: 177 active+clean; 146 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 202 KiB/s rd, 49 KiB/s wr, 283 op/s
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #34. Immutable memtables: 0.
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:40.938633) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 17] Flushing memtable with next log file: 34
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939980938742, "job": 17, "event": "flush_started", "num_memtables": 1, "num_entries": 2827, "num_deletes": 276, "total_data_size": 5484590, "memory_usage": 5668256, "flush_reason": "Manual Compaction"}
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 17] Level-0 flush table #35: started
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939980960021, "cf_name": "default", "job": 17, "event": "table_file_creation", "file_number": 35, "file_size": 3567095, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 22399, "largest_seqno": 25220, "table_properties": {"data_size": 3556054, "index_size": 7098, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2885, "raw_key_size": 25594, "raw_average_key_size": 22, "raw_value_size": 3533067, "raw_average_value_size": 3074, "num_data_blocks": 301, "num_entries": 1149, "num_filter_entries": 1149, "num_deletions": 276, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939851, "oldest_key_time": 1769939851, "file_creation_time": 1769939980, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 35, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 17] Flush lasted 21447 microseconds, and 10515 cpu microseconds.
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:40.960091) [db/flush_job.cc:967] [default] [JOB 17] Level-0 flush table #35: 3567095 bytes OK
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:40.960120) [db/memtable_list.cc:519] [default] Level-0 commit table #35 started
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:40.962144) [db/memtable_list.cc:722] [default] Level-0 commit table #35: memtable #1 done
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:40.962167) EVENT_LOG_v1 {"time_micros": 1769939980962161, "job": 17, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:40.962191) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 17] Try to delete WAL files size 5471569, prev total WAL file size 5471569, number of live WAL files 2.
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000031.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:40.963500) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132303438' seq:72057594037927935, type:22 .. '7061786F73003132333030' seq:0, type:0; will stop at (end)
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 18] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 17 Base level 0, inputs: [35(3483KB)], [33(18MB)]
Feb 01 09:59:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939980963555, "job": 18, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [35], "files_L6": [33], "score": -1, "input_data_size": 23027899, "oldest_snapshot_seqno": -1}
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 18] Generated table #36: 13243 keys, 21758735 bytes, temperature: kUnknown
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939981118671, "cf_name": "default", "job": 18, "event": "table_file_creation", "file_number": 36, "file_size": 21758735, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21681473, "index_size": 43049, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 33157, "raw_key_size": 355091, "raw_average_key_size": 26, "raw_value_size": 21454201, "raw_average_value_size": 1620, "num_data_blocks": 1623, "num_entries": 13243, "num_filter_entries": 13243, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769939980, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 36, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:41.119011) [db/compaction/compaction_job.cc:1663] [default] [JOB 18] Compacted 1@0 + 1@6 files to L6 => 21758735 bytes
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:41.120774) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 148.4 rd, 140.2 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(3.4, 18.6 +0.0 blob) out(20.8 +0.0 blob), read-write-amplify(12.6) write-amplify(6.1) OK, records in: 13802, records dropped: 559 output_compression: NoCompression
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:41.120802) EVENT_LOG_v1 {"time_micros": 1769939981120789, "job": 18, "event": "compaction_finished", "compaction_time_micros": 155204, "compaction_time_cpu_micros": 55427, "output_level": 6, "num_output_files": 1, "total_output_size": 21758735, "num_input_records": 13802, "num_output_records": 13243, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000035.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939981121404, "job": 18, "event": "table_file_deletion", "file_number": 35}
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000033.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939981123868, "job": 18, "event": "table_file_deletion", "file_number": 33}
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:40.963447) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:41.124084) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:41.124095) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:41.124099) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:41.124103) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:41.124106) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:41 np0005604215.localdomain podman[315522]: 2026-02-01 09:59:41.189219307 +0000 UTC m=+0.094416484 container exec 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, distribution-scope=public, io.openshift.tags=rhceph ceph, version=7, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vcs-type=git, RELEASE=main, vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, name=rhceph, io.k8s.description=Red Hat Ceph Storage 7, CEPH_POINT_RELEASE=, release=1764794109, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, GIT_CLEAN=True, org.opencontainers.image.created=2025-12-08T17:28:53Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., build-date=2025-12-08T17:28:53Z, description=Red Hat Ceph Storage 7, com.redhat.component=rhceph-container, architecture=x86_64, GIT_REPO=https://github.com/ceph/ceph-container.git, ceph=True, io.openshift.expose-services=, GIT_BRANCH=main, io.buildah.version=1.41.4, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 09:59:41 np0005604215.localdomain podman[315522]: 2026-02-01 09:59:41.322886772 +0000 UTC m=+0.228083919 container exec_died 39a3032afbb342ba7bd100116836fc9c218c1425256845645b2093290e19b07a (image=registry.redhat.io/rhceph/rhceph-7-rhel9:latest, name=ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-crash-np0005604215, io.openshift.expose-services=, summary=Provides the latest Red Hat Ceph Storage 7 on RHEL 9 in a fully featured and supported base image., vcs-ref=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, distribution-scope=public, vendor=Red Hat, Inc., GIT_BRANCH=main, name=rhceph, CEPH_POINT_RELEASE=, org.opencontainers.image.revision=688f299ec3c1ef405209c1d311a7fbd93d01b6f5, org.opencontainers.image.created=2025-12-08T17:28:53Z, GIT_COMMIT=12717c0777377369ea674892da98b0d85250f5b0, description=Red Hat Ceph Storage 7, io.buildah.version=1.41.4, maintainer=Guillaume Abrioux <gabrioux@redhat.com>, build-date=2025-12-08T17:28:53Z, ceph=True, release=1764794109, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, RELEASE=main, version=7, vcs-type=git, com.redhat.component=rhceph-container, io.k8s.description=Red Hat Ceph Storage 7, GIT_REPO=https://github.com/ceph/ceph-container.git, GIT_CLEAN=True, io.k8s.display-name=Red Hat Ceph Storage 7 on RHEL 9, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.openshift.tags=rhceph ceph)
Feb 01 09:59:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:41.460 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:41.776 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:59:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:41.777 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:59:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:41.777 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e221 e221: 6 total, 6 up, 6 in
Feb 01 09:59:41 np0005604215.localdomain sudo[315432]: pam_unix(sudo:session): session closed for user root
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 09:59:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 09:59:42 np0005604215.localdomain sudo[315644]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 09:59:42 np0005604215.localdomain sudo[315644]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:59:42 np0005604215.localdomain sudo[315644]: pam_unix(sudo:session): session closed for user root
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 09:59:42 np0005604215.localdomain sudo[315662]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 09:59:42 np0005604215.localdomain sudo[315662]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v444: 177 pgs: 177 active+clean; 146 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 29 KiB/s wr, 95 op/s
Feb 01 09:59:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:42.585 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:42 np0005604215.localdomain sudo[315662]: pam_unix(sudo:session): session closed for user root
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: osdmap e221: 6 total, 6 up, 6 in
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: pgmap v444: 177 pgs: 177 active+clean; 146 MiB data, 902 MiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 29 KiB/s wr, 95 op/s
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO root] Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO root] Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: [cephadm INFO root] Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [INF] : Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config set, name=osd_memory_target}] v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: [cephadm WARNING cephadm.serve] Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:59:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cephadm) log [WRN] : Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:59:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 09:59:43 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev f3a89213-8821-4f90-884a-6194ac204334 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:59:43 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev f3a89213-8821-4f90-884a-6194ac204334 (Updating node-proxy deployment (+3 -> 3))
Feb 01 09:59:43 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event f3a89213-8821-4f90-884a-6194ac204334 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain sudo[315711]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 09:59:43 np0005604215.localdomain sudo[315711]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 09:59:43 np0005604215.localdomain sudo[315711]: pam_unix(sudo:session): session closed for user root
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.2", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.5", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604215.localdomain to 836.6M
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604215.localdomain to 877243801: error parsing value: Value '877243801' is below minimum 939524096
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.0", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.1", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.4", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config rm", "who": "osd.3", "name": "osd_memory_target"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604213.localdomain to 836.6M
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: Adjusting osd_memory_target on np0005604212.localdomain to 836.6M
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604212.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: Unable to set osd_memory_target on np0005604213.localdomain to 877246668: error parsing value: Value '877246668' is below minimum 939524096
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:59:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve49", "format": "json"}]: dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:eve49, format:json, prefix:fs subvolume deauthorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.eve49", "format": "json"} v 0)
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.eve49"} v 0)
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:eve49, format:json, prefix:fs subvolume deauthorize, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve49", "format": "json"}]: dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:eve49, format:json, prefix:fs subvolume evict, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=eve49, client_metadata.root=/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e/cf2be5d6-9de6-4d4b-bfd9-aa075972364b
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:eve49, format:json, prefix:fs subvolume evict, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v445: 177 pgs: 177 active+clean; 146 MiB data, 920 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 40 KiB/s wr, 113 op/s
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "format": "json"}]: dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '9ef26968-45d9-4b40-a5b1-d54b2ff71a2e' of type subvolume
Feb 01 09:59:44 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T09:59:44.451+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '9ef26968-45d9-4b40-a5b1-d54b2ff71a2e' of type subvolume
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/9ef26968-45d9-4b40-a5b1-d54b2ff71a2e'' moved to trashcan
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 09:59:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:9ef26968-45d9-4b40-a5b1-d54b2ff71a2e, vol_name:cephfs) < ""
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/671789745' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/671789745' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3868590087' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3868590087' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve49", "format": "json"}]: dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.eve49", "format": "json"} : dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.eve49"} : dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.eve49"}]': finished
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "auth_id": "eve49", "format": "json"}]: dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: pgmap v445: 177 pgs: 177 active+clean; 146 MiB data, 920 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 40 KiB/s wr, 113 op/s
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/671789745' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:44 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/671789745' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e221 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:59:45 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2006111227' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:59:45 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2006111227' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e222 e222: 6 total, 6 up, 6 in
Feb 01 09:59:46 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "format": "json"}]: dispatch
Feb 01 09:59:46 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9ef26968-45d9-4b40-a5b1-d54b2ff71a2e", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:46 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2006111227' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:46 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2006111227' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v447: 177 pgs: 177 active+clean; 146 MiB data, 920 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 11 KiB/s wr, 17 op/s
Feb 01 09:59:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:46.489 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:46 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 09:59:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 09:59:47 np0005604215.localdomain ceph-mon[298604]: osdmap e222: 6 total, 6 up, 6 in
Feb 01 09:59:47 np0005604215.localdomain ceph-mon[298604]: pgmap v447: 177 pgs: 177 active+clean; 146 MiB data, 920 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 11 KiB/s wr, 17 op/s
Feb 01 09:59:47 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 09:59:47 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e223 e223: 6 total, 6 up, 6 in
Feb 01 09:59:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:47.626 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v449: 177 pgs: 177 active+clean; 193 MiB data, 1006 MiB used, 41 GiB / 42 GiB avail; 3.2 MiB/s rd, 3.3 MiB/s wr, 144 op/s
Feb 01 09:59:48 np0005604215.localdomain ceph-mon[298604]: osdmap e223: 6 total, 6 up, 6 in
Feb 01 09:59:49 np0005604215.localdomain ceph-mon[298604]: pgmap v449: 177 pgs: 177 active+clean; 193 MiB data, 1006 MiB used, 41 GiB / 42 GiB avail; 3.2 MiB/s rd, 3.3 MiB/s wr, 144 op/s
Feb 01 09:59:49 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2512265319' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:49 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2512265319' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:49 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e224 e224: 6 total, 6 up, 6 in
Feb 01 09:59:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e224 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v451: 177 pgs: 177 active+clean; 193 MiB data, 1006 MiB used, 41 GiB / 42 GiB avail; 86 KiB/s rd, 3.6 MiB/s wr, 133 op/s
Feb 01 09:59:50 np0005604215.localdomain ceph-mon[298604]: osdmap e224: 6 total, 6 up, 6 in
Feb 01 09:59:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1766604859' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:50 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1766604859' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e225 e225: 6 total, 6 up, 6 in
Feb 01 09:59:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:59:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 09:59:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/6010311e-11a6-4c95-b3ff-674156fa7f2b/.meta.tmp'
Feb 01 09:59:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/6010311e-11a6-4c95-b3ff-674156fa7f2b/.meta.tmp' to config b'/volumes/_nogroup/6010311e-11a6-4c95-b3ff-674156fa7f2b/.meta'
Feb 01 09:59:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 09:59:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "format": "json"}]: dispatch
Feb 01 09:59:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 09:59:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: pgmap v451: 177 pgs: 177 active+clean; 193 MiB data, 1006 MiB used, 41 GiB / 42 GiB avail; 86 KiB/s rd, 3.6 MiB/s wr, 133 op/s
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: osdmap e225: 6 total, 6 up, 6 in
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3543788049' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3543788049' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "format": "json"}]: dispatch
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 09:59:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:59:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:59:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:59:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:59:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 09:59:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 09:59:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:51.535 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #37. Immutable memtables: 0.
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.824675) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 19] Flushing memtable with next log file: 37
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991824750, "job": 19, "event": "flush_started", "num_memtables": 1, "num_entries": 573, "num_deletes": 258, "total_data_size": 656164, "memory_usage": 668112, "flush_reason": "Manual Compaction"}
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 19] Level-0 flush table #38: started
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991832021, "cf_name": "default", "job": 19, "event": "table_file_creation", "file_number": 38, "file_size": 430704, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25225, "largest_seqno": 25793, "table_properties": {"data_size": 427432, "index_size": 1127, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1093, "raw_key_size": 8498, "raw_average_key_size": 20, "raw_value_size": 420453, "raw_average_value_size": 1015, "num_data_blocks": 44, "num_entries": 414, "num_filter_entries": 414, "num_deletions": 258, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939981, "oldest_key_time": 1769939981, "file_creation_time": 1769939991, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 38, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 19] Flush lasted 7385 microseconds, and 2525 cpu microseconds.
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.832070) [db/flush_job.cc:967] [default] [JOB 19] Level-0 flush table #38: 430704 bytes OK
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.832092) [db/memtable_list.cc:519] [default] Level-0 commit table #38 started
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.834197) [db/memtable_list.cc:722] [default] Level-0 commit table #38: memtable #1 done
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.834222) EVENT_LOG_v1 {"time_micros": 1769939991834215, "job": 19, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.834245) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 19] Try to delete WAL files size 652675, prev total WAL file size 652999, number of live WAL files 2.
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000034.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.835359) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034303133' seq:72057594037927935, type:22 .. '6C6F676D0034323635' seq:0, type:0; will stop at (end)
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 20] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 19 Base level 0, inputs: [38(420KB)], [36(20MB)]
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991835436, "job": 20, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [38], "files_L6": [36], "score": -1, "input_data_size": 22189439, "oldest_snapshot_seqno": -1}
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 20] Generated table #39: 13112 keys, 21455668 bytes, temperature: kUnknown
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991960006, "cf_name": "default", "job": 20, "event": "table_file_creation", "file_number": 39, "file_size": 21455668, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21380240, "index_size": 41535, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 32837, "raw_key_size": 353602, "raw_average_key_size": 26, "raw_value_size": 21156178, "raw_average_value_size": 1613, "num_data_blocks": 1548, "num_entries": 13112, "num_filter_entries": 13112, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769939991, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 39, "seqno_to_time_mapping": "N/A"}}
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.960414) [db/compaction/compaction_job.cc:1663] [default] [JOB 20] Compacted 1@0 + 1@6 files to L6 => 21455668 bytes
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.962368) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 178.0 rd, 172.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.4, 20.8 +0.0 blob) out(20.5 +0.0 blob), read-write-amplify(101.3) write-amplify(49.8) OK, records in: 13657, records dropped: 545 output_compression: NoCompression
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.962399) EVENT_LOG_v1 {"time_micros": 1769939991962386, "job": 20, "event": "compaction_finished", "compaction_time_micros": 124659, "compaction_time_cpu_micros": 56834, "output_level": 6, "num_output_files": 1, "total_output_size": 21455668, "num_input_records": 13657, "num_output_records": 13112, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000038.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991962613, "job": 20, "event": "table_file_deletion", "file_number": 38}
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000036.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769939991965426, "job": 20, "event": "table_file_deletion", "file_number": 36}
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.835238) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.965462) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.965467) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.965472) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.965476) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:51 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-09:59:51.965479) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 09:59:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:52.095 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:59:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v453: 177 pgs: 177 active+clean; 193 MiB data, 1006 MiB used, 41 GiB / 42 GiB avail; 86 KiB/s rd, 3.6 MiB/s wr, 133 op/s
Feb 01 09:59:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:52.661 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:52 np0005604215.localdomain ceph-mon[298604]: pgmap v453: 177 pgs: 177 active+clean; 193 MiB data, 1006 MiB used, 41 GiB / 42 GiB avail; 86 KiB/s rd, 3.6 MiB/s wr, 133 op/s
Feb 01 09:59:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e226 e226: 6 total, 6 up, 6 in
Feb 01 09:59:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3187705350' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3187705350' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:54.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:59:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:54.099 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 09:59:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:54.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 09:59:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:54.122 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 09:59:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "snap_name": "967f54ee-5f61-45c6-877b-9621e93b6257", "format": "json"}]: dispatch
Feb 01 09:59:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:967f54ee-5f61-45c6-877b-9621e93b6257, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 09:59:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:967f54ee-5f61-45c6-877b-9621e93b6257, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 09:59:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v455: 177 pgs: 177 active+clean; 193 MiB data, 990 MiB used, 41 GiB / 42 GiB avail; 148 KiB/s rd, 13 KiB/s wr, 198 op/s
Feb 01 09:59:54 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:54.444 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 09:59:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:54.445 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:54 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 09:59:54.446 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 09:59:54 np0005604215.localdomain ceph-mon[298604]: osdmap e226: 6 total, 6 up, 6 in
Feb 01 09:59:54 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "snap_name": "967f54ee-5f61-45c6-877b-9621e93b6257", "format": "json"}]: dispatch
Feb 01 09:59:54 np0005604215.localdomain ceph-mon[298604]: pgmap v455: 177 pgs: 177 active+clean; 193 MiB data, 990 MiB used, 41 GiB / 42 GiB avail; 148 KiB/s rd, 13 KiB/s wr, 198 op/s
Feb 01 09:59:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:55.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:59:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:55.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:59:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:55.130 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:59:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:55.130 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:59:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:55.131 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:59:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:55.131 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 09:59:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:55.131 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:59:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e226 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 09:59:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:59:55 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4277162567' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:59:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:55.587 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.456s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:59:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/4277162567' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:59:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2928283604' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:59:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 09:59:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 09:59:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 09:59:55 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 09:59:56 np0005604215.localdomain systemd[1]: tmp-crun.cpUAhx.mount: Deactivated successfully.
Feb 01 09:59:56 np0005604215.localdomain podman[315753]: 2026-02-01 09:59:56.047527206 +0000 UTC m=+0.081805702 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 01 09:59:56 np0005604215.localdomain podman[315754]: 2026-02-01 09:59:56.066142515 +0000 UTC m=+0.093985980 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2)
Feb 01 09:59:56 np0005604215.localdomain podman[315753]: 2026-02-01 09:59:56.126423334 +0000 UTC m=+0.160701860 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 09:59:56 np0005604215.localdomain podman[315752]: 2026-02-01 09:59:56.138531121 +0000 UTC m=+0.172064173 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, distribution-scope=public, vendor=Red Hat, Inc., version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, release=1769056855, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, container_name=openstack_network_exporter, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 
'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible)
Feb 01 09:59:56 np0005604215.localdomain podman[315754]: 2026-02-01 09:59:56.141903776 +0000 UTC m=+0.169747241 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.150 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.152 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11549MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.152 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.152 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 09:59:56 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 09:59:56 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 09:59:56 np0005604215.localdomain podman[315752]: 2026-02-01 09:59:56.217919015 +0000 UTC m=+0.251452137 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.openshift.expose-services=, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, container_name=openstack_network_exporter, version=9.7)
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.223 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.225 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 09:59:56 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.243 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 09:59:56 np0005604215.localdomain podman[315759]: 2026-02-01 09:59:56.221791316 +0000 UTC m=+0.246744560 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 09:59:56 np0005604215.localdomain podman[315759]: 2026-02-01 09:59:56.304783552 +0000 UTC m=+0.329736826 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 09:59:56 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 09:59:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v456: 177 pgs: 177 active+clean; 193 MiB data, 990 MiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 12 KiB/s wr, 171 op/s
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.537 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 09:59:56 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/345388689' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.684 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.441s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.691 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.707 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.709 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 09:59:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:56.710 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.557s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 09:59:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e227 e227: 6 total, 6 up, 6 in
Feb 01 09:59:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/32702901' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/32702901' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:56 np0005604215.localdomain ceph-mon[298604]: pgmap v456: 177 pgs: 177 active+clean; 193 MiB data, 990 MiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 12 KiB/s wr, 171 op/s
Feb 01 09:59:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/740724679' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:59:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/345388689' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 09:59:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1114050395' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1114050395' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:56 np0005604215.localdomain ceph-mon[298604]: osdmap e227: 6 total, 6 up, 6 in
Feb 01 09:59:57 np0005604215.localdomain systemd[1]: tmp-crun.9byqj1.mount: Deactivated successfully.
Feb 01 09:59:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:57.663 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 09:59:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:57.710 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:59:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:57.710 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:59:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:57.711 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:59:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:57.711 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 09:59:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e228 e228: 6 total, 6 up, 6 in
Feb 01 09:59:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 09:59:58.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 09:59:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 09:59:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/142970444' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 09:59:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/142970444' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "snap_name": "967f54ee-5f61-45c6-877b-9621e93b6257_d3fd438f-2f2b-4121-930e-8bd318b9b3ac", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:967f54ee-5f61-45c6-877b-9621e93b6257_d3fd438f-2f2b-4121-930e-8bd318b9b3ac, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 09:59:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/6010311e-11a6-4c95-b3ff-674156fa7f2b/.meta.tmp'
Feb 01 09:59:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/6010311e-11a6-4c95-b3ff-674156fa7f2b/.meta.tmp' to config b'/volumes/_nogroup/6010311e-11a6-4c95-b3ff-674156fa7f2b/.meta'
Feb 01 09:59:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:967f54ee-5f61-45c6-877b-9621e93b6257_d3fd438f-2f2b-4121-930e-8bd318b9b3ac, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 09:59:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "snap_name": "967f54ee-5f61-45c6-877b-9621e93b6257", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:967f54ee-5f61-45c6-877b-9621e93b6257, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 09:59:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v459: 177 pgs: 177 active+clean; 193 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 29 KiB/s wr, 370 op/s
Feb 01 09:59:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/6010311e-11a6-4c95-b3ff-674156fa7f2b/.meta.tmp'
Feb 01 09:59:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/6010311e-11a6-4c95-b3ff-674156fa7f2b/.meta.tmp' to config b'/volumes/_nogroup/6010311e-11a6-4c95-b3ff-674156fa7f2b/.meta'
Feb 01 09:59:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:967f54ee-5f61-45c6-877b-9621e93b6257, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 09:59:58 np0005604215.localdomain ceph-mon[298604]: osdmap e228: 6 total, 6 up, 6 in
Feb 01 09:59:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/142970444' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 09:59:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/142970444' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 09:59:58 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "snap_name": "967f54ee-5f61-45c6-877b-9621e93b6257_d3fd438f-2f2b-4121-930e-8bd318b9b3ac", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:58 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "snap_name": "967f54ee-5f61-45c6-877b-9621e93b6257", "force": true, "format": "json"}]: dispatch
Feb 01 09:59:58 np0005604215.localdomain ceph-mon[298604]: pgmap v459: 177 pgs: 177 active+clean; 193 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 3.7 MiB/s rd, 29 KiB/s wr, 370 op/s
Feb 01 09:59:59 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e229 e229: 6 total, 6 up, 6 in
Feb 01 10:00:00 np0005604215.localdomain podman[236852]: time="2026-02-01T10:00:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:00:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:00:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:00:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:00:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18305 "" "Go-http-client/1.1"
Feb 01 10:00:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e229 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v461: 177 pgs: 177 active+clean; 193 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 15 KiB/s wr, 171 op/s
Feb 01 10:00:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fc8c8d47-ce44-484d-a6aa-20ee79341f8c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:fc8c8d47-ce44-484d-a6aa-20ee79341f8c, vol_name:cephfs) < ""
Feb 01 10:00:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fc8c8d47-ce44-484d-a6aa-20ee79341f8c/.meta.tmp'
Feb 01 10:00:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fc8c8d47-ce44-484d-a6aa-20ee79341f8c/.meta.tmp' to config b'/volumes/_nogroup/fc8c8d47-ce44-484d-a6aa-20ee79341f8c/.meta'
Feb 01 10:00:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:fc8c8d47-ce44-484d-a6aa-20ee79341f8c, vol_name:cephfs) < ""
Feb 01 10:00:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fc8c8d47-ce44-484d-a6aa-20ee79341f8c", "format": "json"}]: dispatch
Feb 01 10:00:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fc8c8d47-ce44-484d-a6aa-20ee79341f8c, vol_name:cephfs) < ""
Feb 01 10:00:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fc8c8d47-ce44-484d-a6aa-20ee79341f8c, vol_name:cephfs) < ""
Feb 01 10:00:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e230 e230: 6 total, 6 up, 6 in
Feb 01 10:00:00 np0005604215.localdomain ceph-mon[298604]: osdmap e229: 6 total, 6 up, 6 in
Feb 01 10:00:00 np0005604215.localdomain ceph-mon[298604]: overall HEALTH_WARN 1 stray daemon(s) not managed by cephadm; 1 stray host(s) with 1 daemon(s) not managed by cephadm
Feb 01 10:00:00 np0005604215.localdomain ceph-mon[298604]: pgmap v461: 177 pgs: 177 active+clean; 193 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 3.5 MiB/s rd, 15 KiB/s wr, 171 op/s
Feb 01 10:00:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:01.095 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:00:01 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:00:01.448 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 10:00:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:01.578 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:00:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:00:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:00:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:00:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:00:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:00:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:01 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2444475700' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:01 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2444475700' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "format": "json"}]: dispatch
Feb 01 10:00:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:01 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:01.788+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '6010311e-11a6-4c95-b3ff-674156fa7f2b' of type subvolume
Feb 01 10:00:01 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '6010311e-11a6-4c95-b3ff-674156fa7f2b' of type subvolume
Feb 01 10:00:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 10:00:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/6010311e-11a6-4c95-b3ff-674156fa7f2b'' moved to trashcan
Feb 01 10:00:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:6010311e-11a6-4c95-b3ff-674156fa7f2b, vol_name:cephfs) < ""
Feb 01 10:00:01 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fc8c8d47-ce44-484d-a6aa-20ee79341f8c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:01 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fc8c8d47-ce44-484d-a6aa-20ee79341f8c", "format": "json"}]: dispatch
Feb 01 10:00:01 np0005604215.localdomain ceph-mon[298604]: osdmap e230: 6 total, 6 up, 6 in
Feb 01 10:00:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2444475700' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2444475700' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:02.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:00:02 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:02 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1007765160' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:02 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:02 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1007765160' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v463: 177 pgs: 177 active+clean; 193 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 3.8 MiB/s rd, 17 KiB/s wr, 186 op/s
Feb 01 10:00:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:02.690 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:02 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "format": "json"}]: dispatch
Feb 01 10:00:02 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6010311e-11a6-4c95-b3ff-674156fa7f2b", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:02 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1007765160' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:02 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1007765160' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:02 np0005604215.localdomain ceph-mon[298604]: pgmap v463: 177 pgs: 177 active+clean; 193 MiB data, 1008 MiB used, 41 GiB / 42 GiB avail; 3.8 MiB/s rd, 17 KiB/s wr, 186 op/s
Feb 01 10:00:02 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1636548431' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:00:03 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/246736743' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:00:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a75eaef7-5948-4b0c-93f4-48367ba74a09", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:04 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:a75eaef7-5948-4b0c-93f4-48367ba74a09, vol_name:cephfs) < ""
Feb 01 10:00:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v464: 177 pgs: 177 active+clean; 193 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 108 KiB/s rd, 3.3 MiB/s wr, 158 op/s
Feb 01 10:00:04 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/a75eaef7-5948-4b0c-93f4-48367ba74a09/.meta.tmp'
Feb 01 10:00:04 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/a75eaef7-5948-4b0c-93f4-48367ba74a09/.meta.tmp' to config b'/volumes/_nogroup/a75eaef7-5948-4b0c-93f4-48367ba74a09/.meta'
Feb 01 10:00:04 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:a75eaef7-5948-4b0c-93f4-48367ba74a09, vol_name:cephfs) < ""
Feb 01 10:00:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a75eaef7-5948-4b0c-93f4-48367ba74a09", "format": "json"}]: dispatch
Feb 01 10:00:04 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a75eaef7-5948-4b0c-93f4-48367ba74a09, vol_name:cephfs) < ""
Feb 01 10:00:04 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a75eaef7-5948-4b0c-93f4-48367ba74a09, vol_name:cephfs) < ""
Feb 01 10:00:04 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a75eaef7-5948-4b0c-93f4-48367ba74a09", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:04 np0005604215.localdomain ceph-mon[298604]: pgmap v464: 177 pgs: 177 active+clean; 193 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 108 KiB/s rd, 3.3 MiB/s wr, 158 op/s
Feb 01 10:00:04 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a75eaef7-5948-4b0c-93f4-48367ba74a09", "format": "json"}]: dispatch
Feb 01 10:00:04 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e230 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v465: 177 pgs: 177 active+clean; 193 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 87 KiB/s rd, 2.7 MiB/s wr, 128 op/s
Feb 01 10:00:06 np0005604215.localdomain ceph-mon[298604]: pgmap v465: 177 pgs: 177 active+clean; 193 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 87 KiB/s rd, 2.7 MiB/s wr, 128 op/s
Feb 01 10:00:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:06.582 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:06 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:00:06 np0005604215.localdomain podman[315856]: 2026-02-01 10:00:06.827542693 +0000 UTC m=+0.084451263 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:00:06 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e231 e231: 6 total, 6 up, 6 in
Feb 01 10:00:06 np0005604215.localdomain podman[315856]: 2026-02-01 10:00:06.867853919 +0000 UTC m=+0.124762449 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:00:06 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:00:07 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:00:07Z|00248|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 01 10:00:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:07.694 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:07 np0005604215.localdomain ceph-mon[298604]: osdmap e231: 6 total, 6 up, 6 in
Feb 01 10:00:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a75eaef7-5948-4b0c-93f4-48367ba74a09", "format": "json"}]: dispatch
Feb 01 10:00:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:a75eaef7-5948-4b0c-93f4-48367ba74a09, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:a75eaef7-5948-4b0c-93f4-48367ba74a09, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:08 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'a75eaef7-5948-4b0c-93f4-48367ba74a09' of type subvolume
Feb 01 10:00:08 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:08.037+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'a75eaef7-5948-4b0c-93f4-48367ba74a09' of type subvolume
Feb 01 10:00:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a75eaef7-5948-4b0c-93f4-48367ba74a09", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a75eaef7-5948-4b0c-93f4-48367ba74a09, vol_name:cephfs) < ""
Feb 01 10:00:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/a75eaef7-5948-4b0c-93f4-48367ba74a09'' moved to trashcan
Feb 01 10:00:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a75eaef7-5948-4b0c-93f4-48367ba74a09, vol_name:cephfs) < ""
Feb 01 10:00:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v467: 177 pgs: 177 active+clean; 193 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 147 op/s
Feb 01 10:00:08 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a75eaef7-5948-4b0c-93f4-48367ba74a09", "format": "json"}]: dispatch
Feb 01 10:00:08 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a75eaef7-5948-4b0c-93f4-48367ba74a09", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:08 np0005604215.localdomain ceph-mon[298604]: pgmap v467: 177 pgs: 177 active+clean; 193 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 147 op/s
Feb 01 10:00:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:00:09 np0005604215.localdomain podman[315875]: 2026-02-01 10:00:09.867795745 +0000 UTC m=+0.083980959 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 10:00:09 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/264951471' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:09 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/264951471' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:09 np0005604215.localdomain podman[315875]: 2026-02-01 10:00:09.884693041 +0000 UTC m=+0.100878255 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:00:09 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:00:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v468: 177 pgs: 177 active+clean; 193 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 2.3 MiB/s wr, 125 op/s
Feb 01 10:00:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cf68d29d-a061-4145-ba2b-6bee3a2be2df", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:cf68d29d-a061-4145-ba2b-6bee3a2be2df, vol_name:cephfs) < ""
Feb 01 10:00:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/cf68d29d-a061-4145-ba2b-6bee3a2be2df/.meta.tmp'
Feb 01 10:00:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/cf68d29d-a061-4145-ba2b-6bee3a2be2df/.meta.tmp' to config b'/volumes/_nogroup/cf68d29d-a061-4145-ba2b-6bee3a2be2df/.meta'
Feb 01 10:00:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:cf68d29d-a061-4145-ba2b-6bee3a2be2df, vol_name:cephfs) < ""
Feb 01 10:00:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cf68d29d-a061-4145-ba2b-6bee3a2be2df", "format": "json"}]: dispatch
Feb 01 10:00:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:cf68d29d-a061-4145-ba2b-6bee3a2be2df, vol_name:cephfs) < ""
Feb 01 10:00:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:cf68d29d-a061-4145-ba2b-6bee3a2be2df, vol_name:cephfs) < ""
Feb 01 10:00:10 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/765974474' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:10 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/765974474' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:10 np0005604215.localdomain ceph-mon[298604]: pgmap v468: 177 pgs: 177 active+clean; 193 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.2 MiB/s rd, 2.3 MiB/s wr, 125 op/s
Feb 01 10:00:10 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:10 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2901407308' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:10 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2901407308' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fc8c8d47-ce44-484d-a6aa-20ee79341f8c", "format": "json"}]: dispatch
Feb 01 10:00:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:fc8c8d47-ce44-484d-a6aa-20ee79341f8c, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:fc8c8d47-ce44-484d-a6aa-20ee79341f8c, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:11 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:11.319+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'fc8c8d47-ce44-484d-a6aa-20ee79341f8c' of type subvolume
Feb 01 10:00:11 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'fc8c8d47-ce44-484d-a6aa-20ee79341f8c' of type subvolume
Feb 01 10:00:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fc8c8d47-ce44-484d-a6aa-20ee79341f8c", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fc8c8d47-ce44-484d-a6aa-20ee79341f8c, vol_name:cephfs) < ""
Feb 01 10:00:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/fc8c8d47-ce44-484d-a6aa-20ee79341f8c'' moved to trashcan
Feb 01 10:00:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fc8c8d47-ce44-484d-a6aa-20ee79341f8c, vol_name:cephfs) < ""
Feb 01 10:00:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:11.585 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:11 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cf68d29d-a061-4145-ba2b-6bee3a2be2df", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:11 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cf68d29d-a061-4145-ba2b-6bee3a2be2df", "format": "json"}]: dispatch
Feb 01 10:00:11 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fc8c8d47-ce44-484d-a6aa-20ee79341f8c", "format": "json"}]: dispatch
Feb 01 10:00:11 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fc8c8d47-ce44-484d-a6aa-20ee79341f8c", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:11 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/4179659550' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:11 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/4179659550' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2967673012' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2967673012' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v469: 177 pgs: 177 active+clean; 193 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.2 MiB/s wr, 117 op/s
Feb 01 10:00:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:12.730 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3236207941' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3236207941' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2967673012' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2967673012' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: pgmap v469: 177 pgs: 177 active+clean; 193 MiB data, 1009 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.2 MiB/s wr, 117 op/s
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3236207941' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:12 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3236207941' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:13 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6ef0dca5-087f-47f5-b456-3a93c05421f7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:13 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:6ef0dca5-087f-47f5-b456-3a93c05421f7, vol_name:cephfs) < ""
Feb 01 10:00:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v470: 177 pgs: 177 active+clean; 193 MiB data, 1010 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.2 MiB/s wr, 158 op/s
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/6ef0dca5-087f-47f5-b456-3a93c05421f7/.meta.tmp'
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/6ef0dca5-087f-47f5-b456-3a93c05421f7/.meta.tmp' to config b'/volumes/_nogroup/6ef0dca5-087f-47f5-b456-3a93c05421f7/.meta'
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:6ef0dca5-087f-47f5-b456-3a93c05421f7, vol_name:cephfs) < ""
Feb 01 10:00:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6ef0dca5-087f-47f5-b456-3a93c05421f7", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:15 np0005604215.localdomain ceph-mon[298604]: pgmap v470: 177 pgs: 177 active+clean; 193 MiB data, 1010 MiB used, 41 GiB / 42 GiB avail; 2.1 MiB/s rd, 2.2 MiB/s wr, 158 op/s
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6ef0dca5-087f-47f5-b456-3a93c05421f7", "format": "json"}]: dispatch
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:6ef0dca5-087f-47f5-b456-3a93c05421f7, vol_name:cephfs) < ""
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:6ef0dca5-087f-47f5-b456-3a93c05421f7, vol_name:cephfs) < ""
Feb 01 10:00:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e231 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/53f5c0b7-79c3-4730-936f-6925a39bf1db/.meta.tmp'
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/53f5c0b7-79c3-4730-936f-6925a39bf1db/.meta.tmp' to config b'/volumes/_nogroup/53f5c0b7-79c3-4730-936f-6925a39bf1db/.meta'
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "format": "json"}]: dispatch
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7f3137f1-669c-444c-94c7-6fef11988c8f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7f3137f1-669c-444c-94c7-6fef11988c8f, vol_name:cephfs) < ""
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7f3137f1-669c-444c-94c7-6fef11988c8f/.meta.tmp'
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7f3137f1-669c-444c-94c7-6fef11988c8f/.meta.tmp' to config b'/volumes/_nogroup/7f3137f1-669c-444c-94c7-6fef11988c8f/.meta'
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7f3137f1-669c-444c-94c7-6fef11988c8f, vol_name:cephfs) < ""
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7f3137f1-669c-444c-94c7-6fef11988c8f", "format": "json"}]: dispatch
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7f3137f1-669c-444c-94c7-6fef11988c8f, vol_name:cephfs) < ""
Feb 01 10:00:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7f3137f1-669c-444c-94c7-6fef11988c8f, vol_name:cephfs) < ""
Feb 01 10:00:16 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e232 e232: 6 total, 6 up, 6 in
Feb 01 10:00:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6ef0dca5-087f-47f5-b456-3a93c05421f7", "format": "json"}]: dispatch
Feb 01 10:00:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "format": "json"}]: dispatch
Feb 01 10:00:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7f3137f1-669c-444c-94c7-6fef11988c8f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7f3137f1-669c-444c-94c7-6fef11988c8f", "format": "json"}]: dispatch
Feb 01 10:00:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v472: 177 pgs: 177 active+clean; 193 MiB data, 1010 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 2.3 MiB/s wr, 166 op/s
Feb 01 10:00:16 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:16.629 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:17 np0005604215.localdomain ceph-mon[298604]: osdmap e232: 6 total, 6 up, 6 in
Feb 01 10:00:17 np0005604215.localdomain ceph-mon[298604]: pgmap v472: 177 pgs: 177 active+clean; 193 MiB data, 1010 MiB used, 41 GiB / 42 GiB avail; 2.3 MiB/s rd, 2.3 MiB/s wr, 166 op/s
Feb 01 10:00:17 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:17 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/701977625' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:17 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:17 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/701977625' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "snap_name": "066e8414-0f7f-4d94-8ce3-b9f1dd2fb571", "format": "json"}]: dispatch
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:066e8414-0f7f-4d94-8ce3-b9f1dd2fb571, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:066e8414-0f7f-4d94-8ce3-b9f1dd2fb571, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6ef0dca5-087f-47f5-b456-3a93c05421f7", "format": "json"}]: dispatch
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:6ef0dca5-087f-47f5-b456-3a93c05421f7, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:6ef0dca5-087f-47f5-b456-3a93c05421f7, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:17 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:17.612+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '6ef0dca5-087f-47f5-b456-3a93c05421f7' of type subvolume
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '6ef0dca5-087f-47f5-b456-3a93c05421f7' of type subvolume
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6ef0dca5-087f-47f5-b456-3a93c05421f7", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:6ef0dca5-087f-47f5-b456-3a93c05421f7, vol_name:cephfs) < ""
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/6ef0dca5-087f-47f5-b456-3a93c05421f7'' moved to trashcan
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:6ef0dca5-087f-47f5-b456-3a93c05421f7, vol_name:cephfs) < ""
Feb 01 10:00:17 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:17.781 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7f3137f1-669c-444c-94c7-6fef11988c8f", "format": "json"}]: dispatch
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:7f3137f1-669c-444c-94c7-6fef11988c8f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:7f3137f1-669c-444c-94c7-6fef11988c8f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:17 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:17.915+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7f3137f1-669c-444c-94c7-6fef11988c8f' of type subvolume
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7f3137f1-669c-444c-94c7-6fef11988c8f' of type subvolume
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7f3137f1-669c-444c-94c7-6fef11988c8f", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7f3137f1-669c-444c-94c7-6fef11988c8f, vol_name:cephfs) < ""
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/7f3137f1-669c-444c-94c7-6fef11988c8f'' moved to trashcan
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7f3137f1-669c-444c-94c7-6fef11988c8f, vol_name:cephfs) < ""
Feb 01 10:00:18 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/701977625' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:18 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/701977625' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:18 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "snap_name": "066e8414-0f7f-4d94-8ce3-b9f1dd2fb571", "format": "json"}]: dispatch
Feb 01 10:00:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v473: 177 pgs: 177 active+clean; 193 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 2.2 MiB/s wr, 188 op/s
Feb 01 10:00:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6ef0dca5-087f-47f5-b456-3a93c05421f7", "format": "json"}]: dispatch
Feb 01 10:00:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6ef0dca5-087f-47f5-b456-3a93c05421f7", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7f3137f1-669c-444c-94c7-6fef11988c8f", "format": "json"}]: dispatch
Feb 01 10:00:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7f3137f1-669c-444c-94c7-6fef11988c8f", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:19 np0005604215.localdomain ceph-mon[298604]: pgmap v473: 177 pgs: 177 active+clean; 193 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 128 KiB/s rd, 2.2 MiB/s wr, 188 op/s
Feb 01 10:00:19 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e233 e233: 6 total, 6 up, 6 in
Feb 01 10:00:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:20 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1482730681' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:20 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1482730681' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e234 e234: 6 total, 6 up, 6 in
Feb 01 10:00:20 np0005604215.localdomain ceph-mon[298604]: osdmap e233: 6 total, 6 up, 6 in
Feb 01 10:00:20 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1482730681' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:20 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1482730681' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v476: 177 pgs: 177 active+clean; 193 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 33 KiB/s wr, 74 op/s
Feb 01 10:00:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f7671ca1-51ec-4e8e-b389-8a8c50c13461", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f7671ca1-51ec-4e8e-b389-8a8c50c13461, vol_name:cephfs) < ""
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f7671ca1-51ec-4e8e-b389-8a8c50c13461/.meta.tmp'
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f7671ca1-51ec-4e8e-b389-8a8c50c13461/.meta.tmp' to config b'/volumes/_nogroup/f7671ca1-51ec-4e8e-b389-8a8c50c13461/.meta'
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f7671ca1-51ec-4e8e-b389-8a8c50c13461, vol_name:cephfs) < ""
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f7671ca1-51ec-4e8e-b389-8a8c50c13461", "format": "json"}]: dispatch
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f7671ca1-51ec-4e8e-b389-8a8c50c13461, vol_name:cephfs) < ""
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f7671ca1-51ec-4e8e-b389-8a8c50c13461, vol_name:cephfs) < ""
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "snap_name": "066e8414-0f7f-4d94-8ce3-b9f1dd2fb571_3455eb51-47e6-416a-8a60-b0a5234ab5f1", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:066e8414-0f7f-4d94-8ce3-b9f1dd2fb571_3455eb51-47e6-416a-8a60-b0a5234ab5f1, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:21 np0005604215.localdomain ceph-mon[298604]: osdmap e234: 6 total, 6 up, 6 in
Feb 01 10:00:21 np0005604215.localdomain ceph-mon[298604]: pgmap v476: 177 pgs: 177 active+clean; 193 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 47 KiB/s rd, 33 KiB/s wr, 74 op/s
Feb 01 10:00:21 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/53f5c0b7-79c3-4730-936f-6925a39bf1db/.meta.tmp'
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/53f5c0b7-79c3-4730-936f-6925a39bf1db/.meta.tmp' to config b'/volumes/_nogroup/53f5c0b7-79c3-4730-936f-6925a39bf1db/.meta'
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:066e8414-0f7f-4d94-8ce3-b9f1dd2fb571_3455eb51-47e6-416a-8a60-b0a5234ab5f1, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "snap_name": "066e8414-0f7f-4d94-8ce3-b9f1dd2fb571", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:066e8414-0f7f-4d94-8ce3-b9f1dd2fb571, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/53f5c0b7-79c3-4730-936f-6925a39bf1db/.meta.tmp'
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/53f5c0b7-79c3-4730-936f-6925a39bf1db/.meta.tmp' to config b'/volumes/_nogroup/53f5c0b7-79c3-4730-936f-6925a39bf1db/.meta'
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:066e8414-0f7f-4d94-8ce3-b9f1dd2fb571, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_10:00:21
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['images', 'backups', '.mgr', 'manila_data', 'volumes', 'vms', 'manila_metadata']
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:00:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3695598619' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3695598619' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:21.676 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014871085752838264 of space, bias 1.0, pg target 0.29692601219833736 quantized to 32 (current 32)
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 1.0905220547180346e-06 of space, bias 1.0, pg target 0.00021701388888888888 quantized to 32 (current 32)
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.00011832164293690675 of space, bias 4.0, pg target 0.09418402777777778 quantized to 16 (current 16)
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:00:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:00:22 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f7671ca1-51ec-4e8e-b389-8a8c50c13461", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:22 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f7671ca1-51ec-4e8e-b389-8a8c50c13461", "format": "json"}]: dispatch
Feb 01 10:00:22 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "snap_name": "066e8414-0f7f-4d94-8ce3-b9f1dd2fb571_3455eb51-47e6-416a-8a60-b0a5234ab5f1", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:22 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "snap_name": "066e8414-0f7f-4d94-8ce3-b9f1dd2fb571", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:22 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3695598619' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:22 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3695598619' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v477: 177 pgs: 177 active+clean; 193 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 32 KiB/s wr, 70 op/s
Feb 01 10:00:22 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:22 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/233384275' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:22 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:22 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/233384275' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:22 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:22.821 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4276124241' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/4276124241' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: pgmap v477: 177 pgs: 177 active+clean; 193 MiB data, 1014 MiB used, 41 GiB / 42 GiB avail; 45 KiB/s rd, 32 KiB/s wr, 70 op/s
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/233384275' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/233384275' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3520847683' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3520847683' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/4276124241' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/4276124241' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1249637887' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:23 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1249637887' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:24 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1249637887' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:24 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1249637887' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v478: 177 pgs: 177 active+clean; 194 MiB data, 1019 MiB used, 41 GiB / 42 GiB avail; 124 KiB/s rd, 59 KiB/s wr, 177 op/s
Feb 01 10:00:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "format": "json"}]: dispatch
Feb 01 10:00:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:24 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '53f5c0b7-79c3-4730-936f-6925a39bf1db' of type subvolume
Feb 01 10:00:24 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:24.374+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '53f5c0b7-79c3-4730-936f-6925a39bf1db' of type subvolume
Feb 01 10:00:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/53f5c0b7-79c3-4730-936f-6925a39bf1db'' moved to trashcan
Feb 01 10:00:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:53f5c0b7-79c3-4730-936f-6925a39bf1db, vol_name:cephfs) < ""
Feb 01 10:00:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:25 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2046523029' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:25 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2046523029' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f7671ca1-51ec-4e8e-b389-8a8c50c13461", "format": "json"}]: dispatch
Feb 01 10:00:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:f7671ca1-51ec-4e8e-b389-8a8c50c13461, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:f7671ca1-51ec-4e8e-b389-8a8c50c13461, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:25 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f7671ca1-51ec-4e8e-b389-8a8c50c13461' of type subvolume
Feb 01 10:00:25 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:25.104+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f7671ca1-51ec-4e8e-b389-8a8c50c13461' of type subvolume
Feb 01 10:00:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f7671ca1-51ec-4e8e-b389-8a8c50c13461", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f7671ca1-51ec-4e8e-b389-8a8c50c13461, vol_name:cephfs) < ""
Feb 01 10:00:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/f7671ca1-51ec-4e8e-b389-8a8c50c13461'' moved to trashcan
Feb 01 10:00:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f7671ca1-51ec-4e8e-b389-8a8c50c13461, vol_name:cephfs) < ""
Feb 01 10:00:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e234 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:25 np0005604215.localdomain ceph-mon[298604]: pgmap v478: 177 pgs: 177 active+clean; 194 MiB data, 1019 MiB used, 41 GiB / 42 GiB avail; 124 KiB/s rd, 59 KiB/s wr, 177 op/s
Feb 01 10:00:25 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "format": "json"}]: dispatch
Feb 01 10:00:25 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "53f5c0b7-79c3-4730-936f-6925a39bf1db", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:25 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2046523029' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:25 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2046523029' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:26 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f7671ca1-51ec-4e8e-b389-8a8c50c13461", "format": "json"}]: dispatch
Feb 01 10:00:26 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f7671ca1-51ec-4e8e-b389-8a8c50c13461", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v479: 177 pgs: 177 active+clean; 194 MiB data, 1019 MiB used, 41 GiB / 42 GiB avail; 88 KiB/s rd, 34 KiB/s wr, 121 op/s
Feb 01 10:00:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:26.715 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:00:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:00:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:00:26 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:00:26 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e235 e235: 6 total, 6 up, 6 in
Feb 01 10:00:26 np0005604215.localdomain podman[315898]: 2026-02-01 10:00:26.879414352 +0000 UTC m=+0.092952618 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-type=git, distribution-scope=public, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter)
Feb 01 10:00:26 np0005604215.localdomain systemd[1]: tmp-crun.fIIOXS.mount: Deactivated successfully.
Feb 01 10:00:26 np0005604215.localdomain podman[315906]: 2026-02-01 10:00:26.914999472 +0000 UTC m=+0.117162693 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 10:00:26 np0005604215.localdomain podman[315898]: 2026-02-01 10:00:26.922545167 +0000 UTC m=+0.136083352 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, version=9.7, vendor=Red Hat, Inc., io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, release=1769056855, maintainer=Red Hat, Inc.)
Feb 01 10:00:26 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:00:26 np0005604215.localdomain podman[315900]: 2026-02-01 10:00:26.979174532 +0000 UTC m=+0.185143001 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible)
Feb 01 10:00:27 np0005604215.localdomain podman[315900]: 2026-02-01 10:00:27.02146142 +0000 UTC m=+0.227429909 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 10:00:27 np0005604215.localdomain podman[315899]: 2026-02-01 10:00:27.030239613 +0000 UTC m=+0.238831084 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Feb 01 10:00:27 np0005604215.localdomain podman[315899]: 2026-02-01 10:00:27.034916049 +0000 UTC m=+0.243507540 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Feb 01 10:00:27 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:00:27 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:00:27 np0005604215.localdomain podman[315906]: 2026-02-01 10:00:27.053519989 +0000 UTC m=+0.255683270 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 10:00:27 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:00:27 np0005604215.localdomain ceph-mon[298604]: pgmap v479: 177 pgs: 177 active+clean; 194 MiB data, 1019 MiB used, 41 GiB / 42 GiB avail; 88 KiB/s rd, 34 KiB/s wr, 121 op/s
Feb 01 10:00:27 np0005604215.localdomain ceph-mon[298604]: osdmap e235: 6 total, 6 up, 6 in
Feb 01 10:00:27 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:27 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:27 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7d49ea05-c26a-4630-aaf8-e7e0481c4193/.meta.tmp'
Feb 01 10:00:27 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7d49ea05-c26a-4630-aaf8-e7e0481c4193/.meta.tmp' to config b'/volumes/_nogroup/7d49ea05-c26a-4630-aaf8-e7e0481c4193/.meta'
Feb 01 10:00:27 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:27 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "format": "json"}]: dispatch
Feb 01 10:00:27 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:27 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:27.864 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:27 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e236 e236: 6 total, 6 up, 6 in
Feb 01 10:00:28 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:28 np0005604215.localdomain ceph-mon[298604]: osdmap e236: 6 total, 6 up, 6 in
Feb 01 10:00:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v482: 177 pgs: 177 active+clean; 194 MiB data, 1024 MiB used, 41 GiB / 42 GiB avail; 136 KiB/s rd, 58 KiB/s wr, 190 op/s
Feb 01 10:00:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c7efa303-bbfc-4c39-877e-b2f35a82b5a0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:c7efa303-bbfc-4c39-877e-b2f35a82b5a0, vol_name:cephfs) < ""
Feb 01 10:00:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c7efa303-bbfc-4c39-877e-b2f35a82b5a0/.meta.tmp'
Feb 01 10:00:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c7efa303-bbfc-4c39-877e-b2f35a82b5a0/.meta.tmp' to config b'/volumes/_nogroup/c7efa303-bbfc-4c39-877e-b2f35a82b5a0/.meta'
Feb 01 10:00:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:c7efa303-bbfc-4c39-877e-b2f35a82b5a0, vol_name:cephfs) < ""
Feb 01 10:00:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c7efa303-bbfc-4c39-877e-b2f35a82b5a0", "format": "json"}]: dispatch
Feb 01 10:00:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:c7efa303-bbfc-4c39-877e-b2f35a82b5a0, vol_name:cephfs) < ""
Feb 01 10:00:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:c7efa303-bbfc-4c39-877e-b2f35a82b5a0, vol_name:cephfs) < ""
Feb 01 10:00:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "format": "json"}]: dispatch
Feb 01 10:00:29 np0005604215.localdomain ceph-mon[298604]: pgmap v482: 177 pgs: 177 active+clean; 194 MiB data, 1024 MiB used, 41 GiB / 42 GiB avail; 136 KiB/s rd, 58 KiB/s wr, 190 op/s
Feb 01 10:00:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c7efa303-bbfc-4c39-877e-b2f35a82b5a0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:30 np0005604215.localdomain podman[236852]: time="2026-02-01T10:00:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:00:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:00:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:00:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:00:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18296 "" "Go-http-client/1.1"
Feb 01 10:00:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e236 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:00:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c7efa303-bbfc-4c39-877e-b2f35a82b5a0", "format": "json"}]: dispatch
Feb 01 10:00:30 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/184422499' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:30 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/184422499' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v483: 177 pgs: 177 active+clean; 194 MiB data, 1024 MiB used, 41 GiB / 42 GiB avail; 136 KiB/s rd, 58 KiB/s wr, 190 op/s
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "format": "json"}]: dispatch
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "snap_name": "d633f359-53f0-46c3-94ff-90f64f1e4469", "format": "json"}]: dispatch
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:d633f359-53f0-46c3-94ff-90f64f1e4469, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:d633f359-53f0-46c3-94ff-90f64f1e4469, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:31 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:31 np0005604215.localdomain ceph-mon[298604]: pgmap v483: 177 pgs: 177 active+clean; 194 MiB data, 1024 MiB used, 41 GiB / 42 GiB avail; 136 KiB/s rd, 58 KiB/s wr, 190 op/s
Feb 01 10:00:31 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:31 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3625315047' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:31 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3625315047' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:00:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:00:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:00:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/.meta.tmp'
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/.meta.tmp' to config b'/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/.meta'
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "format": "json"}]: dispatch
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:00:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:31.717 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c7efa303-bbfc-4c39-877e-b2f35a82b5a0", "format": "json"}]: dispatch
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:c7efa303-bbfc-4c39-877e-b2f35a82b5a0, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:c7efa303-bbfc-4c39-877e-b2f35a82b5a0, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'c7efa303-bbfc-4c39-877e-b2f35a82b5a0' of type subvolume
Feb 01 10:00:31 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:31.836+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'c7efa303-bbfc-4c39-877e-b2f35a82b5a0' of type subvolume
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c7efa303-bbfc-4c39-877e-b2f35a82b5a0", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:c7efa303-bbfc-4c39-877e-b2f35a82b5a0, vol_name:cephfs) < ""
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/c7efa303-bbfc-4c39-877e-b2f35a82b5a0'' moved to trashcan
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:c7efa303-bbfc-4c39-877e-b2f35a82b5a0, vol_name:cephfs) < ""
Feb 01 10:00:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v484: 177 pgs: 177 active+clean; 194 MiB data, 1024 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 24 KiB/s wr, 68 op/s
Feb 01 10:00:32 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "format": "json"}]: dispatch
Feb 01 10:00:32 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "snap_name": "d633f359-53f0-46c3-94ff-90f64f1e4469", "format": "json"}]: dispatch
Feb 01 10:00:32 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:32.901 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:33 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:33 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "format": "json"}]: dispatch
Feb 01 10:00:33 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c7efa303-bbfc-4c39-877e-b2f35a82b5a0", "format": "json"}]: dispatch
Feb 01 10:00:33 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c7efa303-bbfc-4c39-877e-b2f35a82b5a0", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:33 np0005604215.localdomain ceph-mon[298604]: pgmap v484: 177 pgs: 177 active+clean; 194 MiB data, 1024 MiB used, 41 GiB / 42 GiB avail; 48 KiB/s rd, 24 KiB/s wr, 68 op/s
Feb 01 10:00:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e237 e237: 6 total, 6 up, 6 in
Feb 01 10:00:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4ba7af79-0f85-4db2-8701-a9110d019002", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4ba7af79-0f85-4db2-8701-a9110d019002, vol_name:cephfs) < ""
Feb 01 10:00:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/4ba7af79-0f85-4db2-8701-a9110d019002/.meta.tmp'
Feb 01 10:00:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/4ba7af79-0f85-4db2-8701-a9110d019002/.meta.tmp' to config b'/volumes/_nogroup/4ba7af79-0f85-4db2-8701-a9110d019002/.meta'
Feb 01 10:00:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4ba7af79-0f85-4db2-8701-a9110d019002, vol_name:cephfs) < ""
Feb 01 10:00:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4ba7af79-0f85-4db2-8701-a9110d019002", "format": "json"}]: dispatch
Feb 01 10:00:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4ba7af79-0f85-4db2-8701-a9110d019002, vol_name:cephfs) < ""
Feb 01 10:00:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4ba7af79-0f85-4db2-8701-a9110d019002, vol_name:cephfs) < ""
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v486: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 63 KiB/s wr, 124 op/s
Feb 01 10:00:34 np0005604215.localdomain ceph-mon[298604]: osdmap e237: 6 total, 6 up, 6 in
Feb 01 10:00:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "snap_name": "d633f359-53f0-46c3-94ff-90f64f1e4469_14a1ff06-8b18-4cb9-86da-d23110370acd", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:d633f359-53f0-46c3-94ff-90f64f1e4469_14a1ff06-8b18-4cb9-86da-d23110370acd, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7d49ea05-c26a-4630-aaf8-e7e0481c4193/.meta.tmp'
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7d49ea05-c26a-4630-aaf8-e7e0481c4193/.meta.tmp' to config b'/volumes/_nogroup/7d49ea05-c26a-4630-aaf8-e7e0481c4193/.meta'
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:d633f359-53f0-46c3-94ff-90f64f1e4469_14a1ff06-8b18-4cb9-86da-d23110370acd, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "snap_name": "d633f359-53f0-46c3-94ff-90f64f1e4469", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:d633f359-53f0-46c3-94ff-90f64f1e4469, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7d49ea05-c26a-4630-aaf8-e7e0481c4193/.meta.tmp'
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7d49ea05-c26a-4630-aaf8-e7e0481c4193/.meta.tmp' to config b'/volumes/_nogroup/7d49ea05-c26a-4630-aaf8-e7e0481c4193/.meta'
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:d633f359-53f0-46c3-94ff-90f64f1e4469, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, vol_name:cephfs) < ""
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/.meta.tmp'
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/.meta.tmp' to config b'/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/.meta'
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, vol_name:cephfs) < ""
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "format": "json"}]: dispatch
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, vol_name:cephfs) < ""
Feb 01 10:00:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, vol_name:cephfs) < ""
Feb 01 10:00:35 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "67f16e44-b42c-4fb2-9bca-83ca589fff39", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:67f16e44-b42c-4fb2-9bca-83ca589fff39, vol_name:cephfs) < ""
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e237 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/67f16e44-b42c-4fb2-9bca-83ca589fff39/.meta.tmp'
Feb 01 10:00:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/67f16e44-b42c-4fb2-9bca-83ca589fff39/.meta.tmp' to config b'/volumes/_nogroup/67f16e44-b42c-4fb2-9bca-83ca589fff39/.meta'
Feb 01 10:00:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:67f16e44-b42c-4fb2-9bca-83ca589fff39, vol_name:cephfs) < ""
Feb 01 10:00:35 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "67f16e44-b42c-4fb2-9bca-83ca589fff39", "format": "json"}]: dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:67f16e44-b42c-4fb2-9bca-83ca589fff39, vol_name:cephfs) < ""
Feb 01 10:00:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:67f16e44-b42c-4fb2-9bca-83ca589fff39, vol_name:cephfs) < ""
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4ba7af79-0f85-4db2-8701-a9110d019002", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4ba7af79-0f85-4db2-8701-a9110d019002", "format": "json"}]: dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: pgmap v486: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 85 KiB/s rd, 63 KiB/s wr, 124 op/s
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "snap_name": "d633f359-53f0-46c3-94ff-90f64f1e4469_14a1ff06-8b18-4cb9-86da-d23110370acd", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "snap_name": "d633f359-53f0-46c3-94ff-90f64f1e4469", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1571698682' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1571698682' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "format": "json"}]: dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "67f16e44-b42c-4fb2-9bca-83ca589fff39", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "67f16e44-b42c-4fb2-9bca-83ca589fff39", "format": "json"}]: dispatch
Feb 01 10:00:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v487: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 34 KiB/s wr, 45 op/s
Feb 01 10:00:36 np0005604215.localdomain ceph-mon[298604]: pgmap v487: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 30 KiB/s rd, 34 KiB/s wr, 45 op/s
Feb 01 10:00:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:36.744 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "4ba7af79-0f85-4db2-8701-a9110d019002", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 01 10:00:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:4ba7af79-0f85-4db2-8701-a9110d019002, vol_name:cephfs) < ""
Feb 01 10:00:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e238 e238: 6 total, 6 up, 6 in
Feb 01 10:00:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:4ba7af79-0f85-4db2-8701-a9110d019002, vol_name:cephfs) < ""
Feb 01 10:00:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:36 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2729393937' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:36 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2729393937' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "format": "json"}]: dispatch
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7d49ea05-c26a-4630-aaf8-e7e0481c4193' of type subvolume
Feb 01 10:00:37 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:37.692+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7d49ea05-c26a-4630-aaf8-e7e0481c4193' of type subvolume
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/7d49ea05-c26a-4630-aaf8-e7e0481c4193'' moved to trashcan
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7d49ea05-c26a-4630-aaf8-e7e0481c4193, vol_name:cephfs) < ""
Feb 01 10:00:37 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:00:37 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e239 e239: 6 total, 6 up, 6 in
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:37 np0005604215.localdomain podman[315983]: 2026-02-01 10:00:37.878516937 +0000 UTC m=+0.089661385 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:37 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "4ba7af79-0f85-4db2-8701-a9110d019002", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 01 10:00:37 np0005604215.localdomain ceph-mon[298604]: osdmap e238: 6 total, 6 up, 6 in
Feb 01 10:00:37 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2729393937' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:37 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2729393937' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:37 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1623823550' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:37 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1623823550' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:37 np0005604215.localdomain podman[315983]: 2026-02-01 10:00:37.914650813 +0000 UTC m=+0.125795191 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 10:00:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:37.949 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:37 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/d3128e87-da55-4740-ae37-00f9b18ac824/.meta.tmp'
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/d3128e87-da55-4740-ae37-00f9b18ac824/.meta.tmp' to config b'/volumes/_nogroup/d3128e87-da55-4740-ae37-00f9b18ac824/.meta'
Feb 01 10:00:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "format": "json"}]: dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v490: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 120 KiB/s rd, 96 KiB/s wr, 176 op/s
Feb 01 10:00:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "auth_id": "Joe", "tenant_id": "d32ed6e558674454a1648ebe57d1a805", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:Joe, format:json, prefix:fs subvolume authorize, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, tenant_id:d32ed6e558674454a1648ebe57d1a805, vol_name:cephfs) < ""
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.Joe", "format": "json"} v 0)
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID Joe with tenant d32ed6e558674454a1648ebe57d1a805
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e", "osd", "allow rw pool=manila_data namespace=fsvolumens_ef5904d0-6de5-446a-a091-edb3ad7abb31", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e", "osd", "allow rw pool=manila_data namespace=fsvolumens_ef5904d0-6de5-446a-a091-edb3ad7abb31", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:Joe, format:json, prefix:fs subvolume authorize, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, tenant_id:d32ed6e558674454a1648ebe57d1a805, vol_name:cephfs) < ""
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "format": "json"}]: dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7d49ea05-c26a-4630-aaf8-e7e0481c4193", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: osdmap e239: 6 total, 6 up, 6 in
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "format": "json"}]: dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: pgmap v490: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 120 KiB/s rd, 96 KiB/s wr, 176 op/s
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e", "osd", "allow rw pool=manila_data namespace=fsvolumens_ef5904d0-6de5-446a-a091-edb3ad7abb31", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e", "osd", "allow rw pool=manila_data namespace=fsvolumens_ef5904d0-6de5-446a-a091-edb3ad7abb31", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:00:38 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.Joe", "caps": ["mds", "allow rw path=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e", "osd", "allow rw pool=manila_data namespace=fsvolumens_ef5904d0-6de5-446a-a091-edb3ad7abb31", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:00:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e240 e240: 6 total, 6 up, 6 in
Feb 01 10:00:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "67f16e44-b42c-4fb2-9bca-83ca589fff39", "format": "json"}]: dispatch
Feb 01 10:00:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:67f16e44-b42c-4fb2-9bca-83ca589fff39, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:67f16e44-b42c-4fb2-9bca-83ca589fff39, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:39 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:39.667+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '67f16e44-b42c-4fb2-9bca-83ca589fff39' of type subvolume
Feb 01 10:00:39 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '67f16e44-b42c-4fb2-9bca-83ca589fff39' of type subvolume
Feb 01 10:00:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "67f16e44-b42c-4fb2-9bca-83ca589fff39", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:67f16e44-b42c-4fb2-9bca-83ca589fff39, vol_name:cephfs) < ""
Feb 01 10:00:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/67f16e44-b42c-4fb2-9bca-83ca589fff39'' moved to trashcan
Feb 01 10:00:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:67f16e44-b42c-4fb2-9bca-83ca589fff39, vol_name:cephfs) < ""
Feb 01 10:00:39 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "auth_id": "Joe", "tenant_id": "d32ed6e558674454a1648ebe57d1a805", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:00:39 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2127634127' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:39 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2127634127' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:39 np0005604215.localdomain ceph-mon[298604]: osdmap e240: 6 total, 6 up, 6 in
Feb 01 10:00:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4ba7af79-0f85-4db2-8701-a9110d019002", "format": "json"}]: dispatch
Feb 01 10:00:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:4ba7af79-0f85-4db2-8701-a9110d019002, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:4ba7af79-0f85-4db2-8701-a9110d019002, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:40 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:40.155+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4ba7af79-0f85-4db2-8701-a9110d019002' of type subvolume
Feb 01 10:00:40 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4ba7af79-0f85-4db2-8701-a9110d019002' of type subvolume
Feb 01 10:00:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4ba7af79-0f85-4db2-8701-a9110d019002", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4ba7af79-0f85-4db2-8701-a9110d019002, vol_name:cephfs) < ""
Feb 01 10:00:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/4ba7af79-0f85-4db2-8701-a9110d019002'' moved to trashcan
Feb 01 10:00:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4ba7af79-0f85-4db2-8701-a9110d019002, vol_name:cephfs) < ""
Feb 01 10:00:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v492: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 78 KiB/s rd, 48 KiB/s wr, 111 op/s
Feb 01 10:00:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:00:40 np0005604215.localdomain podman[316002]: 2026-02-01 10:00:40.874058173 +0000 UTC m=+0.085133564 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 10:00:40 np0005604215.localdomain podman[316002]: 2026-02-01 10:00:40.887656507 +0000 UTC m=+0.098731898 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 10:00:40 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:00:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "67f16e44-b42c-4fb2-9bca-83ca589fff39", "format": "json"}]: dispatch
Feb 01 10:00:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "67f16e44-b42c-4fb2-9bca-83ca589fff39", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4ba7af79-0f85-4db2-8701-a9110d019002", "format": "json"}]: dispatch
Feb 01 10:00:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4ba7af79-0f85-4db2-8701-a9110d019002", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:40 np0005604215.localdomain ceph-mon[298604]: pgmap v492: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 78 KiB/s rd, 48 KiB/s wr, 111 op/s
Feb 01 10:00:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "snap_name": "814fa479-ec06-41a9-8fda-1c2cd3e9cb9c", "format": "json"}]: dispatch
Feb 01 10:00:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:814fa479-ec06-41a9-8fda-1c2cd3e9cb9c, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:814fa479-ec06-41a9-8fda-1c2cd3e9cb9c, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/.meta.tmp'
Feb 01 10:00:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/.meta.tmp' to config b'/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/.meta'
Feb 01 10:00:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "format": "json"}]: dispatch
Feb 01 10:00:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:00:41.776 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:00:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:00:41.777 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:00:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:00:41.777 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:00:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:41.788 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "snap_name": "814fa479-ec06-41a9-8fda-1c2cd3e9cb9c", "format": "json"}]: dispatch
Feb 01 10:00:41 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3458269461' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:41 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3458269461' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v493: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 78 KiB/s rd, 48 KiB/s wr, 111 op/s
Feb 01 10:00:42 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:42 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "format": "json"}]: dispatch
Feb 01 10:00:42 np0005604215.localdomain ceph-mon[298604]: pgmap v493: 177 pgs: 177 active+clean; 194 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 78 KiB/s rd, 48 KiB/s wr, 111 op/s
Feb 01 10:00:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:42.994 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:43 np0005604215.localdomain sudo[316024]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 10:00:43 np0005604215.localdomain sudo[316024]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:00:43 np0005604215.localdomain sudo[316024]: pam_unix(sudo:session): session closed for user root
Feb 01 10:00:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "bcbe03f2-a495-4378-a713-02be779827da", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:bcbe03f2-a495-4378-a713-02be779827da, vol_name:cephfs) < ""
Feb 01 10:00:43 np0005604215.localdomain sudo[316042]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 10:00:43 np0005604215.localdomain sudo[316042]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:00:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/bcbe03f2-a495-4378-a713-02be779827da/.meta.tmp'
Feb 01 10:00:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/bcbe03f2-a495-4378-a713-02be779827da/.meta.tmp' to config b'/volumes/_nogroup/bcbe03f2-a495-4378-a713-02be779827da/.meta'
Feb 01 10:00:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:bcbe03f2-a495-4378-a713-02be779827da, vol_name:cephfs) < ""
Feb 01 10:00:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "bcbe03f2-a495-4378-a713-02be779827da", "format": "json"}]: dispatch
Feb 01 10:00:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:bcbe03f2-a495-4378-a713-02be779827da, vol_name:cephfs) < ""
Feb 01 10:00:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:bcbe03f2-a495-4378-a713-02be779827da, vol_name:cephfs) < ""
Feb 01 10:00:43 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:44 np0005604215.localdomain sudo[316042]: pam_unix(sudo:session): session closed for user root
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7740003d-21db-4610-b8f3-9babed626268", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7740003d-21db-4610-b8f3-9babed626268, vol_name:cephfs) < ""
Feb 01 10:00:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 10:00:44 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:00:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 10:00:44 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:00:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7740003d-21db-4610-b8f3-9babed626268/.meta.tmp'
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7740003d-21db-4610-b8f3-9babed626268/.meta.tmp' to config b'/volumes/_nogroup/7740003d-21db-4610-b8f3-9babed626268/.meta'
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7740003d-21db-4610-b8f3-9babed626268, vol_name:cephfs) < ""
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7740003d-21db-4610-b8f3-9babed626268", "format": "json"}]: dispatch
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7740003d-21db-4610-b8f3-9babed626268, vol_name:cephfs) < ""
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev e70c5903-4f05-4c3b-aada-7b3246cabb7b (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev e70c5903-4f05-4c3b-aada-7b3246cabb7b (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event e70c5903-4f05-4c3b-aada-7b3246cabb7b (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 10:00:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 10:00:44 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7740003d-21db-4610-b8f3-9babed626268, vol_name:cephfs) < ""
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v494: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 94 KiB/s wr, 165 op/s
Feb 01 10:00:44 np0005604215.localdomain sudo[316090]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 10:00:44 np0005604215.localdomain sudo[316090]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:00:44 np0005604215.localdomain sudo[316090]: pam_unix(sudo:session): session closed for user root
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "Joe", "tenant_id": "f62ab07d2055417db4484bccb101ac2e", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:Joe, format:json, prefix:fs subvolume authorize, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, tenant_id:f62ab07d2055417db4484bccb101ac2e, vol_name:cephfs) < ""
Feb 01 10:00:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.Joe", "format": "json"} v 0)
Feb 01 10:00:44 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [volumes ERROR volumes.fs.operations.versions.subvolume_v1] auth ID: Joe is already in use
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:Joe, format:json, prefix:fs subvolume authorize, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, tenant_id:f62ab07d2055417db4484bccb101ac2e, vol_name:cephfs) < ""
Feb 01 10:00:44 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:44.833+0000 7f93ec23e640 -1 mgr.server reply reply (1) Operation not permitted auth ID: Joe is already in use
Feb 01 10:00:44 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (1) Operation not permitted auth ID: Joe is already in use
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "bcbe03f2-a495-4378-a713-02be779827da", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "bcbe03f2-a495-4378-a713-02be779827da", "format": "json"}]: dispatch
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7740003d-21db-4610-b8f3-9babed626268", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7740003d-21db-4610-b8f3-9babed626268", "format": "json"}]: dispatch
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: pgmap v494: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 2.8 MiB/s rd, 94 KiB/s wr, 165 op/s
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 01 10:00:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "snap_name": "814fa479-ec06-41a9-8fda-1c2cd3e9cb9c_88bbe56d-e093-47bb-a7e1-b7e3d657fbd1", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:814fa479-ec06-41a9-8fda-1c2cd3e9cb9c_88bbe56d-e093-47bb-a7e1-b7e3d657fbd1, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/d3128e87-da55-4740-ae37-00f9b18ac824/.meta.tmp'
Feb 01 10:00:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/d3128e87-da55-4740-ae37-00f9b18ac824/.meta.tmp' to config b'/volumes/_nogroup/d3128e87-da55-4740-ae37-00f9b18ac824/.meta'
Feb 01 10:00:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:814fa479-ec06-41a9-8fda-1c2cd3e9cb9c_88bbe56d-e093-47bb-a7e1-b7e3d657fbd1, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "snap_name": "814fa479-ec06-41a9-8fda-1c2cd3e9cb9c", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:814fa479-ec06-41a9-8fda-1c2cd3e9cb9c, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e240 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/d3128e87-da55-4740-ae37-00f9b18ac824/.meta.tmp'
Feb 01 10:00:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/d3128e87-da55-4740-ae37-00f9b18ac824/.meta.tmp' to config b'/volumes/_nogroup/d3128e87-da55-4740-ae37-00f9b18ac824/.meta'
Feb 01 10:00:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:814fa479-ec06-41a9-8fda-1c2cd3e9cb9c, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 10:00:45 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 16K writes, 64K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.01 MB/s
                                                          Cumulative WAL: 16K writes, 5709 syncs, 2.98 writes per sync, written: 0.05 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 39K keys, 11K commit groups, 1.0 writes per commit group, ingest: 26.10 MB, 0.04 MB/s
                                                          Interval WAL: 11K writes, 4797 syncs, 2.34 writes per sync, written: 0.03 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 10:00:46 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "Joe", "tenant_id": "f62ab07d2055417db4484bccb101ac2e", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:00:46 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "snap_name": "814fa479-ec06-41a9-8fda-1c2cd3e9cb9c_88bbe56d-e093-47bb-a7e1-b7e3d657fbd1", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:46 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "snap_name": "814fa479-ec06-41a9-8fda-1c2cd3e9cb9c", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v495: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 2.5 MiB/s rd, 49 KiB/s wr, 67 op/s
Feb 01 10:00:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "bcbe03f2-a495-4378-a713-02be779827da", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 01 10:00:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:bcbe03f2-a495-4378-a713-02be779827da, vol_name:cephfs) < ""
Feb 01 10:00:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:bcbe03f2-a495-4378-a713-02be779827da, vol_name:cephfs) < ""
Feb 01 10:00:46 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 10:00:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 10:00:46 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:46.820 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e241 e241: 6 total, 6 up, 6 in
Feb 01 10:00:47 np0005604215.localdomain ceph-mon[298604]: pgmap v495: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 2.5 MiB/s rd, 49 KiB/s wr, 67 op/s
Feb 01 10:00:47 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:00:47 np0005604215.localdomain ceph-mon[298604]: osdmap e241: 6 total, 6 up, 6 in
Feb 01 10:00:47 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e242 e242: 6 total, 6 up, 6 in
Feb 01 10:00:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:48.018 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:48 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "bcbe03f2-a495-4378-a713-02be779827da", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 01 10:00:48 np0005604215.localdomain ceph-mon[298604]: osdmap e242: 6 total, 6 up, 6 in
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v498: 177 pgs: 177 active+clean; 241 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 127 op/s
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "format": "json"}]: dispatch
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:d3128e87-da55-4740-ae37-00f9b18ac824, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:d3128e87-da55-4740-ae37-00f9b18ac824, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:48.570+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'd3128e87-da55-4740-ae37-00f9b18ac824' of type subvolume
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'd3128e87-da55-4740-ae37-00f9b18ac824' of type subvolume
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/d3128e87-da55-4740-ae37-00f9b18ac824'' moved to trashcan
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:d3128e87-da55-4740-ae37-00f9b18ac824, vol_name:cephfs) < ""
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "tempest-cephx-id-2026360705", "tenant_id": "f62ab07d2055417db4484bccb101ac2e", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2026360705, format:json, prefix:fs subvolume authorize, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, tenant_id:f62ab07d2055417db4484bccb101ac2e, vol_name:cephfs) < ""
Feb 01 10:00:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2026360705", "format": "json"} v 0)
Feb 01 10:00:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2026360705", "format": "json"} : dispatch
Feb 01 10:00:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID tempest-cephx-id-2026360705 with tenant f62ab07d2055417db4484bccb101ac2e
Feb 01 10:00:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2026360705", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:00:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2026360705", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:00:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2026360705, format:json, prefix:fs subvolume authorize, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, tenant_id:f62ab07d2055417db4484bccb101ac2e, vol_name:cephfs) < ""
Feb 01 10:00:49 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:00:49.054 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 10:00:49 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:00:49.055 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 10:00:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:49.089 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:49 np0005604215.localdomain ceph-mon[298604]: pgmap v498: 177 pgs: 177 active+clean; 241 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 127 op/s
Feb 01 10:00:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2026360705", "format": "json"} : dispatch
Feb 01 10:00:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2026360705", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:00:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2026360705", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:00:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2026360705", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:00:49 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3652661870' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:49 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3652661870' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7740003d-21db-4610-b8f3-9babed626268", "format": "json"}]: dispatch
Feb 01 10:00:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:7740003d-21db-4610-b8f3-9babed626268, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:7740003d-21db-4610-b8f3-9babed626268, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:49 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:49.482+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7740003d-21db-4610-b8f3-9babed626268' of type subvolume
Feb 01 10:00:49 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7740003d-21db-4610-b8f3-9babed626268' of type subvolume
Feb 01 10:00:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7740003d-21db-4610-b8f3-9babed626268", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7740003d-21db-4610-b8f3-9babed626268, vol_name:cephfs) < ""
Feb 01 10:00:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/7740003d-21db-4610-b8f3-9babed626268'' moved to trashcan
Feb 01 10:00:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7740003d-21db-4610-b8f3-9babed626268, vol_name:cephfs) < ""
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "bcbe03f2-a495-4378-a713-02be779827da", "format": "json"}]: dispatch
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:bcbe03f2-a495-4378-a713-02be779827da, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:bcbe03f2-a495-4378-a713-02be779827da, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:50 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:50.129+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'bcbe03f2-a495-4378-a713-02be779827da' of type subvolume
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'bcbe03f2-a495-4378-a713-02be779827da' of type subvolume
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "bcbe03f2-a495-4378-a713-02be779827da", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:bcbe03f2-a495-4378-a713-02be779827da, vol_name:cephfs) < ""
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/bcbe03f2-a495-4378-a713-02be779827da'' moved to trashcan
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:bcbe03f2-a495-4378-a713-02be779827da, vol_name:cephfs) < ""
Feb 01 10:00:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e242 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 10:00:50 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 16K writes, 62K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.01 MB/s
                                                          Cumulative WAL: 16K writes, 5389 syncs, 3.08 writes per sync, written: 0.05 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 38K keys, 11K commit groups, 1.0 writes per commit group, ingest: 30.50 MB, 0.05 MB/s
                                                          Interval WAL: 11K writes, 4649 syncs, 2.40 writes per sync, written: 0.03 GB, 0.05 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 10:00:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "format": "json"}]: dispatch
Feb 01 10:00:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d3128e87-da55-4740-ae37-00f9b18ac824", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "tempest-cephx-id-2026360705", "tenant_id": "f62ab07d2055417db4484bccb101ac2e", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v499: 177 pgs: 177 active+clean; 241 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 127 op/s
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, vol_name:cephfs) < ""
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/.meta.tmp'
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/.meta.tmp' to config b'/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/.meta'
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, vol_name:cephfs) < ""
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "format": "json"}]: dispatch
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, vol_name:cephfs) < ""
Feb 01 10:00:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, vol_name:cephfs) < ""
Feb 01 10:00:51 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:00:51.058 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 10:00:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e243 e243: 6 total, 6 up, 6 in
Feb 01 10:00:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7740003d-21db-4610-b8f3-9babed626268", "format": "json"}]: dispatch
Feb 01 10:00:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7740003d-21db-4610-b8f3-9babed626268", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "bcbe03f2-a495-4378-a713-02be779827da", "format": "json"}]: dispatch
Feb 01 10:00:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "bcbe03f2-a495-4378-a713-02be779827da", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/447088343' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/447088343' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:51 np0005604215.localdomain ceph-mon[298604]: pgmap v499: 177 pgs: 177 active+clean; 241 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 2.6 MiB/s rd, 2.7 MiB/s wr, 127 op/s
Feb 01 10:00:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:Joe, format:json, prefix:fs subvolume deauthorize, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes WARNING volumes.fs.operations.versions.subvolume_v1] deauthorized called for already-removed authID 'Joe' for subvolume 'ebeb9c1e-187e-4fbb-8711-dc250e4ab635'
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:Joe, format:json, prefix:fs subvolume deauthorize, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:Joe, format:json, prefix:fs subvolume evict, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=Joe, client_metadata.root=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:00:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:Joe, format:json, prefix:fs subvolume evict, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:51 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:51.853 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v501: 177 pgs: 177 active+clean; 241 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 3.6 MiB/s wr, 74 op/s
Feb 01 10:00:52 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "format": "json"}]: dispatch
Feb 01 10:00:52 np0005604215.localdomain ceph-mon[298604]: osdmap e243: 6 total, 6 up, 6 in
Feb 01 10:00:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:53.052 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1be67b6e-cf94-4bea-af19-93677534e470", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1be67b6e-cf94-4bea-af19-93677534e470, vol_name:cephfs) < ""
Feb 01 10:00:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/1be67b6e-cf94-4bea-af19-93677534e470/.meta.tmp'
Feb 01 10:00:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/1be67b6e-cf94-4bea-af19-93677534e470/.meta.tmp' to config b'/volumes/_nogroup/1be67b6e-cf94-4bea-af19-93677534e470/.meta'
Feb 01 10:00:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1be67b6e-cf94-4bea-af19-93677534e470, vol_name:cephfs) < ""
Feb 01 10:00:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1be67b6e-cf94-4bea-af19-93677534e470", "format": "json"}]: dispatch
Feb 01 10:00:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1be67b6e-cf94-4bea-af19-93677534e470, vol_name:cephfs) < ""
Feb 01 10:00:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1be67b6e-cf94-4bea-af19-93677534e470, vol_name:cephfs) < ""
Feb 01 10:00:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e244 e244: 6 total, 6 up, 6 in
Feb 01 10:00:53 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 01 10:00:53 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 01 10:00:53 np0005604215.localdomain ceph-mon[298604]: pgmap v501: 177 pgs: 177 active+clean; 241 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 43 KiB/s rd, 3.6 MiB/s wr, 74 op/s
Feb 01 10:00:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "auth_id": "tempest-cephx-id-397577304", "tenant_id": "ff1159417622494a84300007e5ed57fa", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:00:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-397577304, format:json, prefix:fs subvolume authorize, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, tenant_id:ff1159417622494a84300007e5ed57fa, vol_name:cephfs) < ""
Feb 01 10:00:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-397577304", "format": "json"} v 0)
Feb 01 10:00:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-397577304", "format": "json"} : dispatch
Feb 01 10:00:53 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID tempest-cephx-id-397577304 with tenant ff1159417622494a84300007e5ed57fa
Feb 01 10:00:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-397577304", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:00:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-397577304", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:00:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-397577304, format:json, prefix:fs subvolume authorize, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, tenant_id:ff1159417622494a84300007e5ed57fa, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cf68d29d-a061-4145-ba2b-6bee3a2be2df", "format": "json"}]: dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:cf68d29d-a061-4145-ba2b-6bee3a2be2df, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:cf68d29d-a061-4145-ba2b-6bee3a2be2df, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:54.069+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'cf68d29d-a061-4145-ba2b-6bee3a2be2df' of type subvolume
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'cf68d29d-a061-4145-ba2b-6bee3a2be2df' of type subvolume
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cf68d29d-a061-4145-ba2b-6bee3a2be2df", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:cf68d29d-a061-4145-ba2b-6bee3a2be2df, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/cf68d29d-a061-4145-ba2b-6bee3a2be2df'' moved to trashcan
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:cf68d29d-a061-4145-ba2b-6bee3a2be2df, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:54.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v503: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 64 KiB/s wr, 82 op/s
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1be67b6e-cf94-4bea-af19-93677534e470", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1be67b6e-cf94-4bea-af19-93677534e470", "format": "json"}]: dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: osdmap e244: 6 total, 6 up, 6 in
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-397577304", "format": "json"} : dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-397577304", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-397577304", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-397577304", "caps": ["mds", "allow rw path=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12", "osd", "allow rw pool=manila_data namespace=fsvolumens_ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "auth_id": "tempest-cephx-id-397577304", "format": "json"}]: dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-397577304, format:json, prefix:fs subvolume deauthorize, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-397577304", "format": "json"} v 0)
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-397577304", "format": "json"} : dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-397577304"} v 0)
Feb 01 10:00:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-397577304"} : dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-397577304, format:json, prefix:fs subvolume deauthorize, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "auth_id": "tempest-cephx-id-397577304", "format": "json"}]: dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-397577304, format:json, prefix:fs subvolume evict, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-397577304, client_metadata.root=/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0/e2b746e7-0779-4d7e-9231-873a16270c12
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-397577304, format:json, prefix:fs subvolume evict, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "format": "json"}]: dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:54.772+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ea35db83-15a2-4b5c-b6d7-6a25b52b26b0' of type subvolume
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ea35db83-15a2-4b5c-b6d7-6a25b52b26b0' of type subvolume
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, vol_name:cephfs) < ""
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/ea35db83-15a2-4b5c-b6d7-6a25b52b26b0'' moved to trashcan
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ea35db83-15a2-4b5c-b6d7-6a25b52b26b0, vol_name:cephfs) < ""
Feb 01 10:00:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "tempest-cephx-id-2026360705", "format": "json"}]: dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2026360705, format:json, prefix:fs subvolume deauthorize, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:55.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2026360705", "format": "json"} v 0)
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2026360705", "format": "json"} : dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2026360705"} v 0)
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2026360705"} : dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2026360705, format:json, prefix:fs subvolume deauthorize, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "tempest-cephx-id-2026360705", "format": "json"}]: dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2026360705, format:json, prefix:fs subvolume evict, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e244 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:00:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-2026360705, client_metadata.root=/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635/8b5ea12c-b039-40af-b27d-ed5446708464
Feb 01 10:00:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:00:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2026360705, format:json, prefix:fs subvolume evict, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "auth_id": "tempest-cephx-id-397577304", "tenant_id": "ff1159417622494a84300007e5ed57fa", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cf68d29d-a061-4145-ba2b-6bee3a2be2df", "format": "json"}]: dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cf68d29d-a061-4145-ba2b-6bee3a2be2df", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: pgmap v503: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 55 KiB/s rd, 64 KiB/s wr, 82 op/s
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-397577304"} : dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-397577304", "format": "json"} : dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-397577304"} : dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-397577304"}]': finished
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2026360705"} : dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2026360705", "format": "json"} : dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2026360705"} : dispatch
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2026360705"}]': finished
Feb 01 10:00:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e245 e245: 6 total, 6 up, 6 in
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.127 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.128 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.154 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.154 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.155 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.155 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.155 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:00:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v505: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 59 KiB/s rd, 69 KiB/s wr, 89 op/s
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "auth_id": "tempest-cephx-id-397577304", "format": "json"}]: dispatch
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "auth_id": "tempest-cephx-id-397577304", "format": "json"}]: dispatch
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "format": "json"}]: dispatch
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ea35db83-15a2-4b5c-b6d7-6a25b52b26b0", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "tempest-cephx-id-2026360705", "format": "json"}]: dispatch
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "auth_id": "tempest-cephx-id-2026360705", "format": "json"}]: dispatch
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3561657365' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: osdmap e245: 6 total, 6 up, 6 in
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e246 e246: 6 total, 6 up, 6 in
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3082454826' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.623 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.468s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:00:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1be67b6e-cf94-4bea-af19-93677534e470", "format": "json"}]: dispatch
Feb 01 10:00:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:1be67b6e-cf94-4bea-af19-93677534e470, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:1be67b6e-cf94-4bea-af19-93677534e470, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:00:56 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1be67b6e-cf94-4bea-af19-93677534e470' of type subvolume
Feb 01 10:00:56 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:00:56.645+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1be67b6e-cf94-4bea-af19-93677534e470' of type subvolume
Feb 01 10:00:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1be67b6e-cf94-4bea-af19-93677534e470", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1be67b6e-cf94-4bea-af19-93677534e470, vol_name:cephfs) < ""
Feb 01 10:00:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/1be67b6e-cf94-4bea-af19-93677534e470'' moved to trashcan
Feb 01 10:00:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:00:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1be67b6e-cf94-4bea-af19-93677534e470, vol_name:cephfs) < ""
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.835 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.837 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11539MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.838 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.838 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:00:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e247 e247: 6 total, 6 up, 6 in
Feb 01 10:00:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:56.886 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:57.136 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 10:00:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:57.137 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 10:00:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:57.316 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/767252392' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/767252392' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: pgmap v505: 177 pgs: 177 active+clean; 195 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 59 KiB/s rd, 69 KiB/s wr, 89 op/s
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: osdmap e246: 6 total, 6 up, 6 in
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3082454826' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1be67b6e-cf94-4bea-af19-93677534e470", "format": "json"}]: dispatch
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1be67b6e-cf94-4bea-af19-93677534e470", "force": true, "format": "json"}]: dispatch
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1012344780' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: osdmap e247: 6 total, 6 up, 6 in
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/767252392' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/767252392' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:00:57 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1927343437' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:00:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:00:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:00:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:00:57 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:00:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:57.790 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.474s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:00:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:57.798 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 10:00:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:57.821 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 10:00:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:57.824 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 10:00:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:57.825 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.986s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:00:57 np0005604215.localdomain podman[316157]: 2026-02-01 10:00:57.892760649 +0000 UTC m=+0.099653176 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20260127, io.buildah.version=1.41.3)
Feb 01 10:00:57 np0005604215.localdomain podman[316155]: 2026-02-01 10:00:57.96562528 +0000 UTC m=+0.175198251 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, distribution-scope=public, architecture=x86_64, name=ubi9/ubi-minimal, managed_by=edpm_ansible, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1769056855, com.redhat.component=ubi9-minimal-container, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z)
Feb 01 10:00:57 np0005604215.localdomain podman[316158]: 2026-02-01 10:00:57.988524613 +0000 UTC m=+0.197744533 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 10:00:57 np0005604215.localdomain podman[316158]: 2026-02-01 10:00:57.998219836 +0000 UTC m=+0.207439786 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 10:00:58 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:00:58 np0005604215.localdomain podman[316155]: 2026-02-01 10:00:58.047767599 +0000 UTC m=+0.257340640 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, version=9.7, vendor=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, managed_by=edpm_ansible, config_id=openstack_network_exporter)
Feb 01 10:00:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:58.053 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:00:58 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:00:58 np0005604215.localdomain podman[316156]: 2026-02-01 10:00:58.090726229 +0000 UTC m=+0.299633039 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:00:58 np0005604215.localdomain podman[316157]: 2026-02-01 10:00:58.099316897 +0000 UTC m=+0.306209424 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 10:00:58 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:00:58 np0005604215.localdomain podman[316156]: 2026-02-01 10:00:58.125853523 +0000 UTC m=+0.334760343 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:00:58 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:00:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v508: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 68 KiB/s rd, 140 KiB/s wr, 107 op/s
Feb 01 10:00:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 01 10:00:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:Joe, format:json, prefix:fs subvolume deauthorize, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, vol_name:cephfs) < ""
Feb 01 10:00:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1927343437' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:00:58 np0005604215.localdomain ceph-mon[298604]: pgmap v508: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 68 KiB/s rd, 140 KiB/s wr, 107 op/s
Feb 01 10:00:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.Joe", "format": "json"} v 0)
Feb 01 10:00:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 01 10:00:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.Joe"} v 0)
Feb 01 10:00:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Feb 01 10:00:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:Joe, format:json, prefix:fs subvolume deauthorize, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, vol_name:cephfs) < ""
Feb 01 10:00:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 01 10:00:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:Joe, format:json, prefix:fs subvolume evict, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, vol_name:cephfs) < ""
Feb 01 10:00:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=Joe, client_metadata.root=/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31/a9d003d6-2a9d-4a04-919b-f3994828d27e
Feb 01 10:00:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:00:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:Joe, format:json, prefix:fs subvolume evict, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, vol_name:cephfs) < ""
Feb 01 10:00:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:58.797 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:00:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:58.798 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:00:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:58.798 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:00:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:58.799 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 10:00:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:00:59.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:00:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 01 10:00:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Feb 01 10:00:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.Joe", "format": "json"} : dispatch
Feb 01 10:00:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.Joe"} : dispatch
Feb 01 10:00:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.Joe"}]': finished
Feb 01 10:00:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "auth_id": "Joe", "format": "json"}]: dispatch
Feb 01 10:00:59 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e248 e248: 6 total, 6 up, 6 in
Feb 01 10:00:59 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "dad20548-fd8b-498e-8859-9201c657d5e6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:00:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:dad20548-fd8b-498e-8859-9201c657d5e6, vol_name:cephfs) < ""
Feb 01 10:00:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/dad20548-fd8b-498e-8859-9201c657d5e6/.meta.tmp'
Feb 01 10:00:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/dad20548-fd8b-498e-8859-9201c657d5e6/.meta.tmp' to config b'/volumes/_nogroup/dad20548-fd8b-498e-8859-9201c657d5e6/.meta'
Feb 01 10:00:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:dad20548-fd8b-498e-8859-9201c657d5e6, vol_name:cephfs) < ""
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "dad20548-fd8b-498e-8859-9201c657d5e6", "format": "json"}]: dispatch
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:dad20548-fd8b-498e-8859-9201c657d5e6, vol_name:cephfs) < ""
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:dad20548-fd8b-498e-8859-9201c657d5e6, vol_name:cephfs) < ""
Feb 01 10:01:00 np0005604215.localdomain podman[236852]: time="2026-02-01T10:01:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:01:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:01:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:01:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:01:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18299 "" "Go-http-client/1.1"
Feb 01 10:01:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e248 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "361f7103-8230-4c8b-80e5-41b9d7dd022d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:361f7103-8230-4c8b-80e5-41b9d7dd022d, vol_name:cephfs) < ""
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/361f7103-8230-4c8b-80e5-41b9d7dd022d/.meta.tmp'
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/361f7103-8230-4c8b-80e5-41b9d7dd022d/.meta.tmp' to config b'/volumes/_nogroup/361f7103-8230-4c8b-80e5-41b9d7dd022d/.meta'
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:361f7103-8230-4c8b-80e5-41b9d7dd022d, vol_name:cephfs) < ""
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "361f7103-8230-4c8b-80e5-41b9d7dd022d", "format": "json"}]: dispatch
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:361f7103-8230-4c8b-80e5-41b9d7dd022d, vol_name:cephfs) < ""
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:361f7103-8230-4c8b-80e5-41b9d7dd022d, vol_name:cephfs) < ""
Feb 01 10:01:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v510: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 70 KiB/s rd, 143 KiB/s wr, 109 op/s
Feb 01 10:01:00 np0005604215.localdomain ceph-mon[298604]: osdmap e248: 6 total, 6 up, 6 in
Feb 01 10:01:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "dad20548-fd8b-498e-8859-9201c657d5e6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "dad20548-fd8b-498e-8859-9201c657d5e6", "format": "json"}]: dispatch
Feb 01 10:01:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "361f7103-8230-4c8b-80e5-41b9d7dd022d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "361f7103-8230-4c8b-80e5-41b9d7dd022d", "format": "json"}]: dispatch
Feb 01 10:01:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:00 np0005604215.localdomain ceph-mon[298604]: pgmap v510: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 70 KiB/s rd, 143 KiB/s wr, 109 op/s
Feb 01 10:01:01 np0005604215.localdomain CROND[316238]: (root) CMD (run-parts /etc/cron.hourly)
Feb 01 10:01:01 np0005604215.localdomain run-parts[316241]: (/etc/cron.hourly) starting 0anacron
Feb 01 10:01:01 np0005604215.localdomain run-parts[316247]: (/etc/cron.hourly) finished 0anacron
Feb 01 10:01:01 np0005604215.localdomain CROND[316237]: (root) CMDEND (run-parts /etc/cron.hourly)
Feb 01 10:01:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:01:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:01:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:01:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:01:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:01:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:01:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e249 e249: 6 total, 6 up, 6 in
Feb 01 10:01:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "auth_id": "admin", "tenant_id": "d32ed6e558674454a1648ebe57d1a805", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:01:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:admin, format:json, prefix:fs subvolume authorize, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, tenant_id:d32ed6e558674454a1648ebe57d1a805, vol_name:cephfs) < ""
Feb 01 10:01:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin", "format": "json"} v 0)
Feb 01 10:01:01 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch
Feb 01 10:01:01 np0005604215.localdomain ceph-mgr[278126]: [volumes ERROR volumes.fs.operations.versions.subvolume_v1] auth ID: admin exists and not created by mgr plugin. Not allowed to modify
Feb 01 10:01:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:admin, format:json, prefix:fs subvolume authorize, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, tenant_id:d32ed6e558674454a1648ebe57d1a805, vol_name:cephfs) < ""
Feb 01 10:01:01 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:01.823+0000 7f93ec23e640 -1 mgr.server reply reply (1) Operation not permitted auth ID: admin exists and not created by mgr plugin. Not allowed to modify
Feb 01 10:01:01 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (1) Operation not permitted auth ID: admin exists and not created by mgr plugin. Not allowed to modify
Feb 01 10:01:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e250 e250: 6 total, 6 up, 6 in
Feb 01 10:01:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:01.928 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v513: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 62 KiB/s rd, 127 KiB/s wr, 97 op/s
Feb 01 10:01:02 np0005604215.localdomain ceph-mon[298604]: osdmap e249: 6 total, 6 up, 6 in
Feb 01 10:01:02 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "auth_id": "admin", "tenant_id": "d32ed6e558674454a1648ebe57d1a805", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:01:02 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin", "format": "json"} : dispatch
Feb 01 10:01:02 np0005604215.localdomain ceph-mon[298604]: osdmap e250: 6 total, 6 up, 6 in
Feb 01 10:01:02 np0005604215.localdomain ceph-mon[298604]: pgmap v513: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 62 KiB/s rd, 127 KiB/s wr, 97 op/s
Feb 01 10:01:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:03.069 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:03.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:01:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:03.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 01 10:01:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "dad20548-fd8b-498e-8859-9201c657d5e6", "format": "json"}]: dispatch
Feb 01 10:01:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:dad20548-fd8b-498e-8859-9201c657d5e6, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:dad20548-fd8b-498e-8859-9201c657d5e6, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:03 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:03.225+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'dad20548-fd8b-498e-8859-9201c657d5e6' of type subvolume
Feb 01 10:01:03 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'dad20548-fd8b-498e-8859-9201c657d5e6' of type subvolume
Feb 01 10:01:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "dad20548-fd8b-498e-8859-9201c657d5e6", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:dad20548-fd8b-498e-8859-9201c657d5e6, vol_name:cephfs) < ""
Feb 01 10:01:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/dad20548-fd8b-498e-8859-9201c657d5e6'' moved to trashcan
Feb 01 10:01:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:dad20548-fd8b-498e-8859-9201c657d5e6, vol_name:cephfs) < ""
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:01:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:01:03 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "dad20548-fd8b-498e-8859-9201c657d5e6", "format": "json"}]: dispatch
Feb 01 10:01:03 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "dad20548-fd8b-498e-8859-9201c657d5e6", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:03 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/982801201' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:03 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2319460251' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:01:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:04.115 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:01:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v514: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 48 KiB/s wr, 94 op/s
Feb 01 10:01:05 np0005604215.localdomain ceph-mon[298604]: pgmap v514: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 65 KiB/s rd, 48 KiB/s wr, 94 op/s
Feb 01 10:01:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:05.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:01:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:05.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 01 10:01:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:05.118 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "361f7103-8230-4c8b-80e5-41b9d7dd022d", "format": "json"}]: dispatch
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:361f7103-8230-4c8b-80e5-41b9d7dd022d, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:361f7103-8230-4c8b-80e5-41b9d7dd022d, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:05 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:05.129+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '361f7103-8230-4c8b-80e5-41b9d7dd022d' of type subvolume
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '361f7103-8230-4c8b-80e5-41b9d7dd022d' of type subvolume
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "361f7103-8230-4c8b-80e5-41b9d7dd022d", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:361f7103-8230-4c8b-80e5-41b9d7dd022d, vol_name:cephfs) < ""
Feb 01 10:01:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e250 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/361f7103-8230-4c8b-80e5-41b9d7dd022d'' moved to trashcan
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:361f7103-8230-4c8b-80e5-41b9d7dd022d, vol_name:cephfs) < ""
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "auth_id": "david", "tenant_id": "d32ed6e558674454a1648ebe57d1a805", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:david, format:json, prefix:fs subvolume authorize, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, tenant_id:d32ed6e558674454a1648ebe57d1a805, vol_name:cephfs) < ""
Feb 01 10:01:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.david", "format": "json"} v 0)
Feb 01 10:01:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 01 10:01:05 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID david with tenant d32ed6e558674454a1648ebe57d1a805
Feb 01 10:01:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:01:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:01:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:david, format:json, prefix:fs subvolume authorize, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, tenant_id:d32ed6e558674454a1648ebe57d1a805, vol_name:cephfs) < ""
Feb 01 10:01:06 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "361f7103-8230-4c8b-80e5-41b9d7dd022d", "format": "json"}]: dispatch
Feb 01 10:01:06 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "361f7103-8230-4c8b-80e5-41b9d7dd022d", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 01 10:01:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:01:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:01:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.david", "caps": ["mds", "allow rw path=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0", "osd", "allow rw pool=manila_data namespace=fsvolumens_ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:01:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v515: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 43 KiB/s wr, 83 op/s
Feb 01 10:01:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "334d218b-dfca-41dc-92c9-1bb7ec15d360", "format": "json"}]: dispatch
Feb 01 10:01:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:334d218b-dfca-41dc-92c9-1bb7ec15d360, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:334d218b-dfca-41dc-92c9-1bb7ec15d360, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8cb94272-9b85-4a3b-a318-d4ded9f25bee", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:8cb94272-9b85-4a3b-a318-d4ded9f25bee, vol_name:cephfs) < ""
Feb 01 10:01:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:06.970 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:07 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "auth_id": "david", "tenant_id": "d32ed6e558674454a1648ebe57d1a805", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:01:07 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/4281456669' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:01:07 np0005604215.localdomain ceph-mon[298604]: pgmap v515: 177 pgs: 177 active+clean; 196 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 57 KiB/s rd, 43 KiB/s wr, 83 op/s
Feb 01 10:01:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/8cb94272-9b85-4a3b-a318-d4ded9f25bee/.meta.tmp'
Feb 01 10:01:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/8cb94272-9b85-4a3b-a318-d4ded9f25bee/.meta.tmp' to config b'/volumes/_nogroup/8cb94272-9b85-4a3b-a318-d4ded9f25bee/.meta'
Feb 01 10:01:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:8cb94272-9b85-4a3b-a318-d4ded9f25bee, vol_name:cephfs) < ""
Feb 01 10:01:07 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8cb94272-9b85-4a3b-a318-d4ded9f25bee", "format": "json"}]: dispatch
Feb 01 10:01:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:8cb94272-9b85-4a3b-a318-d4ded9f25bee, vol_name:cephfs) < ""
Feb 01 10:01:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:8cb94272-9b85-4a3b-a318-d4ded9f25bee, vol_name:cephfs) < ""
Feb 01 10:01:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:08.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:01:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:08.104 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:08 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "334d218b-dfca-41dc-92c9-1bb7ec15d360", "format": "json"}]: dispatch
Feb 01 10:01:08 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "8cb94272-9b85-4a3b-a318-d4ded9f25bee", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:08 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "8cb94272-9b85-4a3b-a318-d4ded9f25bee", "format": "json"}]: dispatch
Feb 01 10:01:08 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v516: 177 pgs: 177 active+clean; 452 MiB data, 1.7 GiB used, 40 GiB / 42 GiB avail; 121 KiB/s rd, 32 MiB/s wr, 190 op/s
Feb 01 10:01:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, vol_name:cephfs) < ""
Feb 01 10:01:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c1ec6001-c4f0-42e8-a3ae-66c185a36061/.meta.tmp'
Feb 01 10:01:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c1ec6001-c4f0-42e8-a3ae-66c185a36061/.meta.tmp' to config b'/volumes/_nogroup/c1ec6001-c4f0-42e8-a3ae-66c185a36061/.meta'
Feb 01 10:01:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, vol_name:cephfs) < ""
Feb 01 10:01:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "format": "json"}]: dispatch
Feb 01 10:01:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, vol_name:cephfs) < ""
Feb 01 10:01:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, vol_name:cephfs) < ""
Feb 01 10:01:08 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:01:08 np0005604215.localdomain systemd[1]: tmp-crun.Z8Xvik.mount: Deactivated successfully.
Feb 01 10:01:08 np0005604215.localdomain podman[316248]: 2026-02-01 10:01:08.884204215 +0000 UTC m=+0.094255579 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 01 10:01:08 np0005604215.localdomain podman[316248]: 2026-02-01 10:01:08.89561181 +0000 UTC m=+0.105663134 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 01 10:01:08 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:01:09 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e251 e251: 6 total, 6 up, 6 in
Feb 01 10:01:09 np0005604215.localdomain ceph-mon[298604]: pgmap v516: 177 pgs: 177 active+clean; 452 MiB data, 1.7 GiB used, 40 GiB / 42 GiB avail; 121 KiB/s rd, 32 MiB/s wr, 190 op/s
Feb 01 10:01:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e251 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:10 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:10 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "format": "json"}]: dispatch
Feb 01 10:01:10 np0005604215.localdomain ceph-mon[298604]: osdmap e251: 6 total, 6 up, 6 in
Feb 01 10:01:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v518: 177 pgs: 177 active+clean; 452 MiB data, 1.7 GiB used, 40 GiB / 42 GiB avail; 114 KiB/s rd, 30 MiB/s wr, 179 op/s
Feb 01 10:01:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "334d218b-dfca-41dc-92c9-1bb7ec15d360_74b9c1d7-8ee2-4566-b420-d2cfb354ff64", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:334d218b-dfca-41dc-92c9-1bb7ec15d360_74b9c1d7-8ee2-4566-b420-d2cfb354ff64, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:334d218b-dfca-41dc-92c9-1bb7ec15d360_74b9c1d7-8ee2-4566-b420-d2cfb354ff64, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "334d218b-dfca-41dc-92c9-1bb7ec15d360", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:334d218b-dfca-41dc-92c9-1bb7ec15d360, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:334d218b-dfca-41dc-92c9-1bb7ec15d360, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:11 np0005604215.localdomain ceph-mon[298604]: pgmap v518: 177 pgs: 177 active+clean; 452 MiB data, 1.7 GiB used, 40 GiB / 42 GiB avail; 114 KiB/s rd, 30 MiB/s wr, 179 op/s
Feb 01 10:01:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8cb94272-9b85-4a3b-a318-d4ded9f25bee", "format": "json"}]: dispatch
Feb 01 10:01:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:8cb94272-9b85-4a3b-a318-d4ded9f25bee, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:8cb94272-9b85-4a3b-a318-d4ded9f25bee, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:11 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '8cb94272-9b85-4a3b-a318-d4ded9f25bee' of type subvolume
Feb 01 10:01:11 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:11.766+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '8cb94272-9b85-4a3b-a318-d4ded9f25bee' of type subvolume
Feb 01 10:01:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8cb94272-9b85-4a3b-a318-d4ded9f25bee", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:8cb94272-9b85-4a3b-a318-d4ded9f25bee, vol_name:cephfs) < ""
Feb 01 10:01:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/8cb94272-9b85-4a3b-a318-d4ded9f25bee'' moved to trashcan
Feb 01 10:01:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:01:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:8cb94272-9b85-4a3b-a318-d4ded9f25bee, vol_name:cephfs) < ""
Feb 01 10:01:11 np0005604215.localdomain systemd[1]: tmp-crun.9KAhie.mount: Deactivated successfully.
Feb 01 10:01:11 np0005604215.localdomain podman[316267]: 2026-02-01 10:01:11.882352404 +0000 UTC m=+0.088727517 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 10:01:11 np0005604215.localdomain podman[316267]: 2026-02-01 10:01:11.891719555 +0000 UTC m=+0.098094628 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:01:11 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:01:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:11.971 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "auth_id": "david", "tenant_id": "f62ab07d2055417db4484bccb101ac2e", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:01:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:david, format:json, prefix:fs subvolume authorize, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, tenant_id:f62ab07d2055417db4484bccb101ac2e, vol_name:cephfs) < ""
Feb 01 10:01:12 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.david", "format": "json"} v 0)
Feb 01 10:01:12 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 01 10:01:12 np0005604215.localdomain ceph-mgr[278126]: [volumes ERROR volumes.fs.operations.versions.subvolume_v1] auth ID: david is already in use
Feb 01 10:01:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:david, format:json, prefix:fs subvolume authorize, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, tenant_id:f62ab07d2055417db4484bccb101ac2e, vol_name:cephfs) < ""
Feb 01 10:01:12 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:12.022+0000 7f93ec23e640 -1 mgr.server reply reply (1) Operation not permitted auth ID: david is already in use
Feb 01 10:01:12 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (1) Operation not permitted auth ID: david is already in use
Feb 01 10:01:12 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "334d218b-dfca-41dc-92c9-1bb7ec15d360_74b9c1d7-8ee2-4566-b420-d2cfb354ff64", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:12 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "334d218b-dfca-41dc-92c9-1bb7ec15d360", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:12 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 01 10:01:12 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/4022670018' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:12 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/4022670018' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v519: 177 pgs: 177 active+clean; 452 MiB data, 1.7 GiB used, 40 GiB / 42 GiB avail; 97 KiB/s rd, 26 MiB/s wr, 152 op/s
Feb 01 10:01:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:13.148 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:13 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "8cb94272-9b85-4a3b-a318-d4ded9f25bee", "format": "json"}]: dispatch
Feb 01 10:01:13 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "8cb94272-9b85-4a3b-a318-d4ded9f25bee", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:13 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "auth_id": "david", "tenant_id": "f62ab07d2055417db4484bccb101ac2e", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:01:13 np0005604215.localdomain ceph-mon[298604]: pgmap v519: 177 pgs: 177 active+clean; 452 MiB data, 1.7 GiB used, 40 GiB / 42 GiB avail; 97 KiB/s rd, 26 MiB/s wr, 152 op/s
Feb 01 10:01:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v520: 177 pgs: 177 active+clean; 881 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 127 KiB/s rd, 68 MiB/s wr, 215 op/s
Feb 01 10:01:14 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2680760182' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:14 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2680760182' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:14 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e252 e252: 6 total, 6 up, 6 in
Feb 01 10:01:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e252 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "auth_id": "david", "format": "json"}]: dispatch
Feb 01 10:01:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:david, format:json, prefix:fs subvolume deauthorize, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, vol_name:cephfs) < ""
Feb 01 10:01:15 np0005604215.localdomain ceph-mgr[278126]: [volumes WARNING volumes.fs.operations.versions.subvolume_v1] deauthorized called for already-removed authID 'david' for subvolume 'c1ec6001-c4f0-42e8-a3ae-66c185a36061'
Feb 01 10:01:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:david, format:json, prefix:fs subvolume deauthorize, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, vol_name:cephfs) < ""
Feb 01 10:01:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "auth_id": "david", "format": "json"}]: dispatch
Feb 01 10:01:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:david, format:json, prefix:fs subvolume evict, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, vol_name:cephfs) < ""
Feb 01 10:01:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=david, client_metadata.root=/volumes/_nogroup/c1ec6001-c4f0-42e8-a3ae-66c185a36061/f886650d-b51c-4718-ba32-df2300a26036
Feb 01 10:01:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:01:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:david, format:json, prefix:fs subvolume evict, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, vol_name:cephfs) < ""
Feb 01 10:01:15 np0005604215.localdomain ceph-mon[298604]: pgmap v520: 177 pgs: 177 active+clean; 881 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 127 KiB/s rd, 68 MiB/s wr, 215 op/s
Feb 01 10:01:15 np0005604215.localdomain ceph-mon[298604]: osdmap e252: 6 total, 6 up, 6 in
Feb 01 10:01:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v522: 177 pgs: 177 active+clean; 881 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 87 KiB/s rd, 54 MiB/s wr, 150 op/s
Feb 01 10:01:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "auth_id": "david", "format": "json"}]: dispatch
Feb 01 10:01:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "auth_id": "david", "format": "json"}]: dispatch
Feb 01 10:01:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "0463ec36-25b3-41dc-9d07-408b582c340a", "format": "json"}]: dispatch
Feb 01 10:01:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:0463ec36-25b3-41dc-9d07-408b582c340a, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:0463ec36-25b3-41dc-9d07-408b582c340a, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:16 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e253 e253: 6 total, 6 up, 6 in
Feb 01 10:01:16 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:16.974 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:17 np0005604215.localdomain ceph-mon[298604]: pgmap v522: 177 pgs: 177 active+clean; 881 MiB data, 2.9 GiB used, 39 GiB / 42 GiB avail; 87 KiB/s rd, 54 MiB/s wr, 150 op/s
Feb 01 10:01:17 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "0463ec36-25b3-41dc-9d07-408b582c340a", "format": "json"}]: dispatch
Feb 01 10:01:17 np0005604215.localdomain ceph-mon[298604]: osdmap e253: 6 total, 6 up, 6 in
Feb 01 10:01:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:18.190 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v524: 177 pgs: 177 active+clean; 1.2 GiB data, 4.0 GiB used, 38 GiB / 42 GiB avail; 171 KiB/s rd, 96 MiB/s wr, 286 op/s
Feb 01 10:01:18 np0005604215.localdomain ceph-mon[298604]: pgmap v524: 177 pgs: 177 active+clean; 1.2 GiB data, 4.0 GiB used, 38 GiB / 42 GiB avail; 171 KiB/s rd, 96 MiB/s wr, 286 op/s
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "auth_id": "david", "format": "json"}]: dispatch
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:david, format:json, prefix:fs subvolume deauthorize, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:01:18 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.david", "format": "json"} v 0)
Feb 01 10:01:18 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 01 10:01:18 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.david"} v 0)
Feb 01 10:01:18 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:david, format:json, prefix:fs subvolume deauthorize, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "auth_id": "david", "format": "json"}]: dispatch
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:david, format:json, prefix:fs subvolume evict, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=david, client_metadata.root=/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f/28cffc9e-f368-4287-bbca-51fb2339a1c0
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:david, format:json, prefix:fs subvolume evict, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "90d0a515-163b-4cd0-9158-f05911007a1a", "size": 4294967296, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:4294967296, sub_name:90d0a515-163b-4cd0-9158-f05911007a1a, vol_name:cephfs) < ""
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/90d0a515-163b-4cd0-9158-f05911007a1a/.meta.tmp'
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/90d0a515-163b-4cd0-9158-f05911007a1a/.meta.tmp' to config b'/volumes/_nogroup/90d0a515-163b-4cd0-9158-f05911007a1a/.meta'
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:4294967296, sub_name:90d0a515-163b-4cd0-9158-f05911007a1a, vol_name:cephfs) < ""
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "90d0a515-163b-4cd0-9158-f05911007a1a", "format": "json"}]: dispatch
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:90d0a515-163b-4cd0-9158-f05911007a1a, vol_name:cephfs) < ""
Feb 01 10:01:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:90d0a515-163b-4cd0-9158-f05911007a1a, vol_name:cephfs) < ""
Feb 01 10:01:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "auth_id": "david", "format": "json"}]: dispatch
Feb 01 10:01:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Feb 01 10:01:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.david", "format": "json"} : dispatch
Feb 01 10:01:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.david"} : dispatch
Feb 01 10:01:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.david"}]': finished
Feb 01 10:01:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "auth_id": "david", "format": "json"}]: dispatch
Feb 01 10:01:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "90d0a515-163b-4cd0-9158-f05911007a1a", "size": 4294967296, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "90d0a515-163b-4cd0-9158-f05911007a1a", "format": "json"}]: dispatch
Feb 01 10:01:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:19 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e254 e254: 6 total, 6 up, 6 in
Feb 01 10:01:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "204f122a-bf12-4f0e-934c-5da070005009", "format": "json"}]: dispatch
Feb 01 10:01:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:204f122a-bf12-4f0e-934c-5da070005009, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:204f122a-bf12-4f0e-934c-5da070005009, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e254 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v526: 177 pgs: 177 active+clean; 1.2 GiB data, 4.0 GiB used, 38 GiB / 42 GiB avail; 112 KiB/s rd, 57 MiB/s wr, 181 op/s
Feb 01 10:01:20 np0005604215.localdomain ceph-mon[298604]: osdmap e254: 6 total, 6 up, 6 in
Feb 01 10:01:20 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "204f122a-bf12-4f0e-934c-5da070005009", "format": "json"}]: dispatch
Feb 01 10:01:20 np0005604215.localdomain ceph-mon[298604]: pgmap v526: 177 pgs: 177 active+clean; 1.2 GiB data, 4.0 GiB used, 38 GiB / 42 GiB avail; 112 KiB/s rd, 57 MiB/s wr, 181 op/s
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_10:01:21
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['vms', '.mgr', 'backups', 'volumes', 'manila_metadata', 'manila_data', 'images']
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:01:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e255 e255: 6 total, 6 up, 6 in
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014874720826353993 of space, bias 1.0, pg target 0.2969985924995347 quantized to 32 (current 32)
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 0.07146954390005583 of space, bias 1.0, pg target 14.222439236111109 quantized to 32 (current 32)
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 2.453674623115578e-06 of space, bias 1.0, pg target 0.0004539298052763819 quantized to 32 (current 32)
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.0003500575795644891 of space, bias 4.0, pg target 0.25904260887772196 quantized to 16 (current 16)
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:01:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:01:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e256 e256: 6 total, 6 up, 6 in
Feb 01 10:01:21 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:21.978 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/49b736f9-feaf-4b7f-9d80-10ecfc8b132a/.meta.tmp'
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/49b736f9-feaf-4b7f-9d80-10ecfc8b132a/.meta.tmp' to config b'/volumes/_nogroup/49b736f9-feaf-4b7f-9d80-10ecfc8b132a/.meta'
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "format": "json"}]: dispatch
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v529: 177 pgs: 177 active+clean; 1.2 GiB data, 4.0 GiB used, 38 GiB / 42 GiB avail; 122 KiB/s rd, 62 MiB/s wr, 198 op/s
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "62834cfd-0e6e-4a9a-8e7a-94535f5d68c6", "size": 3221225472, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:3221225472, sub_name:62834cfd-0e6e-4a9a-8e7a-94535f5d68c6, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-mon[298604]: osdmap e255: 6 total, 6 up, 6 in
Feb 01 10:01:22 np0005604215.localdomain ceph-mon[298604]: osdmap e256: 6 total, 6 up, 6 in
Feb 01 10:01:22 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:22 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "format": "json"}]: dispatch
Feb 01 10:01:22 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:22 np0005604215.localdomain ceph-mon[298604]: pgmap v529: 177 pgs: 177 active+clean; 1.2 GiB data, 4.0 GiB used, 38 GiB / 42 GiB avail; 122 KiB/s rd, 62 MiB/s wr, 198 op/s
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/62834cfd-0e6e-4a9a-8e7a-94535f5d68c6/.meta.tmp'
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/62834cfd-0e6e-4a9a-8e7a-94535f5d68c6/.meta.tmp' to config b'/volumes/_nogroup/62834cfd-0e6e-4a9a-8e7a-94535f5d68c6/.meta'
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:3221225472, sub_name:62834cfd-0e6e-4a9a-8e7a-94535f5d68c6, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "62834cfd-0e6e-4a9a-8e7a-94535f5d68c6", "format": "json"}]: dispatch
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:62834cfd-0e6e-4a9a-8e7a-94535f5d68c6, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:62834cfd-0e6e-4a9a-8e7a-94535f5d68c6, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "format": "json"}]: dispatch
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:22.668+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'c1ec6001-c4f0-42e8-a3ae-66c185a36061' of type subvolume
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'c1ec6001-c4f0-42e8-a3ae-66c185a36061' of type subvolume
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/c1ec6001-c4f0-42e8-a3ae-66c185a36061'' moved to trashcan
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:c1ec6001-c4f0-42e8-a3ae-66c185a36061, vol_name:cephfs) < ""
Feb 01 10:01:22 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e257 e257: 6 total, 6 up, 6 in
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2454881203' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2454881203' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:23.223 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "8a7eb109-e3b5-473d-add7-9e1ee2a73cfe", "format": "json"}]: dispatch
Feb 01 10:01:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:8a7eb109-e3b5-473d-add7-9e1ee2a73cfe, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "62834cfd-0e6e-4a9a-8e7a-94535f5d68c6", "size": 3221225472, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "62834cfd-0e6e-4a9a-8e7a-94535f5d68c6", "format": "json"}]: dispatch
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "format": "json"}]: dispatch
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c1ec6001-c4f0-42e8-a3ae-66c185a36061", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: osdmap e257: 6 total, 6 up, 6 in
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2454881203' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:23 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2454881203' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:8a7eb109-e3b5-473d-add7-9e1ee2a73cfe, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v531: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 88 KiB/s rd, 76 KiB/s wr, 180 op/s
Feb 01 10:01:24 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "8a7eb109-e3b5-473d-add7-9e1ee2a73cfe", "format": "json"}]: dispatch
Feb 01 10:01:24 np0005604215.localdomain ceph-mon[298604]: pgmap v531: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 88 KiB/s rd, 76 KiB/s wr, 180 op/s
Feb 01 10:01:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e257 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "snap_name": "e6f47d41-176f-47be-9b1d-71861fa50733", "format": "json"}]: dispatch
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:e6f47d41-176f-47be-9b1d-71861fa50733, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:e6f47d41-176f-47be-9b1d-71861fa50733, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e258 e258: 6 total, 6 up, 6 in
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "format": "json"}]: dispatch
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:25 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:25.610+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ebeb9c1e-187e-4fbb-8711-dc250e4ab635' of type subvolume
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ebeb9c1e-187e-4fbb-8711-dc250e4ab635' of type subvolume
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/ebeb9c1e-187e-4fbb-8711-dc250e4ab635'' moved to trashcan
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ebeb9c1e-187e-4fbb-8711-dc250e4ab635, vol_name:cephfs) < ""
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "90d0a515-163b-4cd0-9158-f05911007a1a", "format": "json"}]: dispatch
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:90d0a515-163b-4cd0-9158-f05911007a1a, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:90d0a515-163b-4cd0-9158-f05911007a1a, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:25 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:25.768+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '90d0a515-163b-4cd0-9158-f05911007a1a' of type subvolume
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '90d0a515-163b-4cd0-9158-f05911007a1a' of type subvolume
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "90d0a515-163b-4cd0-9158-f05911007a1a", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:90d0a515-163b-4cd0-9158-f05911007a1a, vol_name:cephfs) < ""
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/90d0a515-163b-4cd0-9158-f05911007a1a'' moved to trashcan
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:90d0a515-163b-4cd0-9158-f05911007a1a, vol_name:cephfs) < ""
Feb 01 10:01:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v533: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 88 KiB/s rd, 76 KiB/s wr, 182 op/s
Feb 01 10:01:26 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "snap_name": "e6f47d41-176f-47be-9b1d-71861fa50733", "format": "json"}]: dispatch
Feb 01 10:01:26 np0005604215.localdomain ceph-mon[298604]: osdmap e258: 6 total, 6 up, 6 in
Feb 01 10:01:26 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/4212286376' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:26 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/4212286376' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:26 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "format": "json"}]: dispatch
Feb 01 10:01:26 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ebeb9c1e-187e-4fbb-8711-dc250e4ab635", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:26 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "90d0a515-163b-4cd0-9158-f05911007a1a", "format": "json"}]: dispatch
Feb 01 10:01:26 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "90d0a515-163b-4cd0-9158-f05911007a1a", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:26 np0005604215.localdomain ceph-mon[298604]: pgmap v533: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 88 KiB/s rd, 76 KiB/s wr, 182 op/s
Feb 01 10:01:26 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e259 e259: 6 total, 6 up, 6 in
Feb 01 10:01:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:26.980 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:27 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "77f0dc61-dc6e-408c-89f4-a2dacf94e1df", "format": "json"}]: dispatch
Feb 01 10:01:27 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:77f0dc61-dc6e-408c-89f4-a2dacf94e1df, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:27 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:77f0dc61-dc6e-408c-89f4-a2dacf94e1df, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:27 np0005604215.localdomain ceph-mon[298604]: osdmap e259: 6 total, 6 up, 6 in
Feb 01 10:01:27 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "77f0dc61-dc6e-408c-89f4-a2dacf94e1df", "format": "json"}]: dispatch
Feb 01 10:01:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:28.264 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v535: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 145 KiB/s rd, 130 KiB/s wr, 253 op/s
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/41d26af5-3d45-417d-aa39-08707e23e8c3/.meta.tmp'
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/41d26af5-3d45-417d-aa39-08707e23e8c3/.meta.tmp' to config b'/volumes/_nogroup/41d26af5-3d45-417d-aa39-08707e23e8c3/.meta'
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "format": "json"}]: dispatch
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:01:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:01:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:01:28 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:01:28 np0005604215.localdomain podman[316293]: 2026-02-01 10:01:28.882774313 +0000 UTC m=+0.087743166 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "format": "json"}]: dispatch
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:28 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:28.892+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ef5904d0-6de5-446a-a091-edb3ad7abb31' of type subvolume
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ef5904d0-6de5-446a-a091-edb3ad7abb31' of type subvolume
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, vol_name:cephfs) < ""
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/ef5904d0-6de5-446a-a091-edb3ad7abb31'' moved to trashcan
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ef5904d0-6de5-446a-a091-edb3ad7abb31, vol_name:cephfs) < ""
Feb 01 10:01:28 np0005604215.localdomain ceph-mon[298604]: pgmap v535: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 145 KiB/s rd, 130 KiB/s wr, 253 op/s
Feb 01 10:01:28 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:28 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/663981012' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:28 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/663981012' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:28 np0005604215.localdomain podman[316292]: 2026-02-01 10:01:28.938058346 +0000 UTC m=+0.146889409 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1769056855, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, vcs-type=git)
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "62834cfd-0e6e-4a9a-8e7a-94535f5d68c6", "format": "json"}]: dispatch
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:62834cfd-0e6e-4a9a-8e7a-94535f5d68c6, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:62834cfd-0e6e-4a9a-8e7a-94535f5d68c6, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:28 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:28.964+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '62834cfd-0e6e-4a9a-8e7a-94535f5d68c6' of type subvolume
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '62834cfd-0e6e-4a9a-8e7a-94535f5d68c6' of type subvolume
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "62834cfd-0e6e-4a9a-8e7a-94535f5d68c6", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:62834cfd-0e6e-4a9a-8e7a-94535f5d68c6, vol_name:cephfs) < ""
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/62834cfd-0e6e-4a9a-8e7a-94535f5d68c6'' moved to trashcan
Feb 01 10:01:28 np0005604215.localdomain podman[316294]: 2026-02-01 10:01:28.989531741 +0000 UTC m=+0.191384227 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.vendor=CentOS, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0)
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:62834cfd-0e6e-4a9a-8e7a-94535f5d68c6, vol_name:cephfs) < ""
Feb 01 10:01:29 np0005604215.localdomain podman[316293]: 2026-02-01 10:01:29.00203993 +0000 UTC m=+0.207008773 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 10:01:29 np0005604215.localdomain podman[316292]: 2026-02-01 10:01:29.011845166 +0000 UTC m=+0.220676229 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1769056855, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, architecture=x86_64, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7)
Feb 01 10:01:29 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:01:29 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:01:29 np0005604215.localdomain podman[316300]: 2026-02-01 10:01:29.087139842 +0000 UTC m=+0.284323582 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 10:01:29 np0005604215.localdomain podman[316300]: 2026-02-01 10:01:29.096090802 +0000 UTC m=+0.293274582 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 10:01:29 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:01:29 np0005604215.localdomain podman[316294]: 2026-02-01 10:01:29.117913841 +0000 UTC m=+0.319766307 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Feb 01 10:01:29 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:01:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "snap_name": "e6f47d41-176f-47be-9b1d-71861fa50733_1bedbdd0-9afe-4342-8012-179fbaf0a969", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:e6f47d41-176f-47be-9b1d-71861fa50733_1bedbdd0-9afe-4342-8012-179fbaf0a969, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/49b736f9-feaf-4b7f-9d80-10ecfc8b132a/.meta.tmp'
Feb 01 10:01:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/49b736f9-feaf-4b7f-9d80-10ecfc8b132a/.meta.tmp' to config b'/volumes/_nogroup/49b736f9-feaf-4b7f-9d80-10ecfc8b132a/.meta'
Feb 01 10:01:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:e6f47d41-176f-47be-9b1d-71861fa50733_1bedbdd0-9afe-4342-8012-179fbaf0a969, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "snap_name": "e6f47d41-176f-47be-9b1d-71861fa50733", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:e6f47d41-176f-47be-9b1d-71861fa50733, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/49b736f9-feaf-4b7f-9d80-10ecfc8b132a/.meta.tmp'
Feb 01 10:01:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/49b736f9-feaf-4b7f-9d80-10ecfc8b132a/.meta.tmp' to config b'/volumes/_nogroup/49b736f9-feaf-4b7f-9d80-10ecfc8b132a/.meta'
Feb 01 10:01:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:e6f47d41-176f-47be-9b1d-71861fa50733, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "format": "json"}]: dispatch
Feb 01 10:01:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "format": "json"}]: dispatch
Feb 01 10:01:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ef5904d0-6de5-446a-a091-edb3ad7abb31", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "62834cfd-0e6e-4a9a-8e7a-94535f5d68c6", "format": "json"}]: dispatch
Feb 01 10:01:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "62834cfd-0e6e-4a9a-8e7a-94535f5d68c6", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e260 e260: 6 total, 6 up, 6 in
Feb 01 10:01:30 np0005604215.localdomain podman[236852]: time="2026-02-01T10:01:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:01:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:01:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:01:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:01:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18316 "" "Go-http-client/1.1"
Feb 01 10:01:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e260 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v537: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 69 KiB/s wr, 107 op/s
Feb 01 10:01:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "28ed57d1-d4dd-4eee-89e7-1676bfca130b", "format": "json"}]: dispatch
Feb 01 10:01:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:28ed57d1-d4dd-4eee-89e7-1676bfca130b, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:30 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:28ed57d1-d4dd-4eee-89e7-1676bfca130b, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "snap_name": "e6f47d41-176f-47be-9b1d-71861fa50733_1bedbdd0-9afe-4342-8012-179fbaf0a969", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "snap_name": "e6f47d41-176f-47be-9b1d-71861fa50733", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:30 np0005604215.localdomain ceph-mon[298604]: osdmap e260: 6 total, 6 up, 6 in
Feb 01 10:01:30 np0005604215.localdomain ceph-mon[298604]: pgmap v537: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 69 KiB/s wr, 107 op/s
Feb 01 10:01:30 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/461966862' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:30 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/461966862' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:01:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:01:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:01:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:01:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:01:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:01:31 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "snap_name": "5e09fab8-143b-4edc-a899-7a58c4eb5f0b", "format": "json"}]: dispatch
Feb 01 10:01:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:5e09fab8-143b-4edc-a899-7a58c4eb5f0b, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:5e09fab8-143b-4edc-a899-7a58c4eb5f0b, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e261 e261: 6 total, 6 up, 6 in
Feb 01 10:01:31 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:31.983 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:32 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "28ed57d1-d4dd-4eee-89e7-1676bfca130b", "format": "json"}]: dispatch
Feb 01 10:01:32 np0005604215.localdomain ceph-mon[298604]: osdmap e261: 6 total, 6 up, 6 in
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "auth_id": "admin", "format": "json"}]: dispatch
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:admin, format:json, prefix:fs subvolume deauthorize, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes ERROR volumes.fs.operations.versions.subvolume_v1] auth ID: admin doesn't exist
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:admin, format:json, prefix:fs subvolume deauthorize, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:01:32 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:32.332+0000 7f93ec23e640 -1 mgr.server reply reply (2) No such file or directory auth ID: admin doesn't exist
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (2) No such file or directory auth ID: admin doesn't exist
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v539: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 69 KiB/s wr, 107 op/s
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "format": "json"}]: dispatch
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:32 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:32.481+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f' of type subvolume
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f' of type subvolume
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f'' moved to trashcan
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f, vol_name:cephfs) < ""
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "format": "json"}]: dispatch
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:32 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:32.727+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '49b736f9-feaf-4b7f-9d80-10ecfc8b132a' of type subvolume
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '49b736f9-feaf-4b7f-9d80-10ecfc8b132a' of type subvolume
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/49b736f9-feaf-4b7f-9d80-10ecfc8b132a'' moved to trashcan
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:49b736f9-feaf-4b7f-9d80-10ecfc8b132a, vol_name:cephfs) < ""
Feb 01 10:01:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e262 e262: 6 total, 6 up, 6 in
Feb 01 10:01:33 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "snap_name": "5e09fab8-143b-4edc-a899-7a58c4eb5f0b", "format": "json"}]: dispatch
Feb 01 10:01:33 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "auth_id": "admin", "format": "json"}]: dispatch
Feb 01 10:01:33 np0005604215.localdomain ceph-mon[298604]: pgmap v539: 177 pgs: 177 active+clean; 197 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 74 KiB/s rd, 69 KiB/s wr, 107 op/s
Feb 01 10:01:33 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1152719004' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:33 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1152719004' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:33 np0005604215.localdomain ceph-mon[298604]: osdmap e262: 6 total, 6 up, 6 in
Feb 01 10:01:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:33.305 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e263 e263: 6 total, 6 up, 6 in
Feb 01 10:01:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "format": "json"}]: dispatch
Feb 01 10:01:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ebb94a5c-b1fe-4a17-9f7d-7893b96d0e3f", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "format": "json"}]: dispatch
Feb 01 10:01:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "49b736f9-feaf-4b7f-9d80-10ecfc8b132a", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:34 np0005604215.localdomain ceph-mon[298604]: osdmap e263: 6 total, 6 up, 6 in
Feb 01 10:01:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "28ed57d1-d4dd-4eee-89e7-1676bfca130b_517eb85e-a959-4dd5-be18-fb3a9c2f1228", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:28ed57d1-d4dd-4eee-89e7-1676bfca130b_517eb85e-a959-4dd5-be18-fb3a9c2f1228, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:28ed57d1-d4dd-4eee-89e7-1676bfca130b_517eb85e-a959-4dd5-be18-fb3a9c2f1228, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "28ed57d1-d4dd-4eee-89e7-1676bfca130b", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:28ed57d1-d4dd-4eee-89e7-1676bfca130b, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:28ed57d1-d4dd-4eee-89e7-1676bfca130b, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v542: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 104 KiB/s rd, 105 KiB/s wr, 148 op/s
Feb 01 10:01:35 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "snap_name": "5e09fab8-143b-4edc-a899-7a58c4eb5f0b_efe3ba74-42c3-4ea0-9c14-8d6107d85424", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:5e09fab8-143b-4edc-a899-7a58c4eb5f0b_efe3ba74-42c3-4ea0-9c14-8d6107d85424, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/41d26af5-3d45-417d-aa39-08707e23e8c3/.meta.tmp'
Feb 01 10:01:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/41d26af5-3d45-417d-aa39-08707e23e8c3/.meta.tmp' to config b'/volumes/_nogroup/41d26af5-3d45-417d-aa39-08707e23e8c3/.meta'
Feb 01 10:01:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:5e09fab8-143b-4edc-a899-7a58c4eb5f0b_efe3ba74-42c3-4ea0-9c14-8d6107d85424, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:35 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "snap_name": "5e09fab8-143b-4edc-a899-7a58c4eb5f0b", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:5e09fab8-143b-4edc-a899-7a58c4eb5f0b, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "28ed57d1-d4dd-4eee-89e7-1676bfca130b_517eb85e-a959-4dd5-be18-fb3a9c2f1228", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "28ed57d1-d4dd-4eee-89e7-1676bfca130b", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:35 np0005604215.localdomain ceph-mon[298604]: pgmap v542: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 104 KiB/s rd, 105 KiB/s wr, 148 op/s
Feb 01 10:01:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/212262338' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/212262338' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/41d26af5-3d45-417d-aa39-08707e23e8c3/.meta.tmp'
Feb 01 10:01:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/41d26af5-3d45-417d-aa39-08707e23e8c3/.meta.tmp' to config b'/volumes/_nogroup/41d26af5-3d45-417d-aa39-08707e23e8c3/.meta'
Feb 01 10:01:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:5e09fab8-143b-4edc-a899-7a58c4eb5f0b, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e263 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:35 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:01:35Z|00249|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 01 10:01:36 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "snap_name": "5e09fab8-143b-4edc-a899-7a58c4eb5f0b_efe3ba74-42c3-4ea0-9c14-8d6107d85424", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:36 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "snap_name": "5e09fab8-143b-4edc-a899-7a58c4eb5f0b", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v543: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 76 KiB/s rd, 77 KiB/s wr, 109 op/s
Feb 01 10:01:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e264 e264: 6 total, 6 up, 6 in
Feb 01 10:01:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a, vol_name:cephfs) < ""
Feb 01 10:01:36 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:36.985 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a/.meta.tmp'
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a/.meta.tmp' to config b'/volumes/_nogroup/87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a/.meta'
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a, vol_name:cephfs) < ""
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a", "format": "json"}]: dispatch
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a, vol_name:cephfs) < ""
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a, vol_name:cephfs) < ""
Feb 01 10:01:37 np0005604215.localdomain ceph-mon[298604]: pgmap v543: 177 pgs: 177 active+clean; 198 MiB data, 1.0 GiB used, 41 GiB / 42 GiB avail; 76 KiB/s rd, 77 KiB/s wr, 109 op/s
Feb 01 10:01:37 np0005604215.localdomain ceph-mon[298604]: osdmap e264: 6 total, 6 up, 6 in
Feb 01 10:01:37 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3606390078' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:37 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3606390078' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:37 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "77f0dc61-dc6e-408c-89f4-a2dacf94e1df_70668b5f-bf0f-414e-9eef-698d029be6a6", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:77f0dc61-dc6e-408c-89f4-a2dacf94e1df_70668b5f-bf0f-414e-9eef-698d029be6a6, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:77f0dc61-dc6e-408c-89f4-a2dacf94e1df_70668b5f-bf0f-414e-9eef-698d029be6a6, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "77f0dc61-dc6e-408c-89f4-a2dacf94e1df", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:77f0dc61-dc6e-408c-89f4-a2dacf94e1df, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:77f0dc61-dc6e-408c-89f4-a2dacf94e1df, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:37 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e265 e265: 6 total, 6 up, 6 in
Feb 01 10:01:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a", "format": "json"}]: dispatch
Feb 01 10:01:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "77f0dc61-dc6e-408c-89f4-a2dacf94e1df_70668b5f-bf0f-414e-9eef-698d029be6a6", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "77f0dc61-dc6e-408c-89f4-a2dacf94e1df", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:38 np0005604215.localdomain ceph-mon[298604]: osdmap e265: 6 total, 6 up, 6 in
Feb 01 10:01:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "format": "json"}]: dispatch
Feb 01 10:01:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:41d26af5-3d45-417d-aa39-08707e23e8c3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:41d26af5-3d45-417d-aa39-08707e23e8c3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:38 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:38.261+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '41d26af5-3d45-417d-aa39-08707e23e8c3' of type subvolume
Feb 01 10:01:38 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '41d26af5-3d45-417d-aa39-08707e23e8c3' of type subvolume
Feb 01 10:01:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/41d26af5-3d45-417d-aa39-08707e23e8c3'' moved to trashcan
Feb 01 10:01:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:41d26af5-3d45-417d-aa39-08707e23e8c3, vol_name:cephfs) < ""
Feb 01 10:01:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:38.335 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v546: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 108 KiB/s wr, 97 op/s
Feb 01 10:01:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e266 e266: 6 total, 6 up, 6 in
Feb 01 10:01:39 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "format": "json"}]: dispatch
Feb 01 10:01:39 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "41d26af5-3d45-417d-aa39-08707e23e8c3", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:39 np0005604215.localdomain ceph-mon[298604]: pgmap v546: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 66 KiB/s rd, 108 KiB/s wr, 97 op/s
Feb 01 10:01:39 np0005604215.localdomain ceph-mon[298604]: osdmap e266: 6 total, 6 up, 6 in
Feb 01 10:01:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:01:39 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/626846641' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:01:39 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/626846641' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:39 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:01:39 np0005604215.localdomain podman[316374]: 2026-02-01 10:01:39.865470465 +0000 UTC m=+0.078225099 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_managed=true)
Feb 01 10:01:39 np0005604215.localdomain podman[316374]: 2026-02-01 10:01:39.900427234 +0000 UTC m=+0.113181868 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 10:01:39 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:01:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e267 e267: 6 total, 6 up, 6 in
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/626846641' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/626846641' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: osdmap e267: 6 total, 6 up, 6 in
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #40. Immutable memtables: 0.
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.162322) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 21] Flushing memtable with next log file: 40
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100162364, "job": 21, "event": "flush_started", "num_memtables": 1, "num_entries": 2621, "num_deletes": 265, "total_data_size": 3790027, "memory_usage": 3851312, "flush_reason": "Manual Compaction"}
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 21] Level-0 flush table #41: started
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e267 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100178329, "cf_name": "default", "job": 21, "event": "table_file_creation", "file_number": 41, "file_size": 2461212, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 25798, "largest_seqno": 28414, "table_properties": {"data_size": 2450128, "index_size": 7013, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 3013, "raw_key_size": 27370, "raw_average_key_size": 22, "raw_value_size": 2426588, "raw_average_value_size": 2018, "num_data_blocks": 299, "num_entries": 1202, "num_filter_entries": 1202, "num_deletions": 265, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939991, "oldest_key_time": 1769939991, "file_creation_time": 1769940100, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 41, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 21] Flush lasted 16055 microseconds, and 6282 cpu microseconds.
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.178375) [db/flush_job.cc:967] [default] [JOB 21] Level-0 flush table #41: 2461212 bytes OK
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.178398) [db/memtable_list.cc:519] [default] Level-0 commit table #41 started
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.180368) [db/memtable_list.cc:722] [default] Level-0 commit table #41: memtable #1 done
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.180394) EVENT_LOG_v1 {"time_micros": 1769940100180387, "job": 21, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.180414) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 21] Try to delete WAL files size 3777511, prev total WAL file size 3777806, number of live WAL files 2.
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000037.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.181366) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132323939' seq:72057594037927935, type:22 .. '7061786F73003132353531' seq:0, type:0; will stop at (end)
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 22] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 21 Base level 0, inputs: [41(2403KB)], [39(20MB)]
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100181404, "job": 22, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [41], "files_L6": [39], "score": -1, "input_data_size": 23916880, "oldest_snapshot_seqno": -1}
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 22] Generated table #42: 13772 keys, 22547164 bytes, temperature: kUnknown
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100295153, "cf_name": "default", "job": 22, "event": "table_file_creation", "file_number": 42, "file_size": 22547164, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 22465632, "index_size": 46039, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 34437, "raw_key_size": 370026, "raw_average_key_size": 26, "raw_value_size": 22228286, "raw_average_value_size": 1614, "num_data_blocks": 1725, "num_entries": 13772, "num_filter_entries": 13772, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769940100, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 42, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.295470) [db/compaction/compaction_job.cc:1663] [default] [JOB 22] Compacted 1@0 + 1@6 files to L6 => 22547164 bytes
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.348346) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 210.1 rd, 198.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.3, 20.5 +0.0 blob) out(21.5 +0.0 blob), read-write-amplify(18.9) write-amplify(9.2) OK, records in: 14314, records dropped: 542 output_compression: NoCompression
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.348378) EVENT_LOG_v1 {"time_micros": 1769940100348365, "job": 22, "event": "compaction_finished", "compaction_time_micros": 113821, "compaction_time_cpu_micros": 53254, "output_level": 6, "num_output_files": 1, "total_output_size": 22547164, "num_input_records": 14314, "num_output_records": 13772, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000041.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100348848, "job": 22, "event": "table_file_deletion", "file_number": 41}
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000039.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940100351895, "job": 22, "event": "table_file_deletion", "file_number": 39}
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.181265) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.352005) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.352012) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.352016) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.352019) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:01:40.352022) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v549: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 90 KiB/s rd, 147 KiB/s wr, 132 op/s
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "8a7eb109-e3b5-473d-add7-9e1ee2a73cfe_b745ea45-6be9-407d-90cc-c523f0225057", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8a7eb109-e3b5-473d-add7-9e1ee2a73cfe_b745ea45-6be9-407d-90cc-c523f0225057, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8a7eb109-e3b5-473d-add7-9e1ee2a73cfe_b745ea45-6be9-407d-90cc-c523f0225057, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "8a7eb109-e3b5-473d-add7-9e1ee2a73cfe", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8a7eb109-e3b5-473d-add7-9e1ee2a73cfe, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8a7eb109-e3b5-473d-add7-9e1ee2a73cfe, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/253343049' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:01:40 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/253343049' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a", "format": "json"}]: dispatch
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:40 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:40.839+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a' of type subvolume
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a' of type subvolume
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a, vol_name:cephfs) < ""
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a'' moved to trashcan
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a, vol_name:cephfs) < ""
Feb 01 10:01:41 np0005604215.localdomain ceph-mon[298604]: pgmap v549: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 90 KiB/s rd, 147 KiB/s wr, 132 op/s
Feb 01 10:01:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "8a7eb109-e3b5-473d-add7-9e1ee2a73cfe_b745ea45-6be9-407d-90cc-c523f0225057", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:41 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/253343049' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:01:41 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/253343049' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:01:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "95f0c0d1-c008-4f85-a891-030f31cdce50", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:95f0c0d1-c008-4f85-a891-030f31cdce50, vol_name:cephfs) < ""
Feb 01 10:01:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/95f0c0d1-c008-4f85-a891-030f31cdce50/.meta.tmp'
Feb 01 10:01:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/95f0c0d1-c008-4f85-a891-030f31cdce50/.meta.tmp' to config b'/volumes/_nogroup/95f0c0d1-c008-4f85-a891-030f31cdce50/.meta'
Feb 01 10:01:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:95f0c0d1-c008-4f85-a891-030f31cdce50, vol_name:cephfs) < ""
Feb 01 10:01:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "95f0c0d1-c008-4f85-a891-030f31cdce50", "format": "json"}]: dispatch
Feb 01 10:01:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:95f0c0d1-c008-4f85-a891-030f31cdce50, vol_name:cephfs) < ""
Feb 01 10:01:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:95f0c0d1-c008-4f85-a891-030f31cdce50, vol_name:cephfs) < ""
Feb 01 10:01:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:01:41.777 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:01:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:01:41.778 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:01:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:01:41.778 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:01:41 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e268 e268: 6 total, 6 up, 6 in
Feb 01 10:01:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:41.988 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:42 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "8a7eb109-e3b5-473d-add7-9e1ee2a73cfe", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:42 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a", "format": "json"}]: dispatch
Feb 01 10:01:42 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "87aeeb6c-6bc9-4b4f-8e85-f9e7d23d4c9a", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:42 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:42 np0005604215.localdomain ceph-mon[298604]: osdmap e268: 6 total, 6 up, 6 in
Feb 01 10:01:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v551: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:01:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:01:42 np0005604215.localdomain podman[316393]: 2026-02-01 10:01:42.865672416 +0000 UTC m=+0.077133255 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 10:01:42 np0005604215.localdomain podman[316393]: 2026-02-01 10:01:42.874864122 +0000 UTC m=+0.086324971 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:01:42 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:01:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e269 e269: 6 total, 6 up, 6 in
Feb 01 10:01:43 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "95f0c0d1-c008-4f85-a891-030f31cdce50", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:43 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "95f0c0d1-c008-4f85-a891-030f31cdce50", "format": "json"}]: dispatch
Feb 01 10:01:43 np0005604215.localdomain ceph-mon[298604]: pgmap v551: 177 pgs: 177 active+clean; 198 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:01:43 np0005604215.localdomain ceph-mon[298604]: osdmap e269: 6 total, 6 up, 6 in
Feb 01 10:01:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:43.368 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "204f122a-bf12-4f0e-934c-5da070005009_162585e0-9e26-4972-8197-83daa42ea5eb", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:204f122a-bf12-4f0e-934c-5da070005009_162585e0-9e26-4972-8197-83daa42ea5eb, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:204f122a-bf12-4f0e-934c-5da070005009_162585e0-9e26-4972-8197-83daa42ea5eb, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "204f122a-bf12-4f0e-934c-5da070005009", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:204f122a-bf12-4f0e-934c-5da070005009, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:204f122a-bf12-4f0e-934c-5da070005009, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v553: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 94 KiB/s wr, 86 op/s
Feb 01 10:01:44 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "204f122a-bf12-4f0e-934c-5da070005009_162585e0-9e26-4972-8197-83daa42ea5eb", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:44 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "204f122a-bf12-4f0e-934c-5da070005009", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:44 np0005604215.localdomain ceph-mon[298604]: pgmap v553: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 61 KiB/s rd, 94 KiB/s wr, 86 op/s
Feb 01 10:01:44 np0005604215.localdomain sudo[316414]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 10:01:44 np0005604215.localdomain sudo[316414]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:01:44 np0005604215.localdomain sudo[316414]: pam_unix(sudo:session): session closed for user root
Feb 01 10:01:44 np0005604215.localdomain sudo[316432]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 10:01:44 np0005604215.localdomain sudo[316432]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:01:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "95f0c0d1-c008-4f85-a891-030f31cdce50", "format": "json"}]: dispatch
Feb 01 10:01:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:95f0c0d1-c008-4f85-a891-030f31cdce50, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:95f0c0d1-c008-4f85-a891-030f31cdce50, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:44 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:44.940+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '95f0c0d1-c008-4f85-a891-030f31cdce50' of type subvolume
Feb 01 10:01:44 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '95f0c0d1-c008-4f85-a891-030f31cdce50' of type subvolume
Feb 01 10:01:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "95f0c0d1-c008-4f85-a891-030f31cdce50", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:95f0c0d1-c008-4f85-a891-030f31cdce50, vol_name:cephfs) < ""
Feb 01 10:01:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/95f0c0d1-c008-4f85-a891-030f31cdce50'' moved to trashcan
Feb 01 10:01:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:95f0c0d1-c008-4f85-a891-030f31cdce50, vol_name:cephfs) < ""
Feb 01 10:01:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e269 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:45 np0005604215.localdomain sudo[316432]: pam_unix(sudo:session): session closed for user root
Feb 01 10:01:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 10:01:45 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:01:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "95f0c0d1-c008-4f85-a891-030f31cdce50", "format": "json"}]: dispatch
Feb 01 10:01:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "95f0c0d1-c008-4f85-a891-030f31cdce50", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 10:01:45 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:01:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 10:01:45 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 2e901022-96b3-45db-8af4-9f332f38248c (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:01:45 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 2e901022-96b3-45db-8af4-9f332f38248c (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:01:45 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 2e901022-96b3-45db-8af4-9f332f38248c (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 10:01:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 10:01:45 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:01:45 np0005604215.localdomain sudo[316481]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 10:01:45 np0005604215.localdomain sudo[316481]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:01:45 np0005604215.localdomain sudo[316481]: pam_unix(sudo:session): session closed for user root
Feb 01 10:01:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v554: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 79 KiB/s wr, 73 op/s
Feb 01 10:01:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:01:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:01:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:01:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:01:46 np0005604215.localdomain ceph-mon[298604]: pgmap v554: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 52 KiB/s rd, 79 KiB/s wr, 73 op/s
Feb 01 10:01:46 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 10:01:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 10:01:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "0463ec36-25b3-41dc-9d07-408b582c340a_f121621c-4b0c-4a5e-93bc-9760f71241c7", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:0463ec36-25b3-41dc-9d07-408b582c340a_f121621c-4b0c-4a5e-93bc-9760f71241c7, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e270 e270: 6 total, 6 up, 6 in
Feb 01 10:01:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:0463ec36-25b3-41dc-9d07-408b582c340a_f121621c-4b0c-4a5e-93bc-9760f71241c7, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "0463ec36-25b3-41dc-9d07-408b582c340a", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:0463ec36-25b3-41dc-9d07-408b582c340a, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:47.045 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp'
Feb 01 10:01:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta.tmp' to config b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e/.meta'
Feb 01 10:01:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:0463ec36-25b3-41dc-9d07-408b582c340a, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:47 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:01:47 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "0463ec36-25b3-41dc-9d07-408b582c340a_f121621c-4b0c-4a5e-93bc-9760f71241c7", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:47 np0005604215.localdomain ceph-mon[298604]: osdmap e270: 6 total, 6 up, 6 in
Feb 01 10:01:47 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "snap_name": "0463ec36-25b3-41dc-9d07-408b582c340a", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:47 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e271 e271: 6 total, 6 up, 6 in
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta.tmp'
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta.tmp' to config b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta'
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "format": "json"}]: dispatch
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v557: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 56 KiB/s rd, 156 KiB/s wr, 85 op/s
Feb 01 10:01:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:48.406 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta.tmp'
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta.tmp' to config b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta'
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "format": "json"}]: dispatch
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:01:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:01:48 np0005604215.localdomain ceph-mon[298604]: osdmap e271: 6 total, 6 up, 6 in
Feb 01 10:01:48 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:48 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "format": "json"}]: dispatch
Feb 01 10:01:48 np0005604215.localdomain ceph-mon[298604]: pgmap v557: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 56 KiB/s rd, 156 KiB/s wr, 85 op/s
Feb 01 10:01:48 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:48 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:01:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:01:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "format": "json"}]: dispatch
Feb 01 10:01:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e272 e272: 6 total, 6 up, 6 in
Feb 01 10:01:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:50.126 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:01:50.125 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 10:01:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:01:50.127 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 10:01:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:01:50.128 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 10:01:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e272 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "format": "json"}]: dispatch
Feb 01 10:01:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:50 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:50.316+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'fde4b04f-3cda-4612-9edf-b8d93a1f6d0e' of type subvolume
Feb 01 10:01:50 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'fde4b04f-3cda-4612-9edf-b8d93a1f6d0e' of type subvolume
Feb 01 10:01:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/fde4b04f-3cda-4612-9edf-b8d93a1f6d0e'' moved to trashcan
Feb 01 10:01:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:50 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fde4b04f-3cda-4612-9edf-b8d93a1f6d0e, vol_name:cephfs) < ""
Feb 01 10:01:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v559: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 853 B/s rd, 71 KiB/s wr, 6 op/s
Feb 01 10:01:50 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:01:50.429 259225 INFO neutron.agent.linux.ip_lib [None req-41f8917a-dfb6-4337-850d-d6d5540db9cb - - - - - -] Device tap87d2d119-c6 cannot be used as it has no MAC address
Feb 01 10:01:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:50.454 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:50 np0005604215.localdomain kernel: device tap87d2d119-c6 entered promiscuous mode
Feb 01 10:01:50 np0005604215.localdomain NetworkManager[5972]: <info>  [1769940110.4651] manager: (tap87d2d119-c6): new Generic device (/org/freedesktop/NetworkManager/Devices/45)
Feb 01 10:01:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:50.465 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:50 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:01:50Z|00250|binding|INFO|Claiming lport 87d2d119-c67b-45d9-ae55-273f194d0fcf for this chassis.
Feb 01 10:01:50 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:01:50Z|00251|binding|INFO|87d2d119-c67b-45d9-ae55-273f194d0fcf: Claiming unknown
Feb 01 10:01:50 np0005604215.localdomain systemd-udevd[316509]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 10:01:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:01:50.480 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02ac6c4d149e42a78d91221782aba2a7', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b87d577-bb4d-4fa7-8fd1-8b8b7a56f357, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=87d2d119-c67b-45d9-ae55-273f194d0fcf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 10:01:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:01:50.482 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 87d2d119-c67b-45d9-ae55-273f194d0fcf in datapath fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36 bound to our chassis
Feb 01 10:01:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:01:50.484 158655 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599
Feb 01 10:01:50 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:01:50.485 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[33837964-5288-4976-bed5-d7381d56e207]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 10:01:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap87d2d119-c6: No such device
Feb 01 10:01:50 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:01:50Z|00252|binding|INFO|Setting lport 87d2d119-c67b-45d9-ae55-273f194d0fcf ovn-installed in OVS
Feb 01 10:01:50 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:01:50Z|00253|binding|INFO|Setting lport 87d2d119-c67b-45d9-ae55-273f194d0fcf up in Southbound
Feb 01 10:01:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:50.500 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap87d2d119-c6: No such device
Feb 01 10:01:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap87d2d119-c6: No such device
Feb 01 10:01:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap87d2d119-c6: No such device
Feb 01 10:01:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap87d2d119-c6: No such device
Feb 01 10:01:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap87d2d119-c6: No such device
Feb 01 10:01:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap87d2d119-c6: No such device
Feb 01 10:01:50 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tap87d2d119-c6: No such device
Feb 01 10:01:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:50.540 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:50.574 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:51 np0005604215.localdomain ceph-mon[298604]: osdmap e272: 6 total, 6 up, 6 in
Feb 01 10:01:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "format": "json"}]: dispatch
Feb 01 10:01:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fde4b04f-3cda-4612-9edf-b8d93a1f6d0e", "force": true, "format": "json"}]: dispatch
Feb 01 10:01:51 np0005604215.localdomain ceph-mon[298604]: pgmap v559: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 853 B/s rd, 71 KiB/s wr, 6 op/s
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:01:51 np0005604215.localdomain podman[316580]: 
Feb 01 10:01:51 np0005604215.localdomain podman[316580]: 2026-02-01 10:01:51.492354382 +0000 UTC m=+0.095542229 container create 77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "snap_name": "7c62edf1-e706-4a7c-a38f-d41949f0e0ac", "format": "json"}]: dispatch
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:7c62edf1-e706-4a7c-a38f-d41949f0e0ac, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:01:51 np0005604215.localdomain systemd[1]: Started libpod-conmon-77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0.scope.
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:7c62edf1-e706-4a7c-a38f-d41949f0e0ac, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:01:51 np0005604215.localdomain podman[316580]: 2026-02-01 10:01:51.446790642 +0000 UTC m=+0.049978519 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 10:01:51 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 10:01:51 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2ab4f9ef39bfd1dad63e076d392e8797c8a2c365aac7cee21797b99cee32b651/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 10:01:51 np0005604215.localdomain podman[316580]: 2026-02-01 10:01:51.565327216 +0000 UTC m=+0.168515073 container init 77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:01:51 np0005604215.localdomain podman[316580]: 2026-02-01 10:01:51.573678426 +0000 UTC m=+0.176866273 container start 77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:01:51 np0005604215.localdomain dnsmasq[316598]: started, version 2.85 cachesize 150
Feb 01 10:01:51 np0005604215.localdomain dnsmasq[316598]: DNS service limited to local subnets
Feb 01 10:01:51 np0005604215.localdomain dnsmasq[316598]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 10:01:51 np0005604215.localdomain dnsmasq[316598]: warning: no upstream servers configured
Feb 01 10:01:51 np0005604215.localdomain dnsmasq-dhcp[316598]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 10:01:51 np0005604215.localdomain dnsmasq[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/addn_hosts - 0 addresses
Feb 01 10:01:51 np0005604215.localdomain dnsmasq-dhcp[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/host
Feb 01 10:01:51 np0005604215.localdomain dnsmasq-dhcp[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/opts
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "snap_name": "399a0ea4-3929-4405-9bc1-c3a475bd2a27", "format": "json"}]: dispatch
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:399a0ea4-3929-4405-9bc1-c3a475bd2a27, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:01:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:399a0ea4-3929-4405-9bc1-c3a475bd2a27, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:01:51 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:01:51.805 259225 INFO neutron.agent.dhcp.agent [None req-814cfd38-6d39-486a-9b27-2873a8122081 - - - - - -] DHCP configuration for ports {'b5df0a2f-b375-49b7-b539-001a87c34d16'} is completed
Feb 01 10:01:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e273 e273: 6 total, 6 up, 6 in
Feb 01 10:01:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:52.094 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v561: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 944 B/s rd, 78 KiB/s wr, 7 op/s
Feb 01 10:01:52 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "snap_name": "7c62edf1-e706-4a7c-a38f-d41949f0e0ac", "format": "json"}]: dispatch
Feb 01 10:01:52 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "snap_name": "399a0ea4-3929-4405-9bc1-c3a475bd2a27", "format": "json"}]: dispatch
Feb 01 10:01:52 np0005604215.localdomain ceph-mon[298604]: osdmap e273: 6 total, 6 up, 6 in
Feb 01 10:01:52 np0005604215.localdomain ceph-mon[298604]: pgmap v561: 177 pgs: 177 active+clean; 199 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 944 B/s rd, 78 KiB/s wr, 7 op/s
Feb 01 10:01:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:53.443 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:53 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:01:53.721 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:01:53Z, description=, device_id=f89734fc-059a-400e-996c-2c8f8ad88e03, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322d2ee0>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f00322d26a0>], id=87b6ab84-336f-4b01-a756-e11432a5bed7, ip_allocation=immediate, mac_address=fa:16:3e:18:1c:30, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:01:49Z, description=, dns_domain=, id=fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1492791037-network, port_security_enabled=True, project_id=02ac6c4d149e42a78d91221782aba2a7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25226, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3443, status=ACTIVE, subnets=['fdd813ba-81d9-4f06-b4e4-b6261e5d2046'], tags=[], tenant_id=02ac6c4d149e42a78d91221782aba2a7, updated_at=2026-02-01T10:01:49Z, vlan_transparent=None, network_id=fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, port_security_enabled=False, project_id=02ac6c4d149e42a78d91221782aba2a7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3470, status=DOWN, tags=[], tenant_id=02ac6c4d149e42a78d91221782aba2a7, updated_at=2026-02-01T10:01:53Z on network fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36
Feb 01 10:01:53 np0005604215.localdomain dnsmasq[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/addn_hosts - 1 addresses
Feb 01 10:01:53 np0005604215.localdomain podman[316617]: 2026-02-01 10:01:53.940628145 +0000 UTC m=+0.060993832 container kill 77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Feb 01 10:01:53 np0005604215.localdomain dnsmasq-dhcp[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/host
Feb 01 10:01:53 np0005604215.localdomain dnsmasq-dhcp[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/opts
Feb 01 10:01:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:01:54.185 259225 INFO neutron.agent.dhcp.agent [None req-bcf64f2e-2990-498f-ab5c-5c988f5208b3 - - - - - -] DHCP configuration for ports {'87b6ab84-336f-4b01-a756-e11432a5bed7'} is completed
Feb 01 10:01:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v562: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 478 B/s rd, 72 KiB/s wr, 6 op/s
Feb 01 10:01:54 np0005604215.localdomain ceph-mon[298604]: pgmap v562: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 478 B/s rd, 72 KiB/s wr, 6 op/s
Feb 01 10:01:54 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:01:54.599 259225 INFO neutron.agent.dhcp.agent [-] Trigger reload_allocations for port admin_state_up=True, allowed_address_pairs=[], binding:host_id=, binding:profile=, binding:vif_details=, binding:vif_type=unbound, binding:vnic_type=normal, created_at=2026-02-01T10:01:53Z, description=, device_id=f89734fc-059a-400e-996c-2c8f8ad88e03, device_owner=network:router_interface, dns_assignment=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032308070>], dns_domain=, dns_name=, extra_dhcp_opts=[], fixed_ips=[<neutron.agent.linux.dhcp.DictModel object at 0x7f0032308310>], id=87b6ab84-336f-4b01-a756-e11432a5bed7, ip_allocation=immediate, mac_address=fa:16:3e:18:1c:30, name=, network=admin_state_up=True, availability_zone_hints=[], availability_zones=[], created_at=2026-02-01T10:01:49Z, description=, dns_domain=, id=fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, ipv4_address_scope=None, ipv6_address_scope=None, l2_adjacency=True, mtu=1442, name=tempest-TelemetryAlarmingAPIAdminTest-1492791037-network, port_security_enabled=True, project_id=02ac6c4d149e42a78d91221782aba2a7, provider:network_type=geneve, provider:physical_network=None, provider:segmentation_id=25226, qos_policy_id=None, revision_number=2, router:external=False, shared=False, standard_attr_id=3443, status=ACTIVE, subnets=['fdd813ba-81d9-4f06-b4e4-b6261e5d2046'], tags=[], tenant_id=02ac6c4d149e42a78d91221782aba2a7, updated_at=2026-02-01T10:01:49Z, vlan_transparent=None, network_id=fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, port_security_enabled=False, project_id=02ac6c4d149e42a78d91221782aba2a7, qos_network_policy_id=None, qos_policy_id=None, resource_request=None, revision_number=1, security_groups=[], standard_attr_id=3470, status=DOWN, tags=[], tenant_id=02ac6c4d149e42a78d91221782aba2a7, updated_at=2026-02-01T10:01:53Z on network fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36
Feb 01 10:01:54 np0005604215.localdomain dnsmasq[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/addn_hosts - 1 addresses
Feb 01 10:01:54 np0005604215.localdomain dnsmasq-dhcp[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/host
Feb 01 10:01:54 np0005604215.localdomain podman[316655]: 2026-02-01 10:01:54.811409693 +0000 UTC m=+0.061606861 container kill 77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 10:01:54 np0005604215.localdomain dnsmasq-dhcp[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/opts
Feb 01 10:01:54 np0005604215.localdomain systemd[1]: tmp-crun.VqGlmT.mount: Deactivated successfully.
Feb 01 10:01:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "snap_name": "7c62edf1-e706-4a7c-a38f-d41949f0e0ac", "target_sub_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "format": "json"}]: dispatch
Feb 01 10:01:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:7c62edf1-e706-4a7c-a38f-d41949f0e0ac, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, target_sub_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, vol_name:cephfs) < ""
Feb 01 10:01:55 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:01:55.028 259225 INFO neutron.agent.dhcp.agent [None req-e757e397-3195-4d08-8709-372e72089e4d - - - - - -] DHCP configuration for ports {'87b6ab84-336f-4b01-a756-e11432a5bed7'} is completed
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 273 bytes to config b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta.tmp'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta.tmp' to config b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.clone_index] tracking-id 6737350b-dbf1-446c-8a6f-4f1713a6d13c for path b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 246 bytes to config b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta.tmp'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta.tmp' to config b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:7c62edf1-e706-4a7c-a38f-d41949f0e0ac, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, target_sub_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, vol_name:cephfs) < ""
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "format": "json"}]: dispatch
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:55.105+0000 7f93f0a47640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:55.105+0000 7f93f0a47640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:55.105+0000 7f93f0a47640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:55.105+0000 7f93f0a47640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:55.105+0000 7f93f0a47640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:55.118 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "snap_name": "399a0ea4-3929-4405-9bc1-c3a475bd2a27", "target_sub_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "format": "json"}]: dispatch
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:399a0ea4-3929-4405-9bc1-c3a475bd2a27, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, target_sub_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, vol_name:cephfs) < ""
Feb 01 10:01:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e273 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] cloning to subvolume path: /volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] starting clone: (cephfs, None, 85bf3dc4-239a-4ea6-b907-935513f36b9b)
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 273 bytes to config b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta.tmp'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta.tmp' to config b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.clone_index] tracking-id c629a215-9f83-4c2c-abaf-00ba2a6ad08e for path b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 246 bytes to config b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta.tmp'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta.tmp' to config b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:399a0ea4-3929-4405-9bc1-c3a475bd2a27, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, target_sub_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, vol_name:cephfs) < ""
Feb 01 10:01:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:55.372+0000 7f93f0a47640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:55.372+0000 7f93f0a47640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:55.372+0000 7f93f0a47640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:55.372+0000 7f93f0a47640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:01:55.372+0000 7f93f0a47640 -1 client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: client.0 error registering admin socket command: (17) File exists
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "format": "json"}]: dispatch
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] cloning to subvolume path: /volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508
Feb 01 10:01:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] starting clone: (cephfs, None, 9c1c4137-22b3-4b8a-9eaf-875da7fa2508)
Feb 01 10:01:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:01:55Z|00254|ovn_bfd|INFO|Enabled BFD on interface ovn-2186fb-0
Feb 01 10:01:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:01:55Z|00255|ovn_bfd|INFO|Enabled BFD on interface ovn-e1cc33-0
Feb 01 10:01:55 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:01:55Z|00256|ovn_bfd|INFO|Enabled BFD on interface ovn-45aa31-0
Feb 01 10:01:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:55.476 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:55.496 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:55.500 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:55.508 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:55 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "snap_name": "7c62edf1-e706-4a7c-a38f-d41949f0e0ac", "target_sub_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "format": "json"}]: dispatch
Feb 01 10:01:55 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "format": "json"}]: dispatch
Feb 01 10:01:55 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "snap_name": "399a0ea4-3929-4405-9bc1-c3a475bd2a27", "target_sub_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "format": "json"}]: dispatch
Feb 01 10:01:55 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "format": "json"}]: dispatch
Feb 01 10:01:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:55.583 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:55.592 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:56.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:01:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:56.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 10:01:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:56.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 10:01:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:56.114 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 10:01:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v563: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 58 KiB/s wr, 4 op/s
Feb 01 10:01:56 np0005604215.localdomain ceph-mon[298604]: mgrmap e61: np0005604215.uhhqtv(active, since 12m), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 10:01:56 np0005604215.localdomain ceph-mon[298604]: pgmap v563: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 58 KiB/s wr, 4 op/s
Feb 01 10:01:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:56.572 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:56 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:56.581 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e274 e274: 6 total, 6 up, 6 in
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.096 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.122 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.122 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.123 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.123 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.124 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.209 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:01:57 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/636120249' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.593 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.469s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.777 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.779 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11545MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.780 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:01:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:57.780 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:01:57 np0005604215.localdomain ceph-mon[298604]: osdmap e274: 6 total, 6 up, 6 in
Feb 01 10:01:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1080088364' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:01:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/636120249' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:01:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:58.225 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 10:01:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:58.226 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 10:01:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:58.245 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:01:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v565: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 89 KiB/s wr, 8 op/s
Feb 01 10:01:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:58.492 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:01:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:01:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/196178825' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:01:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:58.706 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.462s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:01:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:58.713 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 10:01:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:58.733 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 10:01:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:58.736 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 10:01:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:58.736 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.956s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:01:58 np0005604215.localdomain ceph-mon[298604]: pgmap v565: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 89 KiB/s wr, 8 op/s
Feb 01 10:01:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/196178825' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:01:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1177841147' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:01:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] Delayed cloning (cephfs, None, 85bf3dc4-239a-4ea6-b907-935513f36b9b) -- by 0 seconds
Feb 01 10:01:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:01:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 277 bytes to config b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta.tmp'
Feb 01 10:01:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta.tmp' to config b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta'
Feb 01 10:01:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:59.737 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:01:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:59.738 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:01:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:59.738 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:01:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:01:59.739 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 10:01:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:01:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:01:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:01:59 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:01:59 np0005604215.localdomain podman[316749]: 2026-02-01 10:01:59.857993603 +0000 UTC m=+0.063814870 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true)
Feb 01 10:01:59 np0005604215.localdomain systemd[1]: tmp-crun.kzZ2Ew.mount: Deactivated successfully.
Feb 01 10:01:59 np0005604215.localdomain podman[316749]: 2026-02-01 10:01:59.915282618 +0000 UTC m=+0.121103925 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:01:59 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:01:59 np0005604215.localdomain podman[316748]: 2026-02-01 10:01:59.916103183 +0000 UTC m=+0.123066326 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 01 10:01:59 np0005604215.localdomain podman[316747]: 2026-02-01 10:01:59.974000158 +0000 UTC m=+0.183348305 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, build-date=2026-01-22T05:09:47Z, org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vendor=Red Hat, Inc., container_name=openstack_network_exporter, release=1769056855, io.openshift.tags=minimal rhel9, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Feb 01 10:01:59 np0005604215.localdomain podman[316755]: 2026-02-01 10:01:59.894710237 +0000 UTC m=+0.088444517 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 10:01:59 np0005604215.localdomain podman[316748]: 2026-02-01 10:01:59.999881985 +0000 UTC m=+0.206845128 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent)
Feb 01 10:02:00 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:02:00 np0005604215.localdomain podman[316747]: 2026-02-01 10:02:00.014745608 +0000 UTC m=+0.224093735 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, distribution-scope=public, version=9.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, architecture=x86_64, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 
'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Feb 01 10:02:00 np0005604215.localdomain podman[236852]: time="2026-02-01T10:02:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:02:00 np0005604215.localdomain podman[316755]: 2026-02-01 10:02:00.027723152 +0000 UTC m=+0.221457412 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 10:02:00 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:02:00 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:02:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:02:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 157180 "" "Go-http-client/1.1"
Feb 01 10:02:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:02:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18812 "" "Go-http-client/1.1"
Feb 01 10:02:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v566: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 605 B/s rd, 84 KiB/s wr, 8 op/s
Feb 01 10:02:00 np0005604215.localdomain ceph-mon[298604]: pgmap v566: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 605 B/s rd, 84 KiB/s wr, 8 op/s
Feb 01 10:02:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:01.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:02:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:02:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:02:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:02:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:02:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:02:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:02:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fe07adb4-ff99-4e52-87fe-1c3a91d3e012", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:fe07adb4-ff99-4e52-87fe-1c3a91d3e012, vol_name:cephfs) < ""
Feb 01 10:02:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:02.097 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v567: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 71 KiB/s wr, 6 op/s
Feb 01 10:02:02 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "fe07adb4-ff99-4e52-87fe-1c3a91d3e012", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:02 np0005604215.localdomain ceph-mon[298604]: pgmap v567: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 71 KiB/s wr, 6 op/s
Feb 01 10:02:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:02:03Z|00257|ovn_bfd|INFO|Disabled BFD on interface ovn-2186fb-0
Feb 01 10:02:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:02:03Z|00258|ovn_bfd|INFO|Disabled BFD on interface ovn-e1cc33-0
Feb 01 10:02:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:02:03Z|00259|ovn_bfd|INFO|Disabled BFD on interface ovn-45aa31-0
Feb 01 10:02:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:03.269 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:03.274 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:03.291 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:03 np0005604215.localdomain dnsmasq[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/addn_hosts - 0 addresses
Feb 01 10:02:03 np0005604215.localdomain dnsmasq-dhcp[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/host
Feb 01 10:02:03 np0005604215.localdomain podman[316847]: 2026-02-01 10:02:03.410965363 +0000 UTC m=+0.059012651 container kill 77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 01 10:02:03 np0005604215.localdomain dnsmasq-dhcp[316598]: read /var/lib/neutron/dhcp/fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36/opts
Feb 01 10:02:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:03.526 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:03.595 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:02:03Z|00260|binding|INFO|Releasing lport 87d2d119-c67b-45d9-ae55-273f194d0fcf from this chassis (sb_readonly=0)
Feb 01 10:02:03 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:02:03Z|00261|binding|INFO|Setting lport 87d2d119-c67b-45d9-ae55-273f194d0fcf down in Southbound
Feb 01 10:02:03 np0005604215.localdomain kernel: device tap87d2d119-c6 left promiscuous mode
Feb 01 10:02:03 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:02:03.604 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '02ac6c4d149e42a78d91221782aba2a7', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b87d577-bb4d-4fa7-8fd1-8b8b7a56f357, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=87d2d119-c67b-45d9-ae55-273f194d0fcf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 10:02:03 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:02:03.606 158655 INFO neutron.agent.ovn.metadata.agent [-] Port 87d2d119-c67b-45d9-ae55-273f194d0fcf in datapath fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36 unbound from our chassis
Feb 01 10:02:03 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:02:03.609 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 10:02:03 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:02:03.610 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[ca189bc7-01e9-4cf8-bf2f-0389fcc85792]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 10:02:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:03.617 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:04.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:02:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v568: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 42 KiB/s wr, 5 op/s
Feb 01 10:02:04 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] Delayed cloning (cephfs, None, 9c1c4137-22b3-4b8a-9eaf-875da7fa2508) -- by 0 seconds
Feb 01 10:02:04 np0005604215.localdomain ceph-mon[298604]: pgmap v568: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 42 KiB/s wr, 5 op/s
Feb 01 10:02:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1742243905' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:02:05 np0005604215.localdomain dnsmasq[316598]: exiting on receipt of SIGTERM
Feb 01 10:02:05 np0005604215.localdomain podman[316885]: 2026-02-01 10:02:05.878089013 +0000 UTC m=+0.058027200 container kill 77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:02:05 np0005604215.localdomain systemd[1]: libpod-77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0.scope: Deactivated successfully.
Feb 01 10:02:05 np0005604215.localdomain podman[316899]: 2026-02-01 10:02:05.946616388 +0000 UTC m=+0.053721395 container died 77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Feb 01 10:02:05 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0-userdata-shm.mount: Deactivated successfully.
Feb 01 10:02:05 np0005604215.localdomain podman[316899]: 2026-02-01 10:02:05.980033429 +0000 UTC m=+0.087138396 container cleanup 77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 10:02:05 np0005604215.localdomain systemd[1]: libpod-conmon-77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0.scope: Deactivated successfully.
Feb 01 10:02:06 np0005604215.localdomain podman[316901]: 2026-02-01 10:02:06.022202864 +0000 UTC m=+0.122744517 container remove 77bd20e1269b5f386b0f73f6b117c1c70f23e235af2a80da658b2e5150aa51a0 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-fdca4e7a-a2ed-4b3f-98b0-3078a16d3a36, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127)
Feb 01 10:02:06 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:02:06.059 259225 INFO neutron.agent.dhcp.agent [None req-3e38382e-78d3-418e-aff8-c11187cd272b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 10:02:06 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:02:06.059 259225 INFO neutron.agent.dhcp.agent [None req-3e38382e-78d3-418e-aff8-c11187cd272b - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 10:02:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:06.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:02:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:06.296 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v569: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 42 KiB/s wr, 5 op/s
Feb 01 10:02:06 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1867099647' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:02:06 np0005604215.localdomain ceph-mon[298604]: pgmap v569: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 716 B/s rd, 42 KiB/s wr, 5 op/s
Feb 01 10:02:06 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-2ab4f9ef39bfd1dad63e076d392e8797c8a2c365aac7cee21797b99cee32b651-merged.mount: Deactivated successfully.
Feb 01 10:02:06 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2dfdca4e7a\x2da2ed\x2d4b3f\x2d98b0\x2d3078a16d3a36.mount: Deactivated successfully.
Feb 01 10:02:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:07.099 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v570: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 715 B/s rd, 48 KiB/s wr, 5 op/s
Feb 01 10:02:08 np0005604215.localdomain ceph-mon[298604]: pgmap v570: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 715 B/s rd, 48 KiB/s wr, 5 op/s
Feb 01 10:02:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:08.553 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] copying data from b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.snap/7c62edf1-e706-4a7c-a38f-d41949f0e0ac/2fc8045e-224c-42d5-9fbe-50f9c8c8434b' to b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/3dbfb658-cf3e-4815-af2c-a2a39448d949'
Feb 01 10:02:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/fe07adb4-ff99-4e52-87fe-1c3a91d3e012/.meta.tmp'
Feb 01 10:02:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/fe07adb4-ff99-4e52-87fe-1c3a91d3e012/.meta.tmp' to config b'/volumes/_nogroup/fe07adb4-ff99-4e52-87fe-1c3a91d3e012/.meta'
Feb 01 10:02:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:fe07adb4-ff99-4e52-87fe-1c3a91d3e012, vol_name:cephfs) < ""
Feb 01 10:02:09 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fe07adb4-ff99-4e52-87fe-1c3a91d3e012", "format": "json"}]: dispatch
Feb 01 10:02:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fe07adb4-ff99-4e52-87fe-1c3a91d3e012, vol_name:cephfs) < ""
Feb 01 10:02:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 277 bytes to config b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta.tmp'
Feb 01 10:02:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta.tmp' to config b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta'
Feb 01 10:02:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 274 bytes to config b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta.tmp'
Feb 01 10:02:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta.tmp' to config b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta'
Feb 01 10:02:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v571: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 25 KiB/s wr, 2 op/s
Feb 01 10:02:10 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "fe07adb4-ff99-4e52-87fe-1c3a91d3e012", "format": "json"}]: dispatch
Feb 01 10:02:10 np0005604215.localdomain ceph-mon[298604]: pgmap v571: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 25 KiB/s wr, 2 op/s
Feb 01 10:02:10 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:02:10 np0005604215.localdomain podman[316927]: 2026-02-01 10:02:10.871322879 +0000 UTC m=+0.085343312 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127)
Feb 01 10:02:10 np0005604215.localdomain podman[316927]: 2026-02-01 10:02:10.885777619 +0000 UTC m=+0.099798052 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:02:10 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:02:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:12.137 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v572: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 25 KiB/s wr, 2 op/s
Feb 01 10:02:12 np0005604215.localdomain ceph-mon[298604]: pgmap v572: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 25 KiB/s wr, 2 op/s
Feb 01 10:02:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:13.597 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:13 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:02:13 np0005604215.localdomain podman[316946]: 2026-02-01 10:02:13.862855611 +0000 UTC m=+0.077471835 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:02:13 np0005604215.localdomain podman[316946]: 2026-02-01 10:02:13.897616404 +0000 UTC m=+0.112232668 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:02:13 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v573: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 853 B/s rd, 37 KiB/s wr, 4 op/s
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:fe07adb4-ff99-4e52-87fe-1c3a91d3e012, vol_name:cephfs) < ""
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] copying data from b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.snap/399a0ea4-3929-4405-9bc1-c3a475bd2a27/1fb3ab05-8c33-4b3d-b12a-80ef113677d2' to b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/427f0da2-51c7-4572-8a2f-14669b0cde52'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.clone_index] untracking 6737350b-dbf1-446c-8a6f-4f1713a6d13c
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta.tmp'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta.tmp' to config b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 151 bytes to config b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta.tmp'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta.tmp' to config b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b/.meta'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] finished clone: (cephfs, None, 85bf3dc4-239a-4ea6-b907-935513f36b9b)
Feb 01 10:02:14 np0005604215.localdomain ceph-mon[298604]: pgmap v573: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 853 B/s rd, 37 KiB/s wr, 4 op/s
Feb 01 10:02:14 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 274 bytes to config b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta.tmp'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta.tmp' to config b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.clone_index] untracking c629a215-9f83-4c2c-abaf-00ba2a6ad08e
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta.tmp'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta.tmp' to config b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 151 bytes to config b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta.tmp'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta.tmp' to config b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508/.meta'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] finished clone: (cephfs, None, 9c1c4137-22b3-4b8a-9eaf-875da7fa2508)
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/b862115e-9a5c-498d-a7c3-95dba802af7f/.meta.tmp'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b862115e-9a5c-498d-a7c3-95dba802af7f/.meta.tmp' to config b'/volumes/_nogroup/b862115e-9a5c-498d-a7c3-95dba802af7f/.meta'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "554e72a1-8e1b-418d-8d4e-df4ac0aff10b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:554e72a1-8e1b-418d-8d4e-df4ac0aff10b, vol_name:cephfs) < ""
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/554e72a1-8e1b-418d-8d4e-df4ac0aff10b/.meta.tmp'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/554e72a1-8e1b-418d-8d4e-df4ac0aff10b/.meta.tmp' to config b'/volumes/_nogroup/554e72a1-8e1b-418d-8d4e-df4ac0aff10b/.meta'
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:554e72a1-8e1b-418d-8d4e-df4ac0aff10b, vol_name:cephfs) < ""
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "format": "json"}]: dispatch
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "554e72a1-8e1b-418d-8d4e-df4ac0aff10b", "format": "json"}]: dispatch
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:554e72a1-8e1b-418d-8d4e-df4ac0aff10b, vol_name:cephfs) < ""
Feb 01 10:02:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:554e72a1-8e1b-418d-8d4e-df4ac0aff10b, vol_name:cephfs) < ""
Feb 01 10:02:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0ca5077e-5800-458a-bcdd-debc86c3a775", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:0ca5077e-5800-458a-bcdd-debc86c3a775, vol_name:cephfs) < ""
Feb 01 10:02:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/0ca5077e-5800-458a-bcdd-debc86c3a775/.meta.tmp'
Feb 01 10:02:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/0ca5077e-5800-458a-bcdd-debc86c3a775/.meta.tmp' to config b'/volumes/_nogroup/0ca5077e-5800-458a-bcdd-debc86c3a775/.meta'
Feb 01 10:02:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:0ca5077e-5800-458a-bcdd-debc86c3a775, vol_name:cephfs) < ""
Feb 01 10:02:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0ca5077e-5800-458a-bcdd-debc86c3a775", "format": "json"}]: dispatch
Feb 01 10:02:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:0ca5077e-5800-458a-bcdd-debc86c3a775, vol_name:cephfs) < ""
Feb 01 10:02:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:0ca5077e-5800-458a-bcdd-debc86c3a775, vol_name:cephfs) < ""
Feb 01 10:02:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "554e72a1-8e1b-418d-8d4e-df4ac0aff10b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "format": "json"}]: dispatch
Feb 01 10:02:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "554e72a1-8e1b-418d-8d4e-df4ac0aff10b", "format": "json"}]: dispatch
Feb 01 10:02:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:02:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0ca5077e-5800-458a-bcdd-debc86c3a775", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:02:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0ca5077e-5800-458a-bcdd-debc86c3a775", "format": "json"}]: dispatch
Feb 01 10:02:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "snap_name": "79f87f1d-b2e3-46b8-8f19-152c3b678c27", "format": "json"}]: dispatch
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:79f87f1d-b2e3-46b8-8f19-152c3b678c27, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:79f87f1d-b2e3-46b8-8f19-152c3b678c27, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v574: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 24 KiB/s wr, 3 op/s
Feb 01 10:02:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "snap_name": "79f87f1d-b2e3-46b8-8f19-152c3b678c27", "format": "json"}]: dispatch
Feb 01 10:02:16 np0005604215.localdomain ceph-mon[298604]: pgmap v574: 177 pgs: 177 active+clean; 200 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 24 KiB/s wr, 3 op/s
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0ca5077e-5800-458a-bcdd-debc86c3a775", "format": "json"}]: dispatch
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:0ca5077e-5800-458a-bcdd-debc86c3a775, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:0ca5077e-5800-458a-bcdd-debc86c3a775, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:16 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:02:16.914+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '0ca5077e-5800-458a-bcdd-debc86c3a775' of type subvolume
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '0ca5077e-5800-458a-bcdd-debc86c3a775' of type subvolume
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0ca5077e-5800-458a-bcdd-debc86c3a775", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:0ca5077e-5800-458a-bcdd-debc86c3a775, vol_name:cephfs) < ""
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/0ca5077e-5800-458a-bcdd-debc86c3a775'' moved to trashcan
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:02:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:0ca5077e-5800-458a-bcdd-debc86c3a775, vol_name:cephfs) < ""
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fe07adb4-ff99-4e52-87fe-1c3a91d3e012", "format": "json"}]: dispatch
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:fe07adb4-ff99-4e52-87fe-1c3a91d3e012, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:17 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:17.168 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:fe07adb4-ff99-4e52-87fe-1c3a91d3e012, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:17 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:02:17.170+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'fe07adb4-ff99-4e52-87fe-1c3a91d3e012' of type subvolume
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'fe07adb4-ff99-4e52-87fe-1c3a91d3e012' of type subvolume
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fe07adb4-ff99-4e52-87fe-1c3a91d3e012", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fe07adb4-ff99-4e52-87fe-1c3a91d3e012, vol_name:cephfs) < ""
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/fe07adb4-ff99-4e52-87fe-1c3a91d3e012'' moved to trashcan
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:fe07adb4-ff99-4e52-87fe-1c3a91d3e012, vol_name:cephfs) < ""
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "554e72a1-8e1b-418d-8d4e-df4ac0aff10b", "format": "json"}]: dispatch
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:554e72a1-8e1b-418d-8d4e-df4ac0aff10b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:554e72a1-8e1b-418d-8d4e-df4ac0aff10b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:17 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:02:17.514+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '554e72a1-8e1b-418d-8d4e-df4ac0aff10b' of type subvolume
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '554e72a1-8e1b-418d-8d4e-df4ac0aff10b' of type subvolume
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "554e72a1-8e1b-418d-8d4e-df4ac0aff10b", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:554e72a1-8e1b-418d-8d4e-df4ac0aff10b, vol_name:cephfs) < ""
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/554e72a1-8e1b-418d-8d4e-df4ac0aff10b'' moved to trashcan
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:02:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:554e72a1-8e1b-418d-8d4e-df4ac0aff10b, vol_name:cephfs) < ""
Feb 01 10:02:17 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0ca5077e-5800-458a-bcdd-debc86c3a775", "format": "json"}]: dispatch
Feb 01 10:02:17 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0ca5077e-5800-458a-bcdd-debc86c3a775", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:17 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "fe07adb4-ff99-4e52-87fe-1c3a91d3e012", "format": "json"}]: dispatch
Feb 01 10:02:17 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "fe07adb4-ff99-4e52-87fe-1c3a91d3e012", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v575: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 79 KiB/s wr, 8 op/s
Feb 01 10:02:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:18.662 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:18 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "554e72a1-8e1b-418d-8d4e-df4ac0aff10b", "format": "json"}]: dispatch
Feb 01 10:02:18 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "554e72a1-8e1b-418d-8d4e-df4ac0aff10b", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:18 np0005604215.localdomain ceph-mon[298604]: pgmap v575: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 79 KiB/s wr, 8 op/s
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "snap_name": "79f87f1d-b2e3-46b8-8f19-152c3b678c27_f112651e-0368-4663-86fe-3d087644550a", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:79f87f1d-b2e3-46b8-8f19-152c3b678c27_f112651e-0368-4663-86fe-3d087644550a, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/b862115e-9a5c-498d-a7c3-95dba802af7f/.meta.tmp'
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b862115e-9a5c-498d-a7c3-95dba802af7f/.meta.tmp' to config b'/volumes/_nogroup/b862115e-9a5c-498d-a7c3-95dba802af7f/.meta'
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:79f87f1d-b2e3-46b8-8f19-152c3b678c27_f112651e-0368-4663-86fe-3d087644550a, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "snap_name": "79f87f1d-b2e3-46b8-8f19-152c3b678c27", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:79f87f1d-b2e3-46b8-8f19-152c3b678c27, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/b862115e-9a5c-498d-a7c3-95dba802af7f/.meta.tmp'
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b862115e-9a5c-498d-a7c3-95dba802af7f/.meta.tmp' to config b'/volumes/_nogroup/b862115e-9a5c-498d-a7c3-95dba802af7f/.meta'
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:79f87f1d-b2e3-46b8-8f19-152c3b678c27, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e274 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7de08017-7d25-4fb7-a96f-e3746cdc7d6f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7de08017-7d25-4fb7-a96f-e3746cdc7d6f, vol_name:cephfs) < ""
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7de08017-7d25-4fb7-a96f-e3746cdc7d6f/.meta.tmp'
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7de08017-7d25-4fb7-a96f-e3746cdc7d6f/.meta.tmp' to config b'/volumes/_nogroup/7de08017-7d25-4fb7-a96f-e3746cdc7d6f/.meta'
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7de08017-7d25-4fb7-a96f-e3746cdc7d6f, vol_name:cephfs) < ""
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7de08017-7d25-4fb7-a96f-e3746cdc7d6f", "format": "json"}]: dispatch
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7de08017-7d25-4fb7-a96f-e3746cdc7d6f, vol_name:cephfs) < ""
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7de08017-7d25-4fb7-a96f-e3746cdc7d6f, vol_name:cephfs) < ""
Feb 01 10:02:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v576: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 68 KiB/s wr, 7 op/s
Feb 01 10:02:20 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_10:02:21
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['volumes', 'manila_metadata', 'backups', '.mgr', 'images', 'vms', 'manila_data']
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 10:02:21 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "snap_name": "79f87f1d-b2e3-46b8-8f19-152c3b678c27_f112651e-0368-4663-86fe-3d087644550a", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:21 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "snap_name": "79f87f1d-b2e3-46b8-8f19-152c3b678c27", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:21 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7de08017-7d25-4fb7-a96f-e3746cdc7d6f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:21 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7de08017-7d25-4fb7-a96f-e3746cdc7d6f", "format": "json"}]: dispatch
Feb 01 10:02:21 np0005604215.localdomain ceph-mon[298604]: pgmap v576: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 68 KiB/s wr, 7 op/s
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 2.7263051367950866e-06 of space, bias 1.0, pg target 0.0005425347222222222 quantized to 32 (current 32)
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.0006395911850921273 of space, bias 4.0, pg target 0.5091145833333334 quantized to 16 (current 16)
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:02:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:02:22 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:22.198 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v577: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 68 KiB/s wr, 7 op/s
Feb 01 10:02:22 np0005604215.localdomain ceph-mon[298604]: pgmap v577: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 682 B/s rd, 68 KiB/s wr, 7 op/s
Feb 01 10:02:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "format": "json"}]: dispatch
Feb 01 10:02:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:b862115e-9a5c-498d-a7c3-95dba802af7f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:b862115e-9a5c-498d-a7c3-95dba802af7f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:23 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:02:23.279+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'b862115e-9a5c-498d-a7c3-95dba802af7f' of type subvolume
Feb 01 10:02:23 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'b862115e-9a5c-498d-a7c3-95dba802af7f' of type subvolume
Feb 01 10:02:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/b862115e-9a5c-498d-a7c3-95dba802af7f'' moved to trashcan
Feb 01 10:02:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:02:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:b862115e-9a5c-498d-a7c3-95dba802af7f, vol_name:cephfs) < ""
Feb 01 10:02:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:23.710 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "format": "json"}]: dispatch
Feb 01 10:02:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b862115e-9a5c-498d-a7c3-95dba802af7f", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7de08017-7d25-4fb7-a96f-e3746cdc7d6f", "format": "json"}]: dispatch
Feb 01 10:02:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:7de08017-7d25-4fb7-a96f-e3746cdc7d6f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:7de08017-7d25-4fb7-a96f-e3746cdc7d6f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:24 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:02:24.008+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7de08017-7d25-4fb7-a96f-e3746cdc7d6f' of type subvolume
Feb 01 10:02:24 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7de08017-7d25-4fb7-a96f-e3746cdc7d6f' of type subvolume
Feb 01 10:02:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7de08017-7d25-4fb7-a96f-e3746cdc7d6f", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7de08017-7d25-4fb7-a96f-e3746cdc7d6f, vol_name:cephfs) < ""
Feb 01 10:02:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/7de08017-7d25-4fb7-a96f-e3746cdc7d6f'' moved to trashcan
Feb 01 10:02:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:02:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7de08017-7d25-4fb7-a96f-e3746cdc7d6f, vol_name:cephfs) < ""
Feb 01 10:02:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v578: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 92 KiB/s wr, 10 op/s
Feb 01 10:02:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e275 e275: 6 total, 6 up, 6 in
Feb 01 10:02:25 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7de08017-7d25-4fb7-a96f-e3746cdc7d6f", "format": "json"}]: dispatch
Feb 01 10:02:25 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7de08017-7d25-4fb7-a96f-e3746cdc7d6f", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:25 np0005604215.localdomain ceph-mon[298604]: pgmap v578: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 92 KiB/s wr, 10 op/s
Feb 01 10:02:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:26 np0005604215.localdomain ceph-mon[298604]: osdmap e275: 6 total, 6 up, 6 in
Feb 01 10:02:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v580: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 96 KiB/s wr, 10 op/s
Feb 01 10:02:27 np0005604215.localdomain ceph-mon[298604]: pgmap v580: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 96 KiB/s wr, 10 op/s
Feb 01 10:02:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:27.238 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v581: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 68 KiB/s wr, 6 op/s
Feb 01 10:02:28 np0005604215.localdomain ceph-mon[298604]: pgmap v581: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 68 KiB/s wr, 6 op/s
Feb 01 10:02:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:28.754 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/.meta.tmp'
Feb 01 10:02:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/.meta.tmp' to config b'/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/.meta'
Feb 01 10:02:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "format": "json"}]: dispatch
Feb 01 10:02:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "format": "json"}]: dispatch
Feb 01 10:02:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:02:30 np0005604215.localdomain podman[236852]: time="2026-02-01T10:02:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:02:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:02:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:02:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:02:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18336 "" "Go-http-client/1.1"
Feb 01 10:02:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e275 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v582: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 68 KiB/s wr, 6 op/s
Feb 01 10:02:30 np0005604215.localdomain ceph-mon[298604]: pgmap v582: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 68 KiB/s wr, 6 op/s
Feb 01 10:02:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:02:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:02:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:02:30 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:02:30 np0005604215.localdomain podman[316969]: 2026-02-01 10:02:30.881260717 +0000 UTC m=+0.093155934 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, vcs-type=git, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.expose-services=, build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, architecture=x86_64, container_name=openstack_network_exporter)
Feb 01 10:02:30 np0005604215.localdomain podman[316969]: 2026-02-01 10:02:30.892330272 +0000 UTC m=+0.104225519 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, architecture=x86_64, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.7, build-date=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, managed_by=edpm_ansible, distribution-scope=public, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, release=1769056855, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 10:02:30 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:02:30 np0005604215.localdomain podman[316970]: 2026-02-01 10:02:30.938547402 +0000 UTC m=+0.145311670 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Feb 01 10:02:30 np0005604215.localdomain podman[316970]: 2026-02-01 10:02:30.947681996 +0000 UTC m=+0.154446274 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Feb 01 10:02:30 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:02:30 np0005604215.localdomain podman[316977]: 2026-02-01 10:02:30.998643885 +0000 UTC m=+0.199942553 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 10:02:31 np0005604215.localdomain podman[316977]: 2026-02-01 10:02:31.006878562 +0000 UTC m=+0.208177240 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 10:02:31 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:02:31 np0005604215.localdomain podman[316971]: 2026-02-01 10:02:31.094859643 +0000 UTC m=+0.299969199 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Feb 01 10:02:31 np0005604215.localdomain podman[316971]: 2026-02-01 10:02:31.185881941 +0000 UTC m=+0.390991427 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Feb 01 10:02:31 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:02:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:02:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:02:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:02:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:02:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:02:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:02:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e276 e276: 6 total, 6 up, 6 in
Feb 01 10:02:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:02:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:02:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:02:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:02:32 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:02:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:02:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:32.282 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:02:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v584: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 49 KiB/s wr, 3 op/s
Feb 01 10:02:32 np0005604215.localdomain ceph-mon[298604]: osdmap e276: 6 total, 6 up, 6 in
Feb 01 10:02:32 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:02:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:02:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:02:32 np0005604215.localdomain ceph-mon[298604]: pgmap v584: 177 pgs: 177 active+clean; 201 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 49 KiB/s wr, 3 op/s
Feb 01 10:02:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "format": "json"}]: dispatch
Feb 01 10:02:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:33.794 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v585: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 326 B/s rd, 72 KiB/s wr, 5 op/s
Feb 01 10:02:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "format": "json"}]: dispatch
Feb 01 10:02:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, vol_name:cephfs) < ""
Feb 01 10:02:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, vol_name:cephfs) < ""
Feb 01 10:02:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "format": "json"}]: dispatch
Feb 01 10:02:34 np0005604215.localdomain ceph-mon[298604]: pgmap v585: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 326 B/s rd, 72 KiB/s wr, 5 op/s
Feb 01 10:02:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "format": "json"}]: dispatch
Feb 01 10:02:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:02:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "format": "json"}]: dispatch
Feb 01 10:02:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "format": "json"}]: dispatch
Feb 01 10:02:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1468791672' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:02:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1468791672' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:02:36 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:02:36Z|00262|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Feb 01 10:02:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v586: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 67 KiB/s wr, 4 op/s
Feb 01 10:02:36 np0005604215.localdomain ceph-mon[298604]: pgmap v586: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 67 KiB/s wr, 4 op/s
Feb 01 10:02:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:37.314 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v587: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 45 KiB/s wr, 3 op/s
Feb 01 10:02:38 np0005604215.localdomain ceph-mon[298604]: pgmap v587: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 45 KiB/s wr, 3 op/s
Feb 01 10:02:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:38.837 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "format": "json"}]: dispatch
Feb 01 10:02:39 np0005604215.localdomain systemd-journald[47940]: Data hash table of /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal has a fill level at 75.0 (53723 of 71630 items, 25165824 file size, 468 bytes per hash table item), suggesting rotation.
Feb 01 10:02:39 np0005604215.localdomain systemd-journald[47940]: /run/log/journal/00836dadc27b01f9fb0a211cca69e688/system.journal: Journal header limits reached or header out-of-date, rotating.
Feb 01 10:02:39 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, vol_name:cephfs) < ""
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, vol_name:cephfs) < ""
Feb 01 10:02:39 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "format": "json"}]: dispatch
Feb 01 10:02:39 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:02:39 np0005604215.localdomain rsyslogd[760]: imjournal: journal files changed, reloading...  [v8.2102.0-111.el9 try https://www.rsyslog.com/e/0 ]
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "format": "json"}]: dispatch
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, vol_name:cephfs) < ""
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/85bf3dc4-239a-4ea6-b907-935513f36b9b'' moved to trashcan
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:85bf3dc4-239a-4ea6-b907-935513f36b9b, vol_name:cephfs) < ""
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:02:39 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:02:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 01 10:02:39 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:02:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e276 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v588: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 45 KiB/s wr, 3 op/s
Feb 01 10:02:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "snap_name": "7c62edf1-e706-4a7c-a38f-d41949f0e0ac_275fdc7f-d005-4a46-b0bc-2dc898355e68", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7c62edf1-e706-4a7c-a38f-d41949f0e0ac_275fdc7f-d005-4a46-b0bc-2dc898355e68, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:02:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta.tmp'
Feb 01 10:02:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta.tmp' to config b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta'
Feb 01 10:02:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7c62edf1-e706-4a7c-a38f-d41949f0e0ac_275fdc7f-d005-4a46-b0bc-2dc898355e68, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:02:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "snap_name": "7c62edf1-e706-4a7c-a38f-d41949f0e0ac", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7c62edf1-e706-4a7c-a38f-d41949f0e0ac, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:02:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "format": "json"}]: dispatch
Feb 01 10:02:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "85bf3dc4-239a-4ea6-b907-935513f36b9b", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:02:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:02:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:02:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:02:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 01 10:02:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:02:40 np0005604215.localdomain ceph-mon[298604]: pgmap v588: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 45 KiB/s wr, 3 op/s
Feb 01 10:02:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta.tmp'
Feb 01 10:02:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta.tmp' to config b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2/.meta'
Feb 01 10:02:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:7c62edf1-e706-4a7c-a38f-d41949f0e0ac, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:02:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "format": "json"}]: dispatch
Feb 01 10:02:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, vol_name:cephfs) < ""
Feb 01 10:02:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/9c1c4137-22b3-4b8a-9eaf-875da7fa2508'' moved to trashcan
Feb 01 10:02:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:02:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:9c1c4137-22b3-4b8a-9eaf-875da7fa2508, vol_name:cephfs) < ""
Feb 01 10:02:41 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:02:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:02:41.778 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:02:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:02:41.779 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:02:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:02:41.779 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:02:41 np0005604215.localdomain systemd[1]: tmp-crun.z1jA5m.mount: Deactivated successfully.
Feb 01 10:02:41 np0005604215.localdomain podman[317055]: 2026-02-01 10:02:41.87620353 +0000 UTC m=+0.094396594 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 10:02:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "snap_name": "7c62edf1-e706-4a7c-a38f-d41949f0e0ac_275fdc7f-d005-4a46-b0bc-2dc898355e68", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "snap_name": "7c62edf1-e706-4a7c-a38f-d41949f0e0ac", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "format": "json"}]: dispatch
Feb 01 10:02:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "9c1c4137-22b3-4b8a-9eaf-875da7fa2508", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:41 np0005604215.localdomain podman[317055]: 2026-02-01 10:02:41.915757042 +0000 UTC m=+0.133950106 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:02:41 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:02:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:02:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:02:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:02:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:02:42 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:02:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:02:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:02:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:42.348 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v589: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 98 B/s rd, 44 KiB/s wr, 3 op/s
Feb 01 10:02:42 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:02:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:02:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:42 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:02:42 np0005604215.localdomain ceph-mon[298604]: pgmap v589: 177 pgs: 177 active+clean; 202 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 98 B/s rd, 44 KiB/s wr, 3 op/s
Feb 01 10:02:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "format": "json"}]: dispatch
Feb 01 10:02:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:43 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:02:43.724+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'd7c65253-5d6c-4617-9070-0a8b5ac1c2b2' of type subvolume
Feb 01 10:02:43 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'd7c65253-5d6c-4617-9070-0a8b5ac1c2b2' of type subvolume
Feb 01 10:02:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:02:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/d7c65253-5d6c-4617-9070-0a8b5ac1c2b2'' moved to trashcan
Feb 01 10:02:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:02:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:d7c65253-5d6c-4617-9070-0a8b5ac1c2b2, vol_name:cephfs) < ""
Feb 01 10:02:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:43.870 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "snap_name": "399a0ea4-3929-4405-9bc1-c3a475bd2a27_86b1b596-8b78-46fd-b8a1-a58d7c899be4", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:399a0ea4-3929-4405-9bc1-c3a475bd2a27_86b1b596-8b78-46fd-b8a1-a58d7c899be4, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:02:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta.tmp'
Feb 01 10:02:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta.tmp' to config b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta'
Feb 01 10:02:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:399a0ea4-3929-4405-9bc1-c3a475bd2a27_86b1b596-8b78-46fd-b8a1-a58d7c899be4, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:02:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "snap_name": "399a0ea4-3929-4405-9bc1-c3a475bd2a27", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:399a0ea4-3929-4405-9bc1-c3a475bd2a27, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:02:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta.tmp'
Feb 01 10:02:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta.tmp' to config b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb/.meta'
Feb 01 10:02:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:399a0ea4-3929-4405-9bc1-c3a475bd2a27, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:02:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v590: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 107 KiB/s wr, 7 op/s
Feb 01 10:02:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e277 e277: 6 total, 6 up, 6 in
Feb 01 10:02:44 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:02:44 np0005604215.localdomain systemd[1]: tmp-crun.fu87bi.mount: Deactivated successfully.
Feb 01 10:02:44 np0005604215.localdomain podman[317074]: 2026-02-01 10:02:44.845020543 +0000 UTC m=+0.067571886 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 10:02:44 np0005604215.localdomain podman[317074]: 2026-02-01 10:02:44.853423765 +0000 UTC m=+0.075975128 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 10:02:44 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e277 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:02:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "format": "json"}]: dispatch
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "d7c65253-5d6c-4617-9070-0a8b5ac1c2b2", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "snap_name": "399a0ea4-3929-4405-9bc1-c3a475bd2a27_86b1b596-8b78-46fd-b8a1-a58d7c899be4", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "snap_name": "399a0ea4-3929-4405-9bc1-c3a475bd2a27", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: pgmap v590: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 107 KiB/s wr, 7 op/s
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: osdmap e277: 6 total, 6 up, 6 in
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:02:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e278 e278: 6 total, 6 up, 6 in
Feb 01 10:02:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:02:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:02:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:02:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:45 np0005604215.localdomain sudo[317098]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 10:02:45 np0005604215.localdomain sudo[317098]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:02:45 np0005604215.localdomain sudo[317098]: pam_unix(sudo:session): session closed for user root
Feb 01 10:02:45 np0005604215.localdomain sudo[317116]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 10:02:45 np0005604215.localdomain sudo[317116]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:02:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v593: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 104 KiB/s wr, 7 op/s
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: osdmap e278: 6 total, 6 up, 6 in
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: pgmap v593: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 104 KiB/s wr, 7 op/s
Feb 01 10:02:46 np0005604215.localdomain sudo[317116]: pam_unix(sudo:session): session closed for user root
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 10:02:46 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev add8c773-a56c-48c6-af63-e78b8a024ad6 (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:02:46 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev add8c773-a56c-48c6-af63-e78b8a024ad6 (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:02:46 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event add8c773-a56c-48c6-af63-e78b8a024ad6 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 10:02:46 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:02:47 np0005604215.localdomain sudo[317165]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 10:02:47 np0005604215.localdomain sudo[317165]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:02:47 np0005604215.localdomain sudo[317165]: pam_unix(sudo:session): session closed for user root
Feb 01 10:02:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:47.385 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:47 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "format": "json"}]: dispatch
Feb 01 10:02:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:0871d823-23d6-4b37-9920-427f6d28d0fb, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:0871d823-23d6-4b37-9920-427f6d28d0fb, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:02:47 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:02:47.444+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '0871d823-23d6-4b37-9920-427f6d28d0fb' of type subvolume
Feb 01 10:02:47 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '0871d823-23d6-4b37-9920-427f6d28d0fb' of type subvolume
Feb 01 10:02:47 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:02:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/0871d823-23d6-4b37-9920-427f6d28d0fb'' moved to trashcan
Feb 01 10:02:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:02:47 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:0871d823-23d6-4b37-9920-427f6d28d0fb, vol_name:cephfs) < ""
Feb 01 10:02:47 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:02:47 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:02:47 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:02:47 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:02:47 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "format": "json"}]: dispatch
Feb 01 10:02:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v594: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 185 KiB/s wr, 15 op/s
Feb 01 10:02:48 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0871d823-23d6-4b37-9920-427f6d28d0fb", "force": true, "format": "json"}]: dispatch
Feb 01 10:02:48 np0005604215.localdomain ceph-mon[298604]: pgmap v594: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 185 KiB/s wr, 15 op/s
Feb 01 10:02:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:02:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:02:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:02:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:02:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice_bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:02:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:02:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:02:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:48.875 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:49 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:02:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:02:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:02:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e278 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v595: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 185 KiB/s wr, 15 op/s
Feb 01 10:02:50 np0005604215.localdomain ceph-mon[298604]: pgmap v595: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.4 KiB/s rd, 185 KiB/s wr, 15 op/s
Feb 01 10:02:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:02:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:02:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:02:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:02:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:02:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:02:51 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 10:02:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 10:02:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e279 e279: 6 total, 6 up, 6 in
Feb 01 10:02:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:02:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:02:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:02:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:02:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:02:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v597: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.0 KiB/s rd, 81 KiB/s wr, 8 op/s
Feb 01 10:02:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:52.432 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: osdmap e279: 6 total, 6 up, 6 in
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:02:52 np0005604215.localdomain ceph-mon[298604]: pgmap v597: 177 pgs: 177 active+clean; 203 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.0 KiB/s rd, 81 KiB/s wr, 8 op/s
Feb 01 10:02:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:53.910 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v598: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 136 KiB/s wr, 12 op/s
Feb 01 10:02:54 np0005604215.localdomain ceph-mon[298604]: pgmap v598: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 136 KiB/s wr, 12 op/s
Feb 01 10:02:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:55.121 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:02:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:02:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:02:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:02:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:02:55 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:02:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice_bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:02:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:02:55 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:55 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:02:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:02:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:02:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:02:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:02:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v599: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 121 KiB/s wr, 11 op/s
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.114 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.115 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.115 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 10:02:57 np0005604215.localdomain ceph-mon[298604]: pgmap v599: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 121 KiB/s wr, 11 op/s
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.206 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.207 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.224 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.224 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.225 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.225 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.226 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.487 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:02:57 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1141303840' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.668 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.867 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.868 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11536MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.869 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.869 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.961 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.962 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 10:02:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:57.992 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 10:02:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:58.009 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 10:02:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:58.009 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 10:02:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:58.047 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 10:02:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:58.104 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AVX2,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AMD_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 10:02:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:58.125 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:02:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3549663207' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:02:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1141303840' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:02:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v600: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 89 KiB/s wr, 7 op/s
Feb 01 10:02:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:02:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:02:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2492180565' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:02:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:58.568 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.442s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:02:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:58.576 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 10:02:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:58.607 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 10:02:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/cad5faf1-ff59-4c07-a06b-c60dd8871573/.meta.tmp'
Feb 01 10:02:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/cad5faf1-ff59-4c07-a06b-c60dd8871573/.meta.tmp' to config b'/volumes/_nogroup/cad5faf1-ff59-4c07-a06b-c60dd8871573/.meta'
Feb 01 10:02:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:58.610 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 10:02:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:02:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:58.611 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.742s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:02:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "format": "json"}]: dispatch
Feb 01 10:02:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:02:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:02:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:02:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:58.952 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:02:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:02:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:02:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 01 10:02:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: pgmap v600: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 89 KiB/s wr, 7 op/s
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2492180565' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2860799475' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "format": "json"}]: dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 01 10:02:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta.tmp'
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta.tmp' to config b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta'
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "format": "json"}]: dispatch
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:02:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:02:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:59.505 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:02:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:59.506 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:02:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:02:59.506 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:03:00 np0005604215.localdomain podman[236852]: time="2026-02-01T10:03:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:03:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:03:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:03:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:03:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18330 "" "Go-http-client/1.1"
Feb 01 10:03:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:00.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:03:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:00.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 10:03:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "format": "json"}]: dispatch
Feb 01 10:03:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:03:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v601: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 89 KiB/s wr, 7 op/s
Feb 01 10:03:01 np0005604215.localdomain ceph-mon[298604]: pgmap v601: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 89 KiB/s wr, 7 op/s
Feb 01 10:03:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:03:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:03:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:03:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:03:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:03:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:03:01 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:03:01.635 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 10:03:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:01.635 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:01 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:03:01.637 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 10:03:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "snap_name": "2868f0e0-7db3-4bfb-b89b-d896cb2f8687", "format": "json"}]: dispatch
Feb 01 10:03:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:2868f0e0-7db3-4bfb-b89b-d896cb2f8687, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:03:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:2868f0e0-7db3-4bfb-b89b-d896cb2f8687, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:03:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:03:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:03:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:03:01 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:03:01 np0005604215.localdomain systemd[1]: tmp-crun.quAkkL.mount: Deactivated successfully.
Feb 01 10:03:01 np0005604215.localdomain podman[317231]: 2026-02-01 10:03:01.884020107 +0000 UTC m=+0.087311670 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Feb 01 10:03:01 np0005604215.localdomain systemd[1]: tmp-crun.SqW7XY.mount: Deactivated successfully.
Feb 01 10:03:01 np0005604215.localdomain podman[317232]: 2026-02-01 10:03:01.929395335 +0000 UTC m=+0.128814678 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.build-date=20260127, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:03:01 np0005604215.localdomain podman[317243]: 2026-02-01 10:03:01.89248108 +0000 UTC m=+0.081141059 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 10:03:01 np0005604215.localdomain podman[317232]: 2026-02-01 10:03:01.958616731 +0000 UTC m=+0.158036094 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 10:03:01 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:03:01 np0005604215.localdomain podman[317231]: 2026-02-01 10:03:01.968728345 +0000 UTC m=+0.172019948 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Feb 01 10:03:01 np0005604215.localdomain podman[317243]: 2026-02-01 10:03:01.976833507 +0000 UTC m=+0.165493486 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 10:03:01 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:03:01 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:03:02 np0005604215.localdomain podman[317230]: 2026-02-01 10:03:02.041910276 +0000 UTC m=+0.246383596 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, architecture=x86_64, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, distribution-scope=public, name=ubi9/ubi-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1769056855, version=9.7, config_id=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 10:03:02 np0005604215.localdomain podman[317230]: 2026-02-01 10:03:02.058656166 +0000 UTC m=+0.263129486 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9/ubi-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., architecture=x86_64, vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, managed_by=edpm_ansible, config_id=openstack_network_exporter, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Feb 01 10:03:02 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:03:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:02.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:03:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:02 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:03:02 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:02 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:03:02 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:03:02 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v602: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 294 B/s rd, 86 KiB/s wr, 7 op/s
Feb 01 10:03:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:02.490 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "snap_name": "4fa60cbb-7815-4e58-abbf-0715923dbf39", "format": "json"}]: dispatch
Feb 01 10:03:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:4fa60cbb-7815-4e58-abbf-0715923dbf39, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:03:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:4fa60cbb-7815-4e58-abbf-0715923dbf39, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:03:02 np0005604215.localdomain systemd[1]: tmp-crun.IVPB8c.mount: Deactivated successfully.
Feb 01 10:03:03 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "snap_name": "2868f0e0-7db3-4bfb-b89b-d896cb2f8687", "format": "json"}]: dispatch
Feb 01 10:03:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.409 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:03:03.414 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:03:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:03.987 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:04 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:04 np0005604215.localdomain ceph-mon[298604]: pgmap v602: 177 pgs: 177 active+clean; 204 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 294 B/s rd, 86 KiB/s wr, 7 op/s
Feb 01 10:03:04 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "snap_name": "4fa60cbb-7815-4e58-abbf-0715923dbf39", "format": "json"}]: dispatch
Feb 01 10:03:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v603: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 131 KiB/s wr, 11 op/s
Feb 01 10:03:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3906417371' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:03:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e279 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "snap_name": "2868f0e0-7db3-4bfb-b89b-d896cb2f8687_a43bf7eb-b41e-4214-bacc-27135e2bb93d", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:2868f0e0-7db3-4bfb-b89b-d896cb2f8687_a43bf7eb-b41e-4214-bacc-27135e2bb93d, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/cad5faf1-ff59-4c07-a06b-c60dd8871573/.meta.tmp'
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/cad5faf1-ff59-4c07-a06b-c60dd8871573/.meta.tmp' to config b'/volumes/_nogroup/cad5faf1-ff59-4c07-a06b-c60dd8871573/.meta'
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:2868f0e0-7db3-4bfb-b89b-d896cb2f8687_a43bf7eb-b41e-4214-bacc-27135e2bb93d, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "snap_name": "2868f0e0-7db3-4bfb-b89b-d896cb2f8687", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:2868f0e0-7db3-4bfb-b89b-d896cb2f8687, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/cad5faf1-ff59-4c07-a06b-c60dd8871573/.meta.tmp'
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/cad5faf1-ff59-4c07-a06b-c60dd8871573/.meta.tmp' to config b'/volumes/_nogroup/cad5faf1-ff59-4c07-a06b-c60dd8871573/.meta'
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:2868f0e0-7db3-4bfb-b89b-d896cb2f8687, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:03:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 01 10:03:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:03:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:06 np0005604215.localdomain ceph-mon[298604]: pgmap v603: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 131 KiB/s wr, 11 op/s
Feb 01 10:03:06 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/477873194' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:03:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "snap_name": "4fa60cbb-7815-4e58-abbf-0715923dbf39", "target_sub_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "format": "json"}]: dispatch
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:4fa60cbb-7815-4e58-abbf-0715923dbf39, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, target_sub_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, vol_name:cephfs) < ""
Feb 01 10:03:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:06.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 273 bytes to config b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta.tmp'
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta.tmp' to config b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta'
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.clone_index] tracking-id 7dfead6c-cfba-4d88-894e-6b3b4ee708c8 for path b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895'
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 246 bytes to config b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta.tmp'
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta.tmp' to config b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta'
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_clone(format:json, prefix:fs subvolume snapshot clone, snap_name:4fa60cbb-7815-4e58-abbf-0715923dbf39, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, target_sub_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, vol_name:cephfs) < ""
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] cloning to subvolume path: /volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] starting clone: (cephfs, None, a68d53cc-1ebe-4c8a-93d3-742bd1afa895)
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "format": "json"}]: dispatch
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v604: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 84 KiB/s wr, 7 op/s
Feb 01 10:03:07 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "snap_name": "2868f0e0-7db3-4bfb-b89b-d896cb2f8687_a43bf7eb-b41e-4214-bacc-27135e2bb93d", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:07 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "snap_name": "2868f0e0-7db3-4bfb-b89b-d896cb2f8687", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:07 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:07 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:07 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot clone", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "snap_name": "4fa60cbb-7815-4e58-abbf-0715923dbf39", "target_sub_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "format": "json"}]: dispatch
Feb 01 10:03:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:07.529 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:08 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "format": "json"}]: dispatch
Feb 01 10:03:08 np0005604215.localdomain ceph-mon[298604]: pgmap v604: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 84 KiB/s wr, 7 op/s
Feb 01 10:03:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v605: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 128 KiB/s wr, 10 op/s
Feb 01 10:03:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:09.031 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] Delayed cloning (cephfs, None, a68d53cc-1ebe-4c8a-93d3-742bd1afa895) -- by 0 seconds
Feb 01 10:03:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 277 bytes to config b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta.tmp'
Feb 01 10:03:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta.tmp' to config b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta'
Feb 01 10:03:09 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:09 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:03:09.640 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 10:03:10 np0005604215.localdomain ceph-mon[298604]: pgmap v605: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 128 KiB/s wr, 10 op/s
Feb 01 10:03:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e280 e280: 6 total, 6 up, 6 in
Feb 01 10:03:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v607: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 121 KiB/s wr, 9 op/s
Feb 01 10:03:11 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:11 np0005604215.localdomain ceph-mon[298604]: osdmap e280: 6 total, 6 up, 6 in
Feb 01 10:03:11 np0005604215.localdomain ceph-mon[298604]: pgmap v607: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 121 KiB/s wr, 9 op/s
Feb 01 10:03:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v608: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 121 KiB/s wr, 9 op/s
Feb 01 10:03:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:12.532 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:12 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:03:12 np0005604215.localdomain podman[317314]: 2026-02-01 10:03:12.875085631 +0000 UTC m=+0.087038912 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ceilometer_agent_compute, tcib_managed=true)
Feb 01 10:03:12 np0005604215.localdomain podman[317314]: 2026-02-01 10:03:12.890911072 +0000 UTC m=+0.102864323 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 10:03:12 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:03:13 np0005604215.localdomain ceph-mon[298604]: pgmap v608: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 121 KiB/s wr, 9 op/s
Feb 01 10:03:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:14.083 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v609: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 69 KiB/s wr, 6 op/s
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] copying data from b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.snap/4fa60cbb-7815-4e58-abbf-0715923dbf39/f0336b65-9c01-40bb-a388-6b61617a489b' to b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/1314091c-97e9-4a58-b3cc-ccc6f0168b91'
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "format": "json"}]: dispatch
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 274 bytes to config b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta.tmp'
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta.tmp' to config b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta'
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.clone_index] untracking 7dfead6c-cfba-4d88-894e-6b3b4ee708c8
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta.tmp'
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta.tmp' to config b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta'
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 151 bytes to config b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta.tmp'
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta.tmp' to config b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895/.meta'
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_cloner] finished clone: (cephfs, None, a68d53cc-1ebe-4c8a-93d3-742bd1afa895)
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "format": "json"}]: dispatch
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:14 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:03:14.639+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'cad5faf1-ff59-4c07-a06b-c60dd8871573' of type subvolume
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'cad5faf1-ff59-4c07-a06b-c60dd8871573' of type subvolume
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/cad5faf1-ff59-4c07-a06b-c60dd8871573'' moved to trashcan
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:cad5faf1-ff59-4c07-a06b-c60dd8871573, vol_name:cephfs) < ""
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:03:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:14 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:03:14 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:14 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:03:14 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:03:14 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e280 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:15 np0005604215.localdomain ceph-mon[298604]: pgmap v609: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 69 KiB/s wr, 6 op/s
Feb 01 10:03:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "format": "json"}]: dispatch
Feb 01 10:03:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:03:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "format": "json"}]: dispatch
Feb 01 10:03:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "cad5faf1-ff59-4c07-a06b-c60dd8871573", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:03:15 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:15 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:15 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:15 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:03:15 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:03:15 np0005604215.localdomain podman[317333]: 2026-02-01 10:03:15.859080365 +0000 UTC m=+0.076234837 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 10:03:15 np0005604215.localdomain podman[317333]: 2026-02-01 10:03:15.867372622 +0000 UTC m=+0.084527114 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 10:03:15 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:03:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:16 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:03:16 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:16 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 01 10:03:16 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:03:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:03:16 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v610: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 69 KiB/s wr, 6 op/s
Feb 01 10:03:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e281 e281: 6 total, 6 up, 6 in
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #43. Immutable memtables: 0.
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.024588) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 23] Flushing memtable with next log file: 43
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197024627, "job": 23, "event": "flush_started", "num_memtables": 1, "num_entries": 1952, "num_deletes": 259, "total_data_size": 3027823, "memory_usage": 3166192, "flush_reason": "Manual Compaction"}
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 23] Level-0 flush table #44: started
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197035141, "cf_name": "default", "job": 23, "event": "table_file_creation", "file_number": 44, "file_size": 1603814, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 28419, "largest_seqno": 30366, "table_properties": {"data_size": 1597125, "index_size": 3518, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2117, "raw_key_size": 18887, "raw_average_key_size": 22, "raw_value_size": 1581988, "raw_average_value_size": 1878, "num_data_blocks": 153, "num_entries": 842, "num_filter_entries": 842, "num_deletions": 259, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769940100, "oldest_key_time": 1769940100, "file_creation_time": 1769940197, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 44, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 23] Flush lasted 10610 microseconds, and 5244 cpu microseconds.
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.035197) [db/flush_job.cc:967] [default] [JOB 23] Level-0 flush table #44: 1603814 bytes OK
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.035220) [db/memtable_list.cc:519] [default] Level-0 commit table #44 started
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.037366) [db/memtable_list.cc:722] [default] Level-0 commit table #44: memtable #1 done
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.037388) EVENT_LOG_v1 {"time_micros": 1769940197037381, "job": 23, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.037408) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 23] Try to delete WAL files size 3018436, prev total WAL file size 3018436, number of live WAL files 2.
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000040.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.038519) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6D6772737461740034303130' seq:72057594037927935, type:22 .. '6D6772737461740034323631' seq:0, type:0; will stop at (end)
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 24] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 23 Base level 0, inputs: [44(1566KB)], [42(21MB)]
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197038568, "job": 24, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [44], "files_L6": [42], "score": -1, "input_data_size": 24150978, "oldest_snapshot_seqno": -1}
Feb 01 10:03:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "696b3ce7-12ee-4387-9824-52b489c80aa5", "format": "json"}]: dispatch
Feb 01 10:03:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:696b3ce7-12ee-4387-9824-52b489c80aa5, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:696b3ce7-12ee-4387-9824-52b489c80aa5, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 24] Generated table #45: 14116 keys, 22384912 bytes, temperature: kUnknown
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197186888, "cf_name": "default", "job": 24, "event": "table_file_creation", "file_number": 45, "file_size": 22384912, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 22303771, "index_size": 44752, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35333, "raw_key_size": 378432, "raw_average_key_size": 26, "raw_value_size": 22063143, "raw_average_value_size": 1562, "num_data_blocks": 1670, "num_entries": 14116, "num_filter_entries": 14116, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769940197, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 45, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.187215) [db/compaction/compaction_job.cc:1663] [default] [JOB 24] Compacted 1@0 + 1@6 files to L6 => 22384912 bytes
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.189089) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 162.7 rd, 150.8 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.5, 21.5 +0.0 blob) out(21.3 +0.0 blob), read-write-amplify(29.0) write-amplify(14.0) OK, records in: 14614, records dropped: 498 output_compression: NoCompression
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.189116) EVENT_LOG_v1 {"time_micros": 1769940197189104, "job": 24, "event": "compaction_finished", "compaction_time_micros": 148411, "compaction_time_cpu_micros": 57141, "output_level": 6, "num_output_files": 1, "total_output_size": 22384912, "num_input_records": 14614, "num_output_records": 14116, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000044.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197189483, "job": 24, "event": "table_file_deletion", "file_number": 44}
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000042.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940197192163, "job": 24, "event": "table_file_deletion", "file_number": 42}
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.038415) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.192275) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.192281) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.192310) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.192315) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:17 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:17.192319) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:17 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:17.536 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:18 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:18 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:18 np0005604215.localdomain ceph-mon[298604]: pgmap v610: 177 pgs: 177 active+clean; 205 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 69 KiB/s wr, 6 op/s
Feb 01 10:03:18 np0005604215.localdomain ceph-mon[298604]: osdmap e281: 6 total, 6 up, 6 in
Feb 01 10:03:18 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "696b3ce7-12ee-4387-9824-52b489c80aa5", "format": "json"}]: dispatch
Feb 01 10:03:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v612: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 989 B/s rd, 145 KiB/s wr, 11 op/s
Feb 01 10:03:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:19.111 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:19 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:19 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:03:19 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:03:19 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:03:19 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:03:19 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:20 np0005604215.localdomain ceph-mon[298604]: pgmap v612: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 989 B/s rd, 145 KiB/s wr, 11 op/s
Feb 01 10:03:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:03:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:03:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0", "format": "json"}]: dispatch
Feb 01 10:03:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v613: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 120 KiB/s wr, 9 op/s
Feb 01 10:03:21 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_10:03:21
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['volumes', 'vms', '.mgr', 'backups', 'manila_data', 'images', 'manila_metadata']
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 1.9084135957565606e-06 of space, bias 1.0, pg target 0.00037977430555555556 quantized to 32 (current 32)
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.001023727578866555 of space, bias 4.0, pg target 0.8148871527777778 quantized to 16 (current 16)
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:03:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:03:22 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0", "format": "json"}]: dispatch
Feb 01 10:03:22 np0005604215.localdomain ceph-mon[298604]: pgmap v613: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 120 KiB/s wr, 9 op/s
Feb 01 10:03:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v614: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 120 KiB/s wr, 9 op/s
Feb 01 10:03:22 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:22.573 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:03:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:23 np0005604215.localdomain ceph-mon[298604]: pgmap v614: 177 pgs: 177 active+clean; 206 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 818 B/s rd, 120 KiB/s wr, 9 op/s
Feb 01 10:03:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:03:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:03:23 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:03:23 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 01 10:03:23 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:03:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:03:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:03:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:03:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0_d8325836-905a-4ab3-a5ea-7befa865b70c", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0_d8325836-905a-4ab3-a5ea-7befa865b70c, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:24.127 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0_d8325836-905a-4ab3-a5ea-7befa865b70c, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:03:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:03:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:03:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 01 10:03:24 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:03:24 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0_d8325836-905a-4ab3-a5ea-7befa865b70c", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v615: 177 pgs: 177 active+clean; 207 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 132 KiB/s wr, 10 op/s
Feb 01 10:03:25 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "848c4aa1-8ab7-4a67-9f9a-90c43fa4d8b0", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:25 np0005604215.localdomain ceph-mon[298604]: pgmap v615: 177 pgs: 177 active+clean; 207 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 132 KiB/s wr, 10 op/s
Feb 01 10:03:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:03:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:26 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:03:26 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:03:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:03:26 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:03:26 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:03:26 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v616: 177 pgs: 177 active+clean; 207 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 132 KiB/s wr, 10 op/s
Feb 01 10:03:27 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:03:27 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:27 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:27 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:03:27 np0005604215.localdomain ceph-mon[298604]: pgmap v616: 177 pgs: 177 active+clean; 207 MiB data, 1.1 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 132 KiB/s wr, 10 op/s
Feb 01 10:03:27 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "8d37ec79-a49b-4fd8-8bd5-a62773d06fd4", "format": "json"}]: dispatch
Feb 01 10:03:27 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:8d37ec79-a49b-4fd8-8bd5-a62773d06fd4, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:27 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:8d37ec79-a49b-4fd8-8bd5-a62773d06fd4, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:27 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:27.616 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:28 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "8d37ec79-a49b-4fd8-8bd5-a62773d06fd4", "format": "json"}]: dispatch
Feb 01 10:03:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v617: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 627 B/s rd, 194 KiB/s wr, 14 op/s
Feb 01 10:03:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:29.162 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:29 np0005604215.localdomain ceph-mon[298604]: pgmap v617: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 627 B/s rd, 194 KiB/s wr, 14 op/s
Feb 01 10:03:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:03:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:03:29 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:03:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 01 10:03:29 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:03:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:03:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:03:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:03:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:30 np0005604215.localdomain podman[236852]: time="2026-02-01T10:03:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:03:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:03:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:03:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:03:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18330 "" "Go-http-client/1.1"
Feb 01 10:03:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e281 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e282 e282: 6 total, 6 up, 6 in
Feb 01 10:03:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:03:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:03:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:03:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:03:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 01 10:03:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:03:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v619: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 118 KiB/s wr, 9 op/s
Feb 01 10:03:31 np0005604215.localdomain ceph-mon[298604]: osdmap e282: 6 total, 6 up, 6 in
Feb 01 10:03:31 np0005604215.localdomain ceph-mon[298604]: pgmap v619: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 118 KiB/s wr, 9 op/s
Feb 01 10:03:31 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "8d37ec79-a49b-4fd8-8bd5-a62773d06fd4_b5ce561f-cd5d-411f-bf75-a44ec13260f2", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8d37ec79-a49b-4fd8-8bd5-a62773d06fd4_b5ce561f-cd5d-411f-bf75-a44ec13260f2, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8d37ec79-a49b-4fd8-8bd5-a62773d06fd4_b5ce561f-cd5d-411f-bf75-a44ec13260f2, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:31 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "8d37ec79-a49b-4fd8-8bd5-a62773d06fd4", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8d37ec79-a49b-4fd8-8bd5-a62773d06fd4, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:8d37ec79-a49b-4fd8-8bd5-a62773d06fd4, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:03:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:03:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:03:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:03:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:03:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:03:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v620: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 118 KiB/s wr, 9 op/s
Feb 01 10:03:32 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:32.652 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:03:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:03:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:03:32 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:03:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:03:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:03:32 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice_bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:03:32 np0005604215.localdomain podman[317360]: 2026-02-01 10:03:32.893885268 +0000 UTC m=+0.105557447 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:03:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:03:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:32 np0005604215.localdomain podman[317360]: 2026-02-01 10:03:32.927115569 +0000 UTC m=+0.138787738 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 01 10:03:32 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:03:32 np0005604215.localdomain podman[317361]: 2026-02-01 10:03:32.954556321 +0000 UTC m=+0.162707540 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 01 10:03:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:32 np0005604215.localdomain podman[317361]: 2026-02-01 10:03:32.999202756 +0000 UTC m=+0.207353975 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 10:03:33 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:03:33 np0005604215.localdomain podman[317359]: 2026-02-01 10:03:33.034997406 +0000 UTC m=+0.249947527 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, build-date=2026-01-22T05:09:47Z, version=9.7, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, io.openshift.tags=minimal rhel9, config_id=openstack_network_exporter, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9/ubi-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 01 10:03:33 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "8d37ec79-a49b-4fd8-8bd5-a62773d06fd4_b5ce561f-cd5d-411f-bf75-a44ec13260f2", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:33 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "8d37ec79-a49b-4fd8-8bd5-a62773d06fd4", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:03:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:03:33 np0005604215.localdomain podman[317359]: 2026-02-01 10:03:33.046185503 +0000 UTC m=+0.261135644 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, architecture=x86_64, version=9.7, vcs-type=git, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.buildah.version=1.33.7, distribution-scope=public, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.expose-services=, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': 
['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, release=1769056855, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 10:03:33 np0005604215.localdomain podman[317363]: 2026-02-01 10:03:33.003584971 +0000 UTC m=+0.206167387 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 10:03:33 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:03:33 np0005604215.localdomain podman[317363]: 2026-02-01 10:03:33.086629409 +0000 UTC m=+0.289211815 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 10:03:33 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:03:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "format": "json"}]: dispatch
Feb 01 10:03:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:34 np0005604215.localdomain ceph-mon[298604]: pgmap v620: 177 pgs: 177 active+clean; 207 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 409 B/s rd, 118 KiB/s wr, 9 op/s
Feb 01 10:03:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:34.211 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "format": "json"}]: dispatch
Feb 01 10:03:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, vol_name:cephfs) < ""
Feb 01 10:03:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, vol_name:cephfs) < ""
Feb 01 10:03:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v621: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 134 KiB/s wr, 9 op/s
Feb 01 10:03:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:03:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1594616730' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:03:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:03:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1594616730' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:03:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "13f167ab-ebf7-4b96-af44-07e5f74ed867", "format": "json"}]: dispatch
Feb 01 10:03:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:13f167ab-ebf7-4b96-af44-07e5f74ed867, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:34 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:13f167ab-ebf7-4b96-af44-07e5f74ed867, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e283 e283: 6 total, 6 up, 6 in
Feb 01 10:03:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "format": "json"}]: dispatch
Feb 01 10:03:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:03:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1594616730' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:03:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1594616730' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:03:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e283 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:36 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "format": "json"}]: dispatch
Feb 01 10:03:36 np0005604215.localdomain ceph-mon[298604]: pgmap v621: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 134 KiB/s wr, 9 op/s
Feb 01 10:03:36 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "13f167ab-ebf7-4b96-af44-07e5f74ed867", "format": "json"}]: dispatch
Feb 01 10:03:36 np0005604215.localdomain ceph-mon[298604]: osdmap e283: 6 total, 6 up, 6 in
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:03:36 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:03:36 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 01 10:03:36 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v623: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 56 KiB/s wr, 3 op/s
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "637d1a6c-3835-4ba9-9fda-e6c8c27dede1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:637d1a6c-3835-4ba9-9fda-e6c8c27dede1, vol_name:cephfs) < ""
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/637d1a6c-3835-4ba9-9fda-e6c8c27dede1/.meta.tmp'
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/637d1a6c-3835-4ba9-9fda-e6c8c27dede1/.meta.tmp' to config b'/volumes/_nogroup/637d1a6c-3835-4ba9-9fda-e6c8c27dede1/.meta'
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:637d1a6c-3835-4ba9-9fda-e6c8c27dede1, vol_name:cephfs) < ""
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "637d1a6c-3835-4ba9-9fda-e6c8c27dede1", "format": "json"}]: dispatch
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:637d1a6c-3835-4ba9-9fda-e6c8c27dede1, vol_name:cephfs) < ""
Feb 01 10:03:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:637d1a6c-3835-4ba9-9fda-e6c8c27dede1, vol_name:cephfs) < ""
Feb 01 10:03:37 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e284 e284: 6 total, 6 up, 6 in
Feb 01 10:03:37 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:03:37 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:03:37 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:03:37 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 01 10:03:37 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:03:37 np0005604215.localdomain ceph-mon[298604]: osdmap e284: 6 total, 6 up, 6 in
Feb 01 10:03:37 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:37.716 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:03:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/.meta.tmp'
Feb 01 10:03:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/.meta.tmp' to config b'/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/.meta'
Feb 01 10:03:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:03:37 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "format": "json"}]: dispatch
Feb 01 10:03:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:03:37 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:03:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:03:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:03:38 np0005604215.localdomain ceph-mon[298604]: pgmap v623: 177 pgs: 177 active+clean; 208 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 56 KiB/s wr, 3 op/s
Feb 01 10:03:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "637d1a6c-3835-4ba9-9fda-e6c8c27dede1", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "637d1a6c-3835-4ba9-9fda-e6c8c27dede1", "format": "json"}]: dispatch
Feb 01 10:03:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:03:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "13f167ab-ebf7-4b96-af44-07e5f74ed867_8c33dabf-d3cc-43d1-b075-e6959d64286e", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:13f167ab-ebf7-4b96-af44-07e5f74ed867_8c33dabf-d3cc-43d1-b075-e6959d64286e, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:13f167ab-ebf7-4b96-af44-07e5f74ed867_8c33dabf-d3cc-43d1-b075-e6959d64286e, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "13f167ab-ebf7-4b96-af44-07e5f74ed867", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:13f167ab-ebf7-4b96-af44-07e5f74ed867, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:13f167ab-ebf7-4b96-af44-07e5f74ed867, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v625: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 155 KiB/s wr, 11 op/s
Feb 01 10:03:39 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:39 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "format": "json"}]: dispatch
Feb 01 10:03:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:39.262 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:03:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:03:39 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:03:39 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice_bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:03:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:03:39 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:39 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "637d1a6c-3835-4ba9-9fda-e6c8c27dede1", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 01 10:03:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:637d1a6c-3835-4ba9-9fda-e6c8c27dede1, vol_name:cephfs) < ""
Feb 01 10:03:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_resize(format:json, new_size:2147483648, prefix:fs subvolume resize, sub_name:637d1a6c-3835-4ba9-9fda-e6c8c27dede1, vol_name:cephfs) < ""
Feb 01 10:03:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "13f167ab-ebf7-4b96-af44-07e5f74ed867_8c33dabf-d3cc-43d1-b075-e6959d64286e", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "13f167ab-ebf7-4b96-af44-07e5f74ed867", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:40 np0005604215.localdomain ceph-mon[298604]: pgmap v625: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 155 KiB/s wr, 11 op/s
Feb 01 10:03:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:03:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:40 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:03:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e284 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v626: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 155 KiB/s wr, 11 op/s
Feb 01 10:03:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:03:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "637d1a6c-3835-4ba9-9fda-e6c8c27dede1", "new_size": 2147483648, "format": "json"}]: dispatch
Feb 01 10:03:41 np0005604215.localdomain ceph-mon[298604]: pgmap v626: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 155 KiB/s wr, 11 op/s
Feb 01 10:03:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, vol_name:cephfs) < ""
Feb 01 10:03:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/.meta.tmp'
Feb 01 10:03:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/.meta.tmp' to config b'/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/.meta'
Feb 01 10:03:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, vol_name:cephfs) < ""
Feb 01 10:03:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "format": "json"}]: dispatch
Feb 01 10:03:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, vol_name:cephfs) < ""
Feb 01 10:03:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, vol_name:cephfs) < ""
Feb 01 10:03:41 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "6f784fe6-beb9-4d74-808e-938471da4202", "format": "json"}]: dispatch
Feb 01 10:03:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:6f784fe6-beb9-4d74-808e-938471da4202, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:41 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:6f784fe6-beb9-4d74-808e-938471da4202, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:03:41.779 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:03:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:03:41.780 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:03:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:03:41.780 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:03:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e285 e285: 6 total, 6 up, 6 in
Feb 01 10:03:42 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:42 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "format": "json"}]: dispatch
Feb 01 10:03:42 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:03:42 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "6f784fe6-beb9-4d74-808e-938471da4202", "format": "json"}]: dispatch
Feb 01 10:03:42 np0005604215.localdomain ceph-mon[298604]: osdmap e285: 6 total, 6 up, 6 in
Feb 01 10:03:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v628: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 416 B/s rd, 107 KiB/s wr, 8 op/s
Feb 01 10:03:42 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:42.719 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:03:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:03:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:03:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 01 10:03:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:03:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:03:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:03:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:03:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:43 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e286 e286: 6 total, 6 up, 6 in
Feb 01 10:03:43 np0005604215.localdomain ceph-mon[298604]: pgmap v628: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 416 B/s rd, 107 KiB/s wr, 8 op/s
Feb 01 10:03:43 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:03:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:03:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:03:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:03:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 01 10:03:43 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:03:43 np0005604215.localdomain ceph-mon[298604]: osdmap e286: 6 total, 6 up, 6 in
Feb 01 10:03:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "637d1a6c-3835-4ba9-9fda-e6c8c27dede1", "format": "json"}]: dispatch
Feb 01 10:03:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:637d1a6c-3835-4ba9-9fda-e6c8c27dede1, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:637d1a6c-3835-4ba9-9fda-e6c8c27dede1, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:43 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:03:43.262+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '637d1a6c-3835-4ba9-9fda-e6c8c27dede1' of type subvolume
Feb 01 10:03:43 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '637d1a6c-3835-4ba9-9fda-e6c8c27dede1' of type subvolume
Feb 01 10:03:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "637d1a6c-3835-4ba9-9fda-e6c8c27dede1", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:637d1a6c-3835-4ba9-9fda-e6c8c27dede1, vol_name:cephfs) < ""
Feb 01 10:03:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/637d1a6c-3835-4ba9-9fda-e6c8c27dede1'' moved to trashcan
Feb 01 10:03:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:03:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:637d1a6c-3835-4ba9-9fda-e6c8c27dede1, vol_name:cephfs) < ""
Feb 01 10:03:43 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:03:43 np0005604215.localdomain systemd[1]: tmp-crun.QNQy6T.mount: Deactivated successfully.
Feb 01 10:03:43 np0005604215.localdomain podman[317447]: 2026-02-01 10:03:43.874491058 +0000 UTC m=+0.089172248 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 01 10:03:43 np0005604215.localdomain podman[317447]: 2026-02-01 10:03:43.884766296 +0000 UTC m=+0.099447516 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Feb 01 10:03:43 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:03:44 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "637d1a6c-3835-4ba9-9fda-e6c8c27dede1", "format": "json"}]: dispatch
Feb 01 10:03:44 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "637d1a6c-3835-4ba9-9fda-e6c8c27dede1", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:44.305 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v630: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 829 B/s rd, 208 KiB/s wr, 15 op/s
Feb 01 10:03:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:03:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:03:44 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:03:44 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID tempest-cephx-id-2018707573 with tenant 8c7611c3d483414ea2f2b40e93062710
Feb 01 10:03:44 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122", "osd", "allow rw pool=manila_data namespace=fsvolumens_f8deb5d1-795e-4dac-88f0-806d00540ce4", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:03:44 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122", "osd", "allow rw pool=manila_data namespace=fsvolumens_f8deb5d1-795e-4dac-88f0-806d00540ce4", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:44 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:03:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:45 np0005604215.localdomain ceph-mon[298604]: pgmap v630: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 829 B/s rd, 208 KiB/s wr, 15 op/s
Feb 01 10:03:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:45 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:03:45 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122", "osd", "allow rw pool=manila_data namespace=fsvolumens_f8deb5d1-795e-4dac-88f0-806d00540ce4", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:45 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122", "osd", "allow rw pool=manila_data namespace=fsvolumens_f8deb5d1-795e-4dac-88f0-806d00540ce4", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:45 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122", "osd", "allow rw pool=manila_data namespace=fsvolumens_f8deb5d1-795e-4dac-88f0-806d00540ce4", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:03:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "6f784fe6-beb9-4d74-808e-938471da4202_ad0c4f3d-50bd-4842-99d7-23d08be6e9c0", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:6f784fe6-beb9-4d74-808e-938471da4202_ad0c4f3d-50bd-4842-99d7-23d08be6e9c0, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:6f784fe6-beb9-4d74-808e-938471da4202_ad0c4f3d-50bd-4842-99d7-23d08be6e9c0, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "6f784fe6-beb9-4d74-808e-938471da4202", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:6f784fe6-beb9-4d74-808e-938471da4202, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:6f784fe6-beb9-4d74-808e-938471da4202, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:03:46 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:46 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:03:46 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:03:46 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:46 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "6f784fe6-beb9-4d74-808e-938471da4202_ad0c4f3d-50bd-4842-99d7-23d08be6e9c0", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:46 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "6f784fe6-beb9-4d74-808e-938471da4202", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:46 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:03:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v631: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 94 KiB/s wr, 7 op/s
Feb 01 10:03:46 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:03:46 np0005604215.localdomain podman[317467]: 2026-02-01 10:03:46.87168371 +0000 UTC m=+0.087932040 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 10:03:46 np0005604215.localdomain podman[317467]: 2026-02-01 10:03:46.886690545 +0000 UTC m=+0.102938865 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 10:03:46 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:03:47 np0005604215.localdomain sudo[317491]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 10:03:47 np0005604215.localdomain sudo[317491]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:03:47 np0005604215.localdomain sudo[317491]: pam_unix(sudo:session): session closed for user root
Feb 01 10:03:47 np0005604215.localdomain sudo[317509]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 10:03:47 np0005604215.localdomain sudo[317509]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:03:47 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:47.723 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:47 np0005604215.localdomain sudo[317509]: pam_unix(sudo:session): session closed for user root
Feb 01 10:03:47 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 10:03:47 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:03:47 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 10:03:47 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:03:47 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 9c4a265e-2d9e-43c5-81d0-15b176e4af64 (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 9c4a265e-2d9e-43c5-81d0-15b176e4af64 (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 9c4a265e-2d9e-43c5-81d0-15b176e4af64 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 10:03:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 10:03:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mon[298604]: pgmap v631: 177 pgs: 177 active+clean; 209 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 383 B/s rd, 94 KiB/s wr, 7 op/s
Feb 01 10:03:48 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:03:48 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, vol_name:cephfs) < ""
Feb 01 10:03:48 np0005604215.localdomain sudo[317559]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 10:03:48 np0005604215.localdomain sudo[317559]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:03:48 np0005604215.localdomain sudo[317559]: pam_unix(sudo:session): session closed for user root
Feb 01 10:03:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:03:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0)
Feb 01 10:03:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, vol_name:cephfs) < ""
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, vol_name:cephfs) < ""
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-2018707573, client_metadata.root=/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4/99a33b3d-5cc1-4f1d-bed0-5e5402dbd122
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, vol_name:cephfs) < ""
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v632: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 239 KiB/s wr, 16 op/s
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "format": "json"}]: dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:48 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:03:48.446+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f8deb5d1-795e-4dac-88f0-806d00540ce4' of type subvolume
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f8deb5d1-795e-4dac-88f0-806d00540ce4' of type subvolume
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, vol_name:cephfs) < ""
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/f8deb5d1-795e-4dac-88f0-806d00540ce4'' moved to trashcan
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f8deb5d1-795e-4dac-88f0-806d00540ce4, vol_name:cephfs) < ""
Feb 01 10:03:48 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "36e5267c-3b42-4026-8937-3923e0f02444", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:36e5267c-3b42-4026-8937-3923e0f02444, vol_name:cephfs) < ""
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/36e5267c-3b42-4026-8937-3923e0f02444/.meta.tmp'
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/36e5267c-3b42-4026-8937-3923e0f02444/.meta.tmp' to config b'/volumes/_nogroup/36e5267c-3b42-4026-8937-3923e0f02444/.meta'
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:36e5267c-3b42-4026-8937-3923e0f02444, vol_name:cephfs) < ""
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "36e5267c-3b42-4026-8937-3923e0f02444", "format": "json"}]: dispatch
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:36e5267c-3b42-4026-8937-3923e0f02444, vol_name:cephfs) < ""
Feb 01 10:03:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:36e5267c-3b42-4026-8937-3923e0f02444, vol_name:cephfs) < ""
Feb 01 10:03:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:03:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:03:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:03:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 01 10:03:49 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:03:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:49.340 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:49 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:03:49 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:49 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 01 10:03:49 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:03:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:03:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b", "format": "json"}]: dispatch
Feb 01 10:03:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: pgmap v632: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 239 KiB/s wr, 16 op/s
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "format": "json"}]: dispatch
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f8deb5d1-795e-4dac-88f0-806d00540ce4", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "36e5267c-3b42-4026-8937-3923e0f02444", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "36e5267c-3b42-4026-8937-3923e0f02444", "format": "json"}]: dispatch
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e286 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e287 e287: 6 total, 6 up, 6 in
Feb 01 10:03:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v634: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 239 KiB/s wr, 16 op/s
Feb 01 10:03:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:51 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b", "format": "json"}]: dispatch
Feb 01 10:03:51 np0005604215.localdomain ceph-mon[298604]: osdmap e287: 6 total, 6 up, 6 in
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, vol_name:cephfs) < ""
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/.meta.tmp'
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/.meta.tmp' to config b'/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/.meta'
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, vol_name:cephfs) < ""
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "format": "json"}]: dispatch
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, vol_name:cephfs) < ""
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, vol_name:cephfs) < ""
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 10:03:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "36e5267c-3b42-4026-8937-3923e0f02444", "new_size": 1073741824, "no_shrink": true, "format": "json"}]: dispatch
Feb 01 10:03:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_resize(format:json, new_size:1073741824, no_shrink:True, prefix:fs subvolume resize, sub_name:36e5267c-3b42-4026-8937-3923e0f02444, vol_name:cephfs) < ""
Feb 01 10:03:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_resize(format:json, new_size:1073741824, no_shrink:True, prefix:fs subvolume resize, sub_name:36e5267c-3b42-4026-8937-3923e0f02444, vol_name:cephfs) < ""
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e288 e288: 6 total, 6 up, 6 in
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: pgmap v634: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 639 B/s rd, 239 KiB/s wr, 16 op/s
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: osdmap e288: 6 total, 6 up, 6 in
Feb 01 10:03:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v636: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 145 KiB/s wr, 8 op/s
Feb 01 10:03:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:03:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:52 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:03:52 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:52.777 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #46. Immutable memtables: 0.
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:52.851332) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 25] Flushing memtable with next log file: 46
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940232851462, "job": 25, "event": "flush_started", "num_memtables": 1, "num_entries": 1053, "num_deletes": 253, "total_data_size": 1122006, "memory_usage": 1162800, "flush_reason": "Manual Compaction"}
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 25] Level-0 flush table #47: started
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940232860501, "cf_name": "default", "job": 25, "event": "table_file_creation", "file_number": 47, "file_size": 726391, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 30371, "largest_seqno": 31419, "table_properties": {"data_size": 721755, "index_size": 2107, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1477, "raw_key_size": 12343, "raw_average_key_size": 21, "raw_value_size": 711657, "raw_average_value_size": 1235, "num_data_blocks": 92, "num_entries": 576, "num_filter_entries": 576, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769940197, "oldest_key_time": 1769940197, "file_creation_time": 1769940232, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 47, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 25] Flush lasted 9198 microseconds, and 4716 cpu microseconds.
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:52.860566) [db/flush_job.cc:967] [default] [JOB 25] Level-0 flush table #47: 726391 bytes OK
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:52.860600) [db/memtable_list.cc:519] [default] Level-0 commit table #47 started
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:52.862464) [db/memtable_list.cc:722] [default] Level-0 commit table #47: memtable #1 done
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:52.862489) EVENT_LOG_v1 {"time_micros": 1769940232862482, "job": 25, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:52.862515) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 25] Try to delete WAL files size 1116446, prev total WAL file size 1116446, number of live WAL files 2.
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000043.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:52.863820) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132353530' seq:72057594037927935, type:22 .. '7061786F73003132383032' seq:0, type:0; will stop at (end)
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 26] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 25 Base level 0, inputs: [47(709KB)], [45(21MB)]
Feb 01 10:03:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940232863894, "job": 26, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [47], "files_L6": [45], "score": -1, "input_data_size": 23111303, "oldest_snapshot_seqno": -1}
Feb 01 10:03:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 26] Generated table #48: 14164 keys, 21263630 bytes, temperature: kUnknown
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940233016094, "cf_name": "default", "job": 26, "event": "table_file_creation", "file_number": 48, "file_size": 21263630, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 21183088, "index_size": 44025, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 35461, "raw_key_size": 380487, "raw_average_key_size": 26, "raw_value_size": 20942545, "raw_average_value_size": 1478, "num_data_blocks": 1634, "num_entries": 14164, "num_filter_entries": 14164, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769940232, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 48, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:53.016524) [db/compaction/compaction_job.cc:1663] [default] [JOB 26] Compacted 1@0 + 1@6 files to L6 => 21263630 bytes
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:53.018374) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 151.7 rd, 139.6 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(0.7, 21.3 +0.0 blob) out(20.3 +0.0 blob), read-write-amplify(61.1) write-amplify(29.3) OK, records in: 14692, records dropped: 528 output_compression: NoCompression
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:53.018403) EVENT_LOG_v1 {"time_micros": 1769940233018390, "job": 26, "event": "compaction_finished", "compaction_time_micros": 152332, "compaction_time_cpu_micros": 51993, "output_level": 6, "num_output_files": 1, "total_output_size": 21263630, "num_input_records": 14692, "num_output_records": 14164, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000047.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940233018644, "job": 26, "event": "table_file_deletion", "file_number": 47}
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000045.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940233021997, "job": 26, "event": "table_file_deletion", "file_number": 45}
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:52.863637) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:53.022089) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:53.022097) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:53.022100) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:53.022103) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:03:53.022106) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "format": "json"}]: dispatch
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume resize", "vol_name": "cephfs", "sub_name": "36e5267c-3b42-4026-8937-3923e0f02444", "new_size": 1073741824, "no_shrink": true, "format": "json"}]: dispatch
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:53 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:03:54 np0005604215.localdomain ceph-mon[298604]: pgmap v636: 177 pgs: 177 active+clean; 210 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 145 KiB/s wr, 8 op/s
Feb 01 10:03:54 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b_0036dad7-d892-418d-92b8-bb02442ec320", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b_0036dad7-d892-418d-92b8-bb02442ec320, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b_0036dad7-d892-418d-92b8-bb02442ec320, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v637: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 887 B/s rd, 276 KiB/s wr, 18 op/s
Feb 01 10:03:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:54.511 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:03:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:03:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:03:54 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID tempest-cephx-id-2018707573 with tenant 8c7611c3d483414ea2f2b40e93062710
Feb 01 10:03:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f645241-9977-49f9-af5c-e54bd4454730", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:03:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f645241-9977-49f9-af5c-e54bd4454730", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:54 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:03:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e288 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:03:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "36e5267c-3b42-4026-8937-3923e0f02444", "format": "json"}]: dispatch
Feb 01 10:03:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:36e5267c-3b42-4026-8937-3923e0f02444, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:36e5267c-3b42-4026-8937-3923e0f02444, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:03:55.309+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '36e5267c-3b42-4026-8937-3923e0f02444' of type subvolume
Feb 01 10:03:55 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '36e5267c-3b42-4026-8937-3923e0f02444' of type subvolume
Feb 01 10:03:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "36e5267c-3b42-4026-8937-3923e0f02444", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:36e5267c-3b42-4026-8937-3923e0f02444, vol_name:cephfs) < ""
Feb 01 10:03:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/36e5267c-3b42-4026-8937-3923e0f02444'' moved to trashcan
Feb 01 10:03:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:03:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:36e5267c-3b42-4026-8937-3923e0f02444, vol_name:cephfs) < ""
Feb 01 10:03:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:03:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f645241-9977-49f9-af5c-e54bd4454730", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f645241-9977-49f9-af5c-e54bd4454730", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659", "osd", "allow rw pool=manila_data namespace=fsvolumens_1f645241-9977-49f9-af5c-e54bd4454730", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:03:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:56 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b_0036dad7-d892-418d-92b8-bb02442ec320", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:56 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "ebc1e7ef-8442-4193-b15e-8f8ec54f9c3b", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:56 np0005604215.localdomain ceph-mon[298604]: pgmap v637: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 887 B/s rd, 276 KiB/s wr, 18 op/s
Feb 01 10:03:56 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:03:56 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 01 10:03:56 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v638: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 634 B/s rd, 132 KiB/s wr, 9 op/s
Feb 01 10:03:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:03:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:03:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e289 e289: 6 total, 6 up, 6 in
Feb 01 10:03:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:57.095 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:03:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:57.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:03:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:57.099 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 10:03:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:57.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 10:03:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:57.113 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "36e5267c-3b42-4026-8937-3923e0f02444", "format": "json"}]: dispatch
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "36e5267c-3b42-4026-8937-3923e0f02444", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: pgmap v638: 177 pgs: 177 active+clean; 211 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 634 B/s rd, 132 KiB/s wr, 9 op/s
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: osdmap e289: 6 total, 6 up, 6 in
Feb 01 10:03:57 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:57.832 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:57 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:03:57 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, vol_name:cephfs) < ""
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0)
Feb 01 10:03:57 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-2018707573, client_metadata.root=/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730/afda1918-3e18-4669-abdc-aa0ca3b12659
Feb 01 10:03:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e290 e290: 6 total, 6 up, 6 in
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.123 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.123 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.123 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.124 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.124 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1f645241-9977-49f9-af5c-e54bd4454730", "format": "json"}]: dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:1f645241-9977-49f9-af5c-e54bd4454730, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:1f645241-9977-49f9-af5c-e54bd4454730, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:03:58.191+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1f645241-9977-49f9-af5c-e54bd4454730' of type subvolume
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1f645241-9977-49f9-af5c-e54bd4454730' of type subvolume
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/1f645241-9977-49f9-af5c-e54bd4454730'' moved to trashcan
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1f645241-9977-49f9-af5c-e54bd4454730, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a3be12d6-b4dd-425b-b3bb-4918fb2827ad", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v641: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 358 KiB/s wr, 22 op/s
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:a3be12d6-b4dd-425b-b3bb-4918fb2827ad, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 01 10:03:58 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mon[298604]: osdmap e290: 6 total, 6 up, 6 in
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/a3be12d6-b4dd-425b-b3bb-4918fb2827ad/.meta.tmp'
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/a3be12d6-b4dd-425b-b3bb-4918fb2827ad/.meta.tmp' to config b'/volumes/_nogroup/a3be12d6-b4dd-425b-b3bb-4918fb2827ad/.meta'
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:a3be12d6-b4dd-425b-b3bb-4918fb2827ad, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a3be12d6-b4dd-425b-b3bb-4918fb2827ad", "format": "json"}]: dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a3be12d6-b4dd-425b-b3bb-4918fb2827ad, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:a3be12d6-b4dd-425b-b3bb-4918fb2827ad, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:03:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/225147411' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.633 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.509s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "696b3ce7-12ee-4387-9824-52b489c80aa5_20508bbb-3be1-454a-9a1a-2a3a9c3349a7", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:696b3ce7-12ee-4387-9824-52b489c80aa5_20508bbb-3be1-454a-9a1a-2a3a9c3349a7, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.831 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:696b3ce7-12ee-4387-9824-52b489c80aa5_20508bbb-3be1-454a-9a1a-2a3a9c3349a7, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.832 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11500MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.832 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.832 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "696b3ce7-12ee-4387-9824-52b489c80aa5", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:696b3ce7-12ee-4387-9824-52b489c80aa5, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp'
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta.tmp' to config b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf/.meta'
Feb 01 10:03:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:696b3ce7-12ee-4387-9824-52b489c80aa5, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.899 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.899 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 10:03:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:58.921 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:03:59 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4293205967' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:59.379 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:03:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:59.386 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 10:03:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:59.407 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 10:03:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:59.408 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 10:03:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:59.409 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.576s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:03:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:03:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:03:59.558 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1f645241-9977-49f9-af5c-e54bd4454730", "format": "json"}]: dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1f645241-9977-49f9-af5c-e54bd4454730", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "a3be12d6-b4dd-425b-b3bb-4918fb2827ad", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: pgmap v641: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 358 KiB/s wr, 22 op/s
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "a3be12d6-b4dd-425b-b3bb-4918fb2827ad", "format": "json"}]: dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/225147411' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "696b3ce7-12ee-4387-9824-52b489c80aa5_20508bbb-3be1-454a-9a1a-2a3a9c3349a7", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "snap_name": "696b3ce7-12ee-4387-9824-52b489c80aa5", "force": true, "format": "json"}]: dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/4293205967' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:03:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2974073130' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:04:00 np0005604215.localdomain podman[236852]: time="2026-02-01T10:04:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:04:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:04:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:04:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e291 e291: 6 total, 6 up, 6 in
Feb 01 10:04:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:04:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18343 "" "Go-http-client/1.1"
Feb 01 10:04:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v643: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 209 KiB/s wr, 12 op/s
Feb 01 10:04:01 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:01 np0005604215.localdomain ceph-mon[298604]: osdmap e291: 6 total, 6 up, 6 in
Feb 01 10:04:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2954701584' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:04:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, vol_name:cephfs) < ""
Feb 01 10:04:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/.meta.tmp'
Feb 01 10:04:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/.meta.tmp' to config b'/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/.meta'
Feb 01 10:04:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, vol_name:cephfs) < ""
Feb 01 10:04:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "format": "json"}]: dispatch
Feb 01 10:04:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, vol_name:cephfs) < ""
Feb 01 10:04:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, vol_name:cephfs) < ""
Feb 01 10:04:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:01.409 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:04:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:01.409 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:04:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:01.410 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:04:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:04:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:04:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:04:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:04:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "format": "json"}]: dispatch
Feb 01 10:04:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:02 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:04:02.005+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f2ab1bfc-ed8f-45cf-a782-901f372acfbf' of type subvolume
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'f2ab1bfc-ed8f-45cf-a782-901f372acfbf' of type subvolume
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/f2ab1bfc-ed8f-45cf-a782-901f372acfbf'' moved to trashcan
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:f2ab1bfc-ed8f-45cf-a782-901f372acfbf, vol_name:cephfs) < ""
Feb 01 10:04:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:02.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:04:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:02.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 10:04:02 np0005604215.localdomain ceph-mon[298604]: pgmap v643: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 209 KiB/s wr, 12 op/s
Feb 01 10:04:02 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:02 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a3be12d6-b4dd-425b-b3bb-4918fb2827ad", "format": "json"}]: dispatch
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:a3be12d6-b4dd-425b-b3bb-4918fb2827ad, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:a3be12d6-b4dd-425b-b3bb-4918fb2827ad, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:02 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:04:02.510+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'a3be12d6-b4dd-425b-b3bb-4918fb2827ad' of type subvolume
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'a3be12d6-b4dd-425b-b3bb-4918fb2827ad' of type subvolume
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v644: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 211 KiB/s wr, 15 op/s
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a3be12d6-b4dd-425b-b3bb-4918fb2827ad", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a3be12d6-b4dd-425b-b3bb-4918fb2827ad, vol_name:cephfs) < ""
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/a3be12d6-b4dd-425b-b3bb-4918fb2827ad'' moved to trashcan
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a3be12d6-b4dd-425b-b3bb-4918fb2827ad, vol_name:cephfs) < ""
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:02 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:04:02 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:02 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 01 10:04:02 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:02.867 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:03.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:04:03 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "format": "json"}]: dispatch
Feb 01 10:04:03 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "format": "json"}]: dispatch
Feb 01 10:04:03 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "f2ab1bfc-ed8f-45cf-a782-901f372acfbf", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:03 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 01 10:04:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:04:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:04:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:04:03 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:04:03 np0005604215.localdomain podman[317628]: 2026-02-01 10:04:03.887503205 +0000 UTC m=+0.092662396 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Feb 01 10:04:03 np0005604215.localdomain podman[317628]: 2026-02-01 10:04:03.892519011 +0000 UTC m=+0.097678232 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Feb 01 10:04:03 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:04:03 np0005604215.localdomain podman[317629]: 2026-02-01 10:04:03.939702604 +0000 UTC m=+0.142757320 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Feb 01 10:04:04 np0005604215.localdomain podman[317633]: 2026-02-01 10:04:04.039930465 +0000 UTC m=+0.238798591 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 10:04:04 np0005604215.localdomain podman[317629]: 2026-02-01 10:04:04.064965532 +0000 UTC m=+0.268020298 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Feb 01 10:04:04 np0005604215.localdomain podman[317627]: 2026-02-01 10:04:04.016355084 +0000 UTC m=+0.225285353 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1769056855, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.7, io.openshift.expose-services=, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vendor=Red Hat, Inc., managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 10:04:04 np0005604215.localdomain podman[317633]: 2026-02-01 10:04:04.073764724 +0000 UTC m=+0.272632830 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 10:04:04 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:04:04 np0005604215.localdomain podman[317627]: 2026-02-01 10:04:04.101768293 +0000 UTC m=+0.310698552 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, release=1769056855, vendor=Red Hat, Inc., io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, build-date=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, name=ubi9/ubi-minimal, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, architecture=x86_64, container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, org.opencontainers.image.created=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 10:04:04 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:04:04 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a3be12d6-b4dd-425b-b3bb-4918fb2827ad", "format": "json"}]: dispatch
Feb 01 10:04:04 np0005604215.localdomain ceph-mon[298604]: pgmap v644: 177 pgs: 177 active+clean; 213 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 211 KiB/s wr, 15 op/s
Feb 01 10:04:04 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a3be12d6-b4dd-425b-b3bb-4918fb2827ad", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:04 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:04 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:04 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:04:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:04 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:04:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:04:04 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:04 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID tempest-cephx-id-2018707573 with tenant 8c7611c3d483414ea2f2b40e93062710
Feb 01 10:04:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc", "osd", "allow rw pool=manila_data namespace=fsvolumens_4ee9c104-e931-4251-a599-f0ad33e4932d", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:04 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc", "osd", "allow rw pool=manila_data namespace=fsvolumens_4ee9c104-e931-4251-a599-f0ad33e4932d", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:04 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:04:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v645: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 959 B/s rd, 340 KiB/s wr, 21 op/s
Feb 01 10:04:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:04.559 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:05 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc", "osd", "allow rw pool=manila_data namespace=fsvolumens_4ee9c104-e931-4251-a599-f0ad33e4932d", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc", "osd", "allow rw pool=manila_data namespace=fsvolumens_4ee9c104-e931-4251-a599-f0ad33e4932d", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc", "osd", "allow rw pool=manila_data namespace=fsvolumens_4ee9c104-e931-4251-a599-f0ad33e4932d", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:05 np0005604215.localdomain ceph-mon[298604]: pgmap v645: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 959 B/s rd, 340 KiB/s wr, 21 op/s
Feb 01 10:04:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3350087035' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:04:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e291 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "format": "json"}]: dispatch
Feb 01 10:04:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, vol_name:cephfs) < ""
Feb 01 10:04:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/a68d53cc-1ebe-4c8a-93d3-742bd1afa895'' moved to trashcan
Feb 01 10:04:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:04:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:a68d53cc-1ebe-4c8a-93d3-742bd1afa895, vol_name:cephfs) < ""
Feb 01 10:04:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:04:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:04:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:05 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:04:06 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:06 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:06 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:06 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:04:06.437 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 10:04:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:06.438 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:06 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:04:06.439 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 10:04:06 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1664883928' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:04:06 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "format": "json"}]: dispatch
Feb 01 10:04:06 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "a68d53cc-1ebe-4c8a-93d3-742bd1afa895", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:06 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:04:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:06 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v646: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 605 B/s rd, 152 KiB/s wr, 10 op/s
Feb 01 10:04:07 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e292 e292: 6 total, 6 up, 6 in
Feb 01 10:04:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:07.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, vol_name:cephfs) < ""
Feb 01 10:04:07 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:04:07 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:07 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0)
Feb 01 10:04:07 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, vol_name:cephfs) < ""
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, vol_name:cephfs) < ""
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-2018707573, client_metadata.root=/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d/29d14b5f-8066-45c6-8379-ca06b2f0e0bc
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, vol_name:cephfs) < ""
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "format": "json"}]: dispatch
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:4ee9c104-e931-4251-a599-f0ad33e4932d, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:07.913 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:4ee9c104-e931-4251-a599-f0ad33e4932d, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:07 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:04:07.914+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4ee9c104-e931-4251-a599-f0ad33e4932d' of type subvolume
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '4ee9c104-e931-4251-a599-f0ad33e4932d' of type subvolume
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, vol_name:cephfs) < ""
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/4ee9c104-e931-4251-a599-f0ad33e4932d'' moved to trashcan
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:04:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:4ee9c104-e931-4251-a599-f0ad33e4932d, vol_name:cephfs) < ""
Feb 01 10:04:08 np0005604215.localdomain ceph-mon[298604]: pgmap v646: 177 pgs: 177 active+clean; 214 MiB data, 1.2 GiB used, 41 GiB / 42 GiB avail; 605 B/s rd, 152 KiB/s wr, 10 op/s
Feb 01 10:04:08 np0005604215.localdomain ceph-mon[298604]: osdmap e292: 6 total, 6 up, 6 in
Feb 01 10:04:08 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:08 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:08 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:08 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 01 10:04:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v648: 177 pgs: 177 active+clean; 215 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 306 KiB/s wr, 18 op/s
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "snap_name": "4fa60cbb-7815-4e58-abbf-0715923dbf39_c889ce58-0d33-4136-87f7-ad74df7d884b", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:4fa60cbb-7815-4e58-abbf-0715923dbf39_c889ce58-0d33-4136-87f7-ad74df7d884b, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta.tmp'
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta.tmp' to config b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta'
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:4fa60cbb-7815-4e58-abbf-0715923dbf39_c889ce58-0d33-4136-87f7-ad74df7d884b, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "snap_name": "4fa60cbb-7815-4e58-abbf-0715923dbf39", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:4fa60cbb-7815-4e58-abbf-0715923dbf39, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:04:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "format": "json"}]: dispatch
Feb 01 10:04:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "4ee9c104-e931-4251-a599-f0ad33e4932d", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta.tmp'
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta.tmp' to config b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f/.meta'
Feb 01 10:04:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:09.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:4fa60cbb-7815-4e58-abbf-0715923dbf39, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:09 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:04:09 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:09 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 01 10:04:09 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:09 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:09.599 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: pgmap v648: 177 pgs: 177 active+clean; 215 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 306 KiB/s wr, 18 op/s
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "snap_name": "4fa60cbb-7815-4e58-abbf-0715923dbf39_c889ce58-0d33-4136-87f7-ad74df7d884b", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "snap_name": "4fa60cbb-7815-4e58-abbf-0715923dbf39", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e293 e293: 6 total, 6 up, 6 in
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:10 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:04:10.441 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 10:04:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v650: 177 pgs: 177 active+clean; 215 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 321 KiB/s wr, 17 op/s
Feb 01 10:04:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:10 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:10 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID tempest-cephx-id-2018707573 with tenant 8c7611c3d483414ea2f2b40e93062710
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:10 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:04:11 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:11 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:11 np0005604215.localdomain ceph-mon[298604]: osdmap e293: 6 total, 6 up, 6 in
Feb 01 10:04:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:12 np0005604215.localdomain ceph-mon[298604]: pgmap v650: 177 pgs: 177 active+clean; 215 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 321 KiB/s wr, 17 op/s
Feb 01 10:04:12 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "format": "json"}]: dispatch
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'b6cb6a1d-1311-44d1-b899-a95bbef2e51f' of type subvolume
Feb 01 10:04:12 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:04:12.231+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'b6cb6a1d-1311-44d1-b899-a95bbef2e51f' of type subvolume
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/b6cb6a1d-1311-44d1-b899-a95bbef2e51f'' moved to trashcan
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:b6cb6a1d-1311-44d1-b899-a95bbef2e51f, vol_name:cephfs) < ""
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:12 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:04:12 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:12 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice_bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v651: 177 pgs: 177 active+clean; 216 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 236 KiB/s wr, 14 op/s
Feb 01 10:04:12 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:12 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:12 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:12.956 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:13 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "format": "json"}]: dispatch
Feb 01 10:04:13 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "b6cb6a1d-1311-44d1-b899-a95bbef2e51f", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:13 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:13 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:13 np0005604215.localdomain ceph-mon[298604]: pgmap v651: 177 pgs: 177 active+clean; 216 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 236 KiB/s wr, 14 op/s
Feb 01 10:04:13 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:13 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:13 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:14 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 01 10:04:14 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:04:14 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:14 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0)
Feb 01 10:04:14 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v652: 177 pgs: 177 active+clean; 216 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 329 KiB/s wr, 19 op/s
Feb 01 10:04:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-2018707573, client_metadata.root=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934
Feb 01 10:04:14 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:14 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:14 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:14 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 01 10:04:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:14 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:14.626 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:14 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:04:14 np0005604215.localdomain systemd[1]: tmp-crun.Y9ccUe.mount: Deactivated successfully.
Feb 01 10:04:14 np0005604215.localdomain podman[317712]: 2026-02-01 10:04:14.878093176 +0000 UTC m=+0.094810223 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Feb 01 10:04:14 np0005604215.localdomain podman[317712]: 2026-02-01 10:04:14.889340235 +0000 UTC m=+0.106057202 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:04:14 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:04:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e293 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:15 np0005604215.localdomain ceph-mon[298604]: pgmap v652: 177 pgs: 177 active+clean; 216 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.2 KiB/s rd, 329 KiB/s wr, 19 op/s
Feb 01 10:04:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:04:15 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 01 10:04:15 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:04:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:04:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v653: 177 pgs: 177 active+clean; 216 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 278 KiB/s wr, 16 op/s
Feb 01 10:04:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:04:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:04:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 01 10:04:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:17 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e294 e294: 6 total, 6 up, 6 in
Feb 01 10:04:17 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:04:17 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:04:17 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:17 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID tempest-cephx-id-2018707573 with tenant 8c7611c3d483414ea2f2b40e93062710
Feb 01 10:04:17 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:17 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:17 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:04:17 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:04:17 np0005604215.localdomain podman[317734]: 2026-02-01 10:04:17.872679699 +0000 UTC m=+0.085786673 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:04:17 np0005604215.localdomain podman[317734]: 2026-02-01 10:04:17.886593291 +0000 UTC m=+0.099700245 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 10:04:17 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:04:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:17.999 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:18 np0005604215.localdomain ceph-mon[298604]: pgmap v653: 177 pgs: 177 active+clean; 216 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 278 KiB/s wr, 16 op/s
Feb 01 10:04:18 np0005604215.localdomain ceph-mon[298604]: osdmap e294: 6 total, 6 up, 6 in
Feb 01 10:04:18 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:18 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:18 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:18 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v655: 177 pgs: 177 active+clean; 217 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 288 KiB/s wr, 18 op/s
Feb 01 10:04:19 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:04:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:19 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:04:19 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:19 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice_bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:04:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:19 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:19 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:19.676 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:20 np0005604215.localdomain ceph-mon[298604]: pgmap v655: 177 pgs: 177 active+clean; 217 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1.1 KiB/s rd, 288 KiB/s wr, 18 op/s
Feb 01 10:04:20 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:04:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:20 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v656: 177 pgs: 177 active+clean; 217 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 242 KiB/s wr, 15 op/s
Feb 01 10:04:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:20 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:04:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:21 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0)
Feb 01 10:04:21 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-2018707573, client_metadata.root=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934
Feb 01 10:04:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:21 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_10:04:21
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['manila_data', 'backups', 'vms', '.mgr', 'images', 'manila_metadata', 'volumes']
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7f9407318e80>), ('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7f93d084a940>)]
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 1.635783082077052e-06 of space, bias 1.0, pg target 0.0003255208333333333 quantized to 32 (current 32)
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.0017890014307649358 of space, bias 4.0, pg target 1.424045138888889 quantized to 16 (current 16)
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:04:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:04:22 np0005604215.localdomain ceph-mon[298604]: pgmap v656: 177 pgs: 177 active+clean; 217 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 921 B/s rd, 242 KiB/s wr, 15 op/s
Feb 01 10:04:22 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:22 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:22 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:04:22 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:22 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 01 10:04:22 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v657: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 314 KiB/s wr, 17 op/s
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/92c44af7-ac1a-42f2-8baf-64ce97a37c1c/.meta.tmp'
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/92c44af7-ac1a-42f2-8baf-64ce97a37c1c/.meta.tmp' to config b'/volumes/_nogroup/92c44af7-ac1a-42f2-8baf-64ce97a37c1c/.meta'
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "format": "json"}]: dispatch
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:04:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:04:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:23.030 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:23 np0005604215.localdomain ceph-mon[298604]: mgrmap e62: np0005604215.uhhqtv(active, since 15m), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 10:04:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:23 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:04:23 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:23 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:04:23 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 01 10:04:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:23 np0005604215.localdomain ceph-mon[298604]: pgmap v657: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 314 KiB/s wr, 17 op/s
Feb 01 10:04:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "format": "json"}]: dispatch
Feb 01 10:04:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:04:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:04:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:04:24 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:24 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID tempest-cephx-id-2018707573 with tenant 8c7611c3d483414ea2f2b40e93062710
Feb 01 10:04:24 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:24 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:24 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:24 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:04:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v658: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 240 KiB/s wr, 15 op/s
Feb 01 10:04:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:24.679 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:25 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:25 np0005604215.localdomain ceph-mon[298604]: mgrmap e63: np0005604215.uhhqtv(active, since 15m), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 10:04:25 np0005604215.localdomain ceph-mon[298604]: pgmap v658: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 240 KiB/s wr, 15 op/s
Feb 01 10:04:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:04:25 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:04:25 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:04:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:25 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:25 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "snap_name": "bdf5fb57-afe7-4827-9246-942a9223d37d", "format": "json"}]: dispatch
Feb 01 10:04:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:bdf5fb57-afe7-4827-9246-942a9223d37d, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:04:25 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_create(format:json, prefix:fs subvolume snapshot create, snap_name:bdf5fb57-afe7-4827-9246-942a9223d37d, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:04:26 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:26 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:04:26 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:26 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:26 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:26 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot create", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "snap_name": "bdf5fb57-afe7-4827-9246-942a9223d37d", "format": "json"}]: dispatch
Feb 01 10:04:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v659: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 240 KiB/s wr, 15 op/s
Feb 01 10:04:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:28.064 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:28 np0005604215.localdomain ceph-mon[298604]: pgmap v659: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 240 KiB/s wr, 15 op/s
Feb 01 10:04:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:04:28 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:28 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0)
Feb 01 10:04:28 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-2018707573, client_metadata.root=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934
Feb 01 10:04:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:28 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v660: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 625 B/s rd, 296 KiB/s wr, 18 op/s
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:29 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:29 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 01 10:04:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:04:29 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:04:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 01 10:04:29 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:29.726 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ae161acb-9a7a-4b27-a3e8-29a643bfd153", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ae161acb-9a7a-4b27-a3e8-29a643bfd153, vol_name:cephfs) < ""
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/ae161acb-9a7a-4b27-a3e8-29a643bfd153/.meta.tmp'
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/ae161acb-9a7a-4b27-a3e8-29a643bfd153/.meta.tmp' to config b'/volumes/_nogroup/ae161acb-9a7a-4b27-a3e8-29a643bfd153/.meta'
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ae161acb-9a7a-4b27-a3e8-29a643bfd153, vol_name:cephfs) < ""
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ae161acb-9a7a-4b27-a3e8-29a643bfd153", "format": "json"}]: dispatch
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ae161acb-9a7a-4b27-a3e8-29a643bfd153, vol_name:cephfs) < ""
Feb 01 10:04:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ae161acb-9a7a-4b27-a3e8-29a643bfd153, vol_name:cephfs) < ""
Feb 01 10:04:30 np0005604215.localdomain podman[236852]: time="2026-02-01T10:04:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:04:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:04:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:04:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:04:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18342 "" "Go-http-client/1.1"
Feb 01 10:04:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:30 np0005604215.localdomain ceph-mon[298604]: pgmap v660: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 625 B/s rd, 296 KiB/s wr, 18 op/s
Feb 01 10:04:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:04:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:04:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:04:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:04:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 01 10:04:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:04:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v661: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 192 KiB/s wr, 12 op/s
Feb 01 10:04:31 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:04:31 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ae161acb-9a7a-4b27-a3e8-29a643bfd153", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:31 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ae161acb-9a7a-4b27-a3e8-29a643bfd153", "format": "json"}]: dispatch
Feb 01 10:04:31 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:04:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:04:31 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:31 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID tempest-cephx-id-2018707573 with tenant 8c7611c3d483414ea2f2b40e93062710
Feb 01 10:04:31 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:31 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:04:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:04:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:04:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:04:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:04:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:04:31 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume authorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, tenant_id:8c7611c3d483414ea2f2b40e93062710, vol_name:cephfs) < ""
Feb 01 10:04:32 np0005604215.localdomain ceph-mon[298604]: pgmap v661: 177 pgs: 177 active+clean; 219 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 192 KiB/s wr, 12 op/s
Feb 01 10:04:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:32 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.tempest-cephx-id-2018707573", "caps": ["mds", "allow rw path=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934", "osd", "allow rw pool=manila_data namespace=fsvolumens_976e5581-4212-481d-a9bc-03c631888d9c", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:04:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:04:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:04:32 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:04:32 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:32 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:32 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v662: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 325 KiB/s wr, 19 op/s
Feb 01 10:04:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:33.111 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:33 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "tenant_id": "8c7611c3d483414ea2f2b40e93062710", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:04:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:33 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ae161acb-9a7a-4b27-a3e8-29a643bfd153", "format": "json"}]: dispatch
Feb 01 10:04:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:ae161acb-9a7a-4b27-a3e8-29a643bfd153, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:ae161acb-9a7a-4b27-a3e8-29a643bfd153, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:33 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:04:33.175+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ae161acb-9a7a-4b27-a3e8-29a643bfd153' of type subvolume
Feb 01 10:04:33 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ae161acb-9a7a-4b27-a3e8-29a643bfd153' of type subvolume
Feb 01 10:04:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ae161acb-9a7a-4b27-a3e8-29a643bfd153", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ae161acb-9a7a-4b27-a3e8-29a643bfd153, vol_name:cephfs) < ""
Feb 01 10:04:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/ae161acb-9a7a-4b27-a3e8-29a643bfd153'' moved to trashcan
Feb 01 10:04:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:04:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ae161acb-9a7a-4b27-a3e8-29a643bfd153, vol_name:cephfs) < ""
Feb 01 10:04:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:04:34 np0005604215.localdomain ceph-mon[298604]: pgmap v662: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 325 KiB/s wr, 19 op/s
Feb 01 10:04:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v663: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 216 KiB/s wr, 14 op/s
Feb 01 10:04:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:04:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1496006481' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:04:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:04:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/1496006481' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:04:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:34.730 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:04:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:04:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:04:34 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:04:34 np0005604215.localdomain podman[317762]: 2026-02-01 10:04:34.878551583 +0000 UTC m=+0.089973693 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 01 10:04:34 np0005604215.localdomain podman[317762]: 2026-02-01 10:04:34.88780762 +0000 UTC m=+0.099229790 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 10:04:34 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:04:34 np0005604215.localdomain podman[317761]: 2026-02-01 10:04:34.933032574 +0000 UTC m=+0.145090894 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, container_name=openstack_network_exporter, release=1769056855, architecture=x86_64, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, name=ubi9/ubi-minimal, vendor=Red Hat, Inc., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, io.openshift.expose-services=, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 
'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, distribution-scope=public, io.openshift.tags=minimal rhel9)
Feb 01 10:04:34 np0005604215.localdomain podman[317761]: 2026-02-01 10:04:34.949637419 +0000 UTC m=+0.161695769 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2026-01-22T05:09:47Z, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, release=1769056855, container_name=openstack_network_exporter, distribution-scope=public, config_id=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Feb 01 10:04:34 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:35 np0005604215.localdomain podman[317763]: 2026-02-01 10:04:35.045786232 +0000 UTC m=+0.256855751 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Feb 01 10:04:35 np0005604215.localdomain podman[317763]: 2026-02-01 10:04:35.093340128 +0000 UTC m=+0.304409647 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} v 0)
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} v 0)
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:35 np0005604215.localdomain podman[317764]: 2026-02-01 10:04:35.101974566 +0000 UTC m=+0.299058401 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 10:04:35 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:04:35 np0005604215.localdomain podman[317764]: 2026-02-01 10:04:35.136533799 +0000 UTC m=+0.333617624 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume deauthorize, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=tempest-cephx-id-2018707573, client_metadata.root=/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c/c5a01dcc-4cc5-42a5-9fd3-97ead336e934
Feb 01 10:04:35 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ae161acb-9a7a-4b27-a3e8-29a643bfd153", "format": "json"}]: dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ae161acb-9a7a-4b27-a3e8-29a643bfd153", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: pgmap v663: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 216 KiB/s wr, 14 op/s
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1496006481' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/1496006481' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.tempest-cephx-id-2018707573", "format": "json"} : dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"} : dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.tempest-cephx-id-2018707573"}]': finished
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:tempest-cephx-id-2018707573, format:json, prefix:fs subvolume evict, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 01 10:04:35 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:35 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:36 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "auth_id": "tempest-cephx-id-2018707573", "format": "json"}]: dispatch
Feb 01 10:04:36 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:04:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:04:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:04:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:04:36 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 01 10:04:36 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:04:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v664: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 215 KiB/s wr, 12 op/s
Feb 01 10:04:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c2ffc46e-0111-4c7c-9786-7a03cdade368", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:c2ffc46e-0111-4c7c-9786-7a03cdade368, vol_name:cephfs) < ""
Feb 01 10:04:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/c2ffc46e-0111-4c7c-9786-7a03cdade368/.meta.tmp'
Feb 01 10:04:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/c2ffc46e-0111-4c7c-9786-7a03cdade368/.meta.tmp' to config b'/volumes/_nogroup/c2ffc46e-0111-4c7c-9786-7a03cdade368/.meta'
Feb 01 10:04:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:2147483648, sub_name:c2ffc46e-0111-4c7c-9786-7a03cdade368, vol_name:cephfs) < ""
Feb 01 10:04:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c2ffc46e-0111-4c7c-9786-7a03cdade368", "format": "json"}]: dispatch
Feb 01 10:04:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:c2ffc46e-0111-4c7c-9786-7a03cdade368, vol_name:cephfs) < ""
Feb 01 10:04:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:c2ffc46e-0111-4c7c-9786-7a03cdade368, vol_name:cephfs) < ""
Feb 01 10:04:37 np0005604215.localdomain ceph-mon[298604]: pgmap v664: 177 pgs: 177 active+clean; 221 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 215 KiB/s wr, 12 op/s
Feb 01 10:04:37 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "c2ffc46e-0111-4c7c-9786-7a03cdade368", "size": 2147483648, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:37 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "c2ffc46e-0111-4c7c-9786-7a03cdade368", "format": "json"}]: dispatch
Feb 01 10:04:37 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:04:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:38.159 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v665: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 287 KiB/s wr, 17 op/s
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "976e5581-4212-481d-a9bc-03c631888d9c", "format": "json"}]: dispatch
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:976e5581-4212-481d-a9bc-03c631888d9c, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:976e5581-4212-481d-a9bc-03c631888d9c, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '976e5581-4212-481d-a9bc-03c631888d9c' of type subvolume
Feb 01 10:04:38 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:04:38.798+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '976e5581-4212-481d-a9bc-03c631888d9c' of type subvolume
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/976e5581-4212-481d-a9bc-03c631888d9c'' moved to trashcan
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:976e5581-4212-481d-a9bc-03c631888d9c, vol_name:cephfs) < ""
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:38 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:04:38 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:38 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:04:38 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:38 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:39 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:39 np0005604215.localdomain ceph-mon[298604]: pgmap v665: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 287 KiB/s wr, 17 op/s
Feb 01 10:04:39 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "976e5581-4212-481d-a9bc-03c631888d9c", "format": "json"}]: dispatch
Feb 01 10:04:39 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "976e5581-4212-481d-a9bc-03c631888d9c", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:39 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:39 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:39 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:39 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:39 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:39.759 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c2ffc46e-0111-4c7c-9786-7a03cdade368", "format": "json"}]: dispatch
Feb 01 10:04:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:c2ffc46e-0111-4c7c-9786-7a03cdade368, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:c2ffc46e-0111-4c7c-9786-7a03cdade368, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:40 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:04:40.133+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'c2ffc46e-0111-4c7c-9786-7a03cdade368' of type subvolume
Feb 01 10:04:40 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'c2ffc46e-0111-4c7c-9786-7a03cdade368' of type subvolume
Feb 01 10:04:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c2ffc46e-0111-4c7c-9786-7a03cdade368", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:c2ffc46e-0111-4c7c-9786-7a03cdade368, vol_name:cephfs) < ""
Feb 01 10:04:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/c2ffc46e-0111-4c7c-9786-7a03cdade368'' moved to trashcan
Feb 01 10:04:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:04:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:c2ffc46e-0111-4c7c-9786-7a03cdade368, vol_name:cephfs) < ""
Feb 01 10:04:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v666: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 205 KiB/s wr, 11 op/s
Feb 01 10:04:40 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "c2ffc46e-0111-4c7c-9786-7a03cdade368", "format": "json"}]: dispatch
Feb 01 10:04:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "c2ffc46e-0111-4c7c-9786-7a03cdade368", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:41 np0005604215.localdomain ceph-mon[298604]: pgmap v666: 177 pgs: 177 active+clean; 222 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 205 KiB/s wr, 11 op/s
Feb 01 10:04:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:04:41.779 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:04:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:04:41.780 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:04:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:04:41.780 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:04:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:04:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:42 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 01 10:04:42 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:04:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:42 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v667: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 298 KiB/s wr, 16 op/s
Feb 01 10:04:43 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:43 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 01 10:04:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:43.205 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "89116974-4a6c-4049-85b6-edf55e1de63b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:89116974-4a6c-4049-85b6-edf55e1de63b, vol_name:cephfs) < ""
Feb 01 10:04:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/89116974-4a6c-4049-85b6-edf55e1de63b/.meta.tmp'
Feb 01 10:04:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/89116974-4a6c-4049-85b6-edf55e1de63b/.meta.tmp' to config b'/volumes/_nogroup/89116974-4a6c-4049-85b6-edf55e1de63b/.meta'
Feb 01 10:04:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:89116974-4a6c-4049-85b6-edf55e1de63b, vol_name:cephfs) < ""
Feb 01 10:04:43 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "89116974-4a6c-4049-85b6-edf55e1de63b", "format": "json"}]: dispatch
Feb 01 10:04:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:89116974-4a6c-4049-85b6-edf55e1de63b, vol_name:cephfs) < ""
Feb 01 10:04:43 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:89116974-4a6c-4049-85b6-edf55e1de63b, vol_name:cephfs) < ""
Feb 01 10:04:44 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:44 np0005604215.localdomain ceph-mon[298604]: pgmap v667: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 298 KiB/s wr, 16 op/s
Feb 01 10:04:44 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:04:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v668: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 166 KiB/s wr, 10 op/s
Feb 01 10:04:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:44.758 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "89116974-4a6c-4049-85b6-edf55e1de63b", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:45 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "89116974-4a6c-4049-85b6-edf55e1de63b", "format": "json"}]: dispatch
Feb 01 10:04:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:04:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:04:45 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:45 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:04:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:45 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:45 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:04:45 np0005604215.localdomain podman[317847]: 2026-02-01 10:04:45.875843908 +0000 UTC m=+0.088910280 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Feb 01 10:04:45 np0005604215.localdomain podman[317847]: 2026-02-01 10:04:45.915733856 +0000 UTC m=+0.128800268 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:04:45 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:04:46 np0005604215.localdomain ceph-mon[298604]: pgmap v668: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 166 KiB/s wr, 10 op/s
Feb 01 10:04:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:46 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v669: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 165 KiB/s wr, 9 op/s
Feb 01 10:04:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "89116974-4a6c-4049-85b6-edf55e1de63b", "format": "json"}]: dispatch
Feb 01 10:04:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:89116974-4a6c-4049-85b6-edf55e1de63b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:89116974-4a6c-4049-85b6-edf55e1de63b, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:46 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:04:46.688+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '89116974-4a6c-4049-85b6-edf55e1de63b' of type subvolume
Feb 01 10:04:46 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '89116974-4a6c-4049-85b6-edf55e1de63b' of type subvolume
Feb 01 10:04:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "89116974-4a6c-4049-85b6-edf55e1de63b", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:89116974-4a6c-4049-85b6-edf55e1de63b, vol_name:cephfs) < ""
Feb 01 10:04:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/89116974-4a6c-4049-85b6-edf55e1de63b'' moved to trashcan
Feb 01 10:04:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:04:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:89116974-4a6c-4049-85b6-edf55e1de63b, vol_name:cephfs) < ""
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #49. Immutable memtables: 0.
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.088342) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 27] Flushing memtable with next log file: 49
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287088388, "job": 27, "event": "flush_started", "num_memtables": 1, "num_entries": 1529, "num_deletes": 261, "total_data_size": 2712974, "memory_usage": 2844272, "flush_reason": "Manual Compaction"}
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 27] Level-0 flush table #50: started
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287105041, "cf_name": "default", "job": 27, "event": "table_file_creation", "file_number": 50, "file_size": 1775829, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 31424, "largest_seqno": 32948, "table_properties": {"data_size": 1769395, "index_size": 3391, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1989, "raw_key_size": 16685, "raw_average_key_size": 21, "raw_value_size": 1755307, "raw_average_value_size": 2244, "num_data_blocks": 147, "num_entries": 782, "num_filter_entries": 782, "num_deletions": 261, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769940233, "oldest_key_time": 1769940233, "file_creation_time": 1769940287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 50, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 27] Flush lasted 16746 microseconds, and 6130 cpu microseconds.
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.105089) [db/flush_job.cc:967] [default] [JOB 27] Level-0 flush table #50: 1775829 bytes OK
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.105112) [db/memtable_list.cc:519] [default] Level-0 commit table #50 started
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.107991) [db/memtable_list.cc:722] [default] Level-0 commit table #50: memtable #1 done
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.108011) EVENT_LOG_v1 {"time_micros": 1769940287108005, "job": 27, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.108031) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 27] Try to delete WAL files size 2705142, prev total WAL file size 2705466, number of live WAL files 2.
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000046.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.108897) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '6C6F676D0034323634' seq:72057594037927935, type:22 .. '6C6F676D0034353136' seq:0, type:0; will stop at (end)
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 28] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 27 Base level 0, inputs: [50(1734KB)], [48(20MB)]
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287108971, "job": 28, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [50], "files_L6": [48], "score": -1, "input_data_size": 23039459, "oldest_snapshot_seqno": -1}
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 28] Generated table #51: 14401 keys, 22821427 bytes, temperature: kUnknown
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287241984, "cf_name": "default", "job": 28, "event": "table_file_creation", "file_number": 51, "file_size": 22821427, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 22737588, "index_size": 46746, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36037, "raw_key_size": 387409, "raw_average_key_size": 26, "raw_value_size": 22491272, "raw_average_value_size": 1561, "num_data_blocks": 1741, "num_entries": 14401, "num_filter_entries": 14401, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769940287, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 51, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.242370) [db/compaction/compaction_job.cc:1663] [default] [JOB 28] Compacted 1@0 + 1@6 files to L6 => 22821427 bytes
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.244160) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 173.1 rd, 171.5 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.7, 20.3 +0.0 blob) out(21.8 +0.0 blob), read-write-amplify(25.8) write-amplify(12.9) OK, records in: 14946, records dropped: 545 output_compression: NoCompression
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.244186) EVENT_LOG_v1 {"time_micros": 1769940287244175, "job": 28, "event": "compaction_finished", "compaction_time_micros": 133095, "compaction_time_cpu_micros": 58371, "output_level": 6, "num_output_files": 1, "total_output_size": 22821427, "num_input_records": 14946, "num_output_records": 14401, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000050.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287244590, "job": 28, "event": "table_file_deletion", "file_number": 50}
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000048.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940287247547, "job": 28, "event": "table_file_deletion", "file_number": 48}
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.108741) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.247608) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.247614) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.247618) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.247621) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:04:47 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:04:47.247625) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:04:48 np0005604215.localdomain ceph-mon[298604]: pgmap v669: 177 pgs: 177 active+clean; 223 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 165 KiB/s wr, 9 op/s
Feb 01 10:04:48 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "89116974-4a6c-4049-85b6-edf55e1de63b", "format": "json"}]: dispatch
Feb 01 10:04:48 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "89116974-4a6c-4049-85b6-edf55e1de63b", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:48.259 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:48 np0005604215.localdomain sudo[317867]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 10:04:48 np0005604215.localdomain sudo[317867]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:04:48 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:04:48 np0005604215.localdomain sudo[317867]: pam_unix(sudo:session): session closed for user root
Feb 01 10:04:48 np0005604215.localdomain sudo[317886]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 10:04:48 np0005604215.localdomain sudo[317886]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:04:48 np0005604215.localdomain podman[317885]: 2026-02-01 10:04:48.463980219 +0000 UTC m=+0.085365679 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 10:04:48 np0005604215.localdomain podman[317885]: 2026-02-01 10:04:48.469317515 +0000 UTC m=+0.090702995 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 10:04:48 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:04:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v670: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 227 KiB/s wr, 13 op/s
Feb 01 10:04:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice", "format": "json"} v 0)
Feb 01 10:04:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:48 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice"} v 0)
Feb 01 10:04:48 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:04:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:48 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:49 np0005604215.localdomain sudo[317886]: pam_unix(sudo:session): session closed for user root
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: pgmap v670: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 227 KiB/s wr, 13 op/s
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice", "format": "json"} : dispatch
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice"} : dispatch
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice"}]': finished
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice", "format": "json"}]: dispatch
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 10:04:49 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 2df47517-d091-4316-8057-ca155b0f19e8 (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:04:49 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 2df47517-d091-4316-8057-ca155b0f19e8 (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:04:49 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 2df47517-d091-4316-8057-ca155b0f19e8 (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 10:04:49 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:04:49 np0005604215.localdomain sudo[317958]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 10:04:49 np0005604215.localdomain sudo[317958]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:04:49 np0005604215.localdomain sudo[317958]: pam_unix(sudo:session): session closed for user root
Feb 01 10:04:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:49.797 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0f0ccb95-b867-4b25-96e6-6d413cd0de0c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:0f0ccb95-b867-4b25-96e6-6d413cd0de0c, vol_name:cephfs) < ""
Feb 01 10:04:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/0f0ccb95-b867-4b25-96e6-6d413cd0de0c/.meta.tmp'
Feb 01 10:04:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/0f0ccb95-b867-4b25-96e6-6d413cd0de0c/.meta.tmp' to config b'/volumes/_nogroup/0f0ccb95-b867-4b25-96e6-6d413cd0de0c/.meta'
Feb 01 10:04:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:0f0ccb95-b867-4b25-96e6-6d413cd0de0c, vol_name:cephfs) < ""
Feb 01 10:04:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0f0ccb95-b867-4b25-96e6-6d413cd0de0c", "format": "json"}]: dispatch
Feb 01 10:04:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:0f0ccb95-b867-4b25-96e6-6d413cd0de0c, vol_name:cephfs) < ""
Feb 01 10:04:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:0f0ccb95-b867-4b25-96e6-6d413cd0de0c, vol_name:cephfs) < ""
Feb 01 10:04:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:04:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:04:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:04:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:04:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "0f0ccb95-b867-4b25-96e6-6d413cd0de0c", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "0f0ccb95-b867-4b25-96e6-6d413cd0de0c", "format": "json"}]: dispatch
Feb 01 10:04:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:04:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v671: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 155 KiB/s wr, 8 op/s
Feb 01 10:04:51 np0005604215.localdomain ceph-mon[298604]: pgmap v671: 177 pgs: 177 active+clean; 224 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 155 KiB/s wr, 8 op/s
Feb 01 10:04:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:04:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:04:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:04:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:04:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:04:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:04:51 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:04:51Z|00263|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 01 10:04:51 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 10:04:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 10:04:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:04:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:51 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice_bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:04:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v672: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 852 B/s rd, 237 KiB/s wr, 12 op/s
Feb 01 10:04:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:04:52 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:04:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:53.302 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0f0ccb95-b867-4b25-96e6-6d413cd0de0c", "format": "json"}]: dispatch
Feb 01 10:04:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:0f0ccb95-b867-4b25-96e6-6d413cd0de0c, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:0f0ccb95-b867-4b25-96e6-6d413cd0de0c, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:04:53 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:04:53.327+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '0f0ccb95-b867-4b25-96e6-6d413cd0de0c' of type subvolume
Feb 01 10:04:53 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '0f0ccb95-b867-4b25-96e6-6d413cd0de0c' of type subvolume
Feb 01 10:04:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0f0ccb95-b867-4b25-96e6-6d413cd0de0c", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:0f0ccb95-b867-4b25-96e6-6d413cd0de0c, vol_name:cephfs) < ""
Feb 01 10:04:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/0f0ccb95-b867-4b25-96e6-6d413cd0de0c'' moved to trashcan
Feb 01 10:04:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:04:53 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:0f0ccb95-b867-4b25-96e6-6d413cd0de0c, vol_name:cephfs) < ""
Feb 01 10:04:54 np0005604215.localdomain ceph-mon[298604]: pgmap v672: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 852 B/s rd, 237 KiB/s wr, 12 op/s
Feb 01 10:04:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v673: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 145 KiB/s wr, 9 op/s
Feb 01 10:04:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:54.801 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:55 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "0f0ccb95-b867-4b25-96e6-6d413cd0de0c", "format": "json"}]: dispatch
Feb 01 10:04:55 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "0f0ccb95-b867-4b25-96e6-6d413cd0de0c", "force": true, "format": "json"}]: dispatch
Feb 01 10:04:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:04:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:04:55 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 01 10:04:55 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:04:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:04:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:04:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:04:56 np0005604215.localdomain ceph-mon[298604]: pgmap v673: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 145 KiB/s wr, 9 op/s
Feb 01 10:04:56 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:04:56 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:56 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:04:56 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 01 10:04:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v674: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 144 KiB/s wr, 8 op/s
Feb 01 10:04:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:56 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8, vol_name:cephfs) < ""
Feb 01 10:04:57 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8/.meta.tmp'
Feb 01 10:04:57 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8/.meta.tmp' to config b'/volumes/_nogroup/05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8/.meta'
Feb 01 10:04:57 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8, vol_name:cephfs) < ""
Feb 01 10:04:57 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8", "format": "json"}]: dispatch
Feb 01 10:04:57 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8, vol_name:cephfs) < ""
Feb 01 10:04:57 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8, vol_name:cephfs) < ""
Feb 01 10:04:57 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:57 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:04:57 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:04:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:58.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:04:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:58.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:04:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:58.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 10:04:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:58.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 10:04:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:58.116 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 10:04:58 np0005604215.localdomain ceph-mon[298604]: pgmap v674: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 144 KiB/s wr, 8 op/s
Feb 01 10:04:58 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:04:58 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8", "format": "json"}]: dispatch
Feb 01 10:04:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:58.345 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:04:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:04:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:58 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice_bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:04:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:04:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:58 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice_bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:04:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v675: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 198 KiB/s wr, 11 op/s
Feb 01 10:04:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:04:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:04:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:04:59 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice_bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:04:59 np0005604215.localdomain ceph-mon[298604]: pgmap v675: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 198 KiB/s wr, 11 op/s
Feb 01 10:04:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:04:59.831 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:04:59 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8", "format": "json"}]: dispatch
Feb 01 10:04:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:00 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:05:00.003+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8' of type subvolume
Feb 01 10:05:00 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8' of type subvolume
Feb 01 10:05:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8, vol_name:cephfs) < ""
Feb 01 10:05:00 np0005604215.localdomain podman[236852]: time="2026-02-01T10:05:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:05:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8'' moved to trashcan
Feb 01 10:05:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:05:00 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8, vol_name:cephfs) < ""
Feb 01 10:05:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:05:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:05:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:05:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18344 "" "Go-http-client/1.1"
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.121 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.122 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.122 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.123 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.124 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:05:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8", "format": "json"}]: dispatch
Feb 01 10:05:00 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "05cbe550-ebf5-4e9b-a9e6-4d0d3e6763c8", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v676: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 136 KiB/s wr, 7 op/s
Feb 01 10:05:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:05:00 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3351195153' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.569 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.445s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.775 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.777 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11485MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.777 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.777 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.840 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.840 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 10:05:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:00.853 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:05:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:05:01 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3029507846' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:05:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:01.302 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.450s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:05:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:01.307 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 10:05:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:01.322 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 10:05:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:01.323 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 10:05:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:01.323 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.546s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:05:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3125540828' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:05:01 np0005604215.localdomain ceph-mon[298604]: pgmap v676: 177 pgs: 177 active+clean; 225 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 136 KiB/s wr, 7 op/s
Feb 01 10:05:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3351195153' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:05:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2651817155' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:05:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3029507846' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:05:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:05:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:05:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:05:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:05:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:05:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:05:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:05:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} v 0)
Feb 01 10:05:01 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:05:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice_bob"} v 0)
Feb 01 10:05:01 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:05:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice_bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:05:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice_bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:05:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:05:01 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice_bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:02.322 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:05:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:02.323 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:05:02 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:05:02 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:05:02 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice_bob", "format": "json"} : dispatch
Feb 01 10:05:02 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice_bob"} : dispatch
Feb 01 10:05:02 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice_bob"}]': finished
Feb 01 10:05:02 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice_bob", "format": "json"}]: dispatch
Feb 01 10:05:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v677: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 216 KiB/s wr, 11 op/s
Feb 01 10:05:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:03.391 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:05:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:05:03 np0005604215.localdomain ceph-mon[298604]: pgmap v677: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 216 KiB/s wr, 11 op/s
Feb 01 10:05:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "snap_name": "bdf5fb57-afe7-4827-9246-942a9223d37d_6da53b8d-dd56-457c-aa1b-d2c1e64390fb", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:bdf5fb57-afe7-4827-9246-942a9223d37d_6da53b8d-dd56-457c-aa1b-d2c1e64390fb, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:05:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/92c44af7-ac1a-42f2-8baf-64ce97a37c1c/.meta.tmp'
Feb 01 10:05:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/92c44af7-ac1a-42f2-8baf-64ce97a37c1c/.meta.tmp' to config b'/volumes/_nogroup/92c44af7-ac1a-42f2-8baf-64ce97a37c1c/.meta'
Feb 01 10:05:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:bdf5fb57-afe7-4827-9246-942a9223d37d_6da53b8d-dd56-457c-aa1b-d2c1e64390fb, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:05:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "snap_name": "bdf5fb57-afe7-4827-9246-942a9223d37d", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:bdf5fb57-afe7-4827-9246-942a9223d37d, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:05:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/92c44af7-ac1a-42f2-8baf-64ce97a37c1c/.meta.tmp'
Feb 01 10:05:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/92c44af7-ac1a-42f2-8baf-64ce97a37c1c/.meta.tmp' to config b'/volumes/_nogroup/92c44af7-ac1a-42f2-8baf-64ce97a37c1c/.meta'
Feb 01 10:05:03 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_snapshot_rm(force:True, format:json, prefix:fs subvolume snapshot rm, snap_name:bdf5fb57-afe7-4827-9246-942a9223d37d, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:05:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:04.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:05:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:04.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 10:05:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v678: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 135 KiB/s wr, 8 op/s
Feb 01 10:05:04 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "snap_name": "bdf5fb57-afe7-4827-9246-942a9223d37d_6da53b8d-dd56-457c-aa1b-d2c1e64390fb", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:04 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume snapshot rm", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "snap_name": "bdf5fb57-afe7-4827-9246-942a9223d37d", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:04.833 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:05:04 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:05:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:05:04 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:05:04 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:05:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:05:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:05:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:05.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:05:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:05:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e294 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:05 np0005604215.localdomain ceph-mon[298604]: pgmap v678: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 135 KiB/s wr, 8 op/s
Feb 01 10:05:05 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:05:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:05:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:05:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:05:05 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:05:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:05:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:05:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:05:05 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:05:05 np0005604215.localdomain podman[318023]: 2026-02-01 10:05:05.888183704 +0000 UTC m=+0.094198785 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Feb 01 10:05:05 np0005604215.localdomain podman[318023]: 2026-02-01 10:05:05.929602709 +0000 UTC m=+0.135617720 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true)
Feb 01 10:05:05 np0005604215.localdomain systemd[1]: tmp-crun.wIyqr7.mount: Deactivated successfully.
Feb 01 10:05:05 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:05:05 np0005604215.localdomain podman[318022]: 2026-02-01 10:05:05.948803164 +0000 UTC m=+0.158222950 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, version=9.7, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9/ubi-minimal, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-type=git, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, vendor=Red Hat, Inc., managed_by=edpm_ansible, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, config_id=openstack_network_exporter)
Feb 01 10:05:06 np0005604215.localdomain podman[318025]: 2026-02-01 10:05:06.003351807 +0000 UTC m=+0.203598389 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 10:05:06 np0005604215.localdomain podman[318024]: 2026-02-01 10:05:06.02342158 +0000 UTC m=+0.227396427 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 10:05:06 np0005604215.localdomain podman[318025]: 2026-02-01 10:05:06.038653023 +0000 UTC m=+0.238899585 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 10:05:06 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:05:06 np0005604215.localdomain podman[318024]: 2026-02-01 10:05:06.08016296 +0000 UTC m=+0.284137767 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Feb 01 10:05:06 np0005604215.localdomain podman[318022]: 2026-02-01 10:05:06.090495491 +0000 UTC m=+0.299915277 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, version=9.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, release=1769056855, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, distribution-scope=public, maintainer=Red Hat, Inc., config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z)
Feb 01 10:05:06 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:05:06 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:05:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v679: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 134 KiB/s wr, 7 op/s
Feb 01 10:05:06 np0005604215.localdomain systemd[1]: tmp-crun.Ygo2b1.mount: Deactivated successfully.
Feb 01 10:05:07 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "format": "json"}]: dispatch
Feb 01 10:05:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:07 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:05:07.171+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '92c44af7-ac1a-42f2-8baf-64ce97a37c1c' of type subvolume
Feb 01 10:05:07 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '92c44af7-ac1a-42f2-8baf-64ce97a37c1c' of type subvolume
Feb 01 10:05:07 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:05:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/92c44af7-ac1a-42f2-8baf-64ce97a37c1c'' moved to trashcan
Feb 01 10:05:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:05:07 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:92c44af7-ac1a-42f2-8baf-64ce97a37c1c, vol_name:cephfs) < ""
Feb 01 10:05:07 np0005604215.localdomain ceph-mon[298604]: pgmap v679: 177 pgs: 177 active+clean; 226 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 134 KiB/s wr, 7 op/s
Feb 01 10:05:07 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1409088241' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "40dc17a8-88e2-443b-88e0-b305a0120dd3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:40dc17a8-88e2-443b-88e0-b305a0120dd3, vol_name:cephfs) < ""
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/40dc17a8-88e2-443b-88e0-b305a0120dd3/.meta.tmp'
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/40dc17a8-88e2-443b-88e0-b305a0120dd3/.meta.tmp' to config b'/volumes/_nogroup/40dc17a8-88e2-443b-88e0-b305a0120dd3/.meta'
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:40dc17a8-88e2-443b-88e0-b305a0120dd3, vol_name:cephfs) < ""
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "40dc17a8-88e2-443b-88e0-b305a0120dd3", "format": "json"}]: dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:40dc17a8-88e2-443b-88e0-b305a0120dd3, vol_name:cephfs) < ""
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:40dc17a8-88e2-443b-88e0-b305a0120dd3, vol_name:cephfs) < ""
Feb 01 10:05:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:08.418 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v680: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 197 KiB/s wr, 11 op/s
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "format": "json"}]: dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "92c44af7-ac1a-42f2-8baf-64ce97a37c1c", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1480131941' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:05:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:09.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:05:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "40dc17a8-88e2-443b-88e0-b305a0120dd3", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "40dc17a8-88e2-443b-88e0-b305a0120dd3", "format": "json"}]: dispatch
Feb 01 10:05:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:05:09 np0005604215.localdomain ceph-mon[298604]: pgmap v680: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 197 KiB/s wr, 11 op/s
Feb 01 10:05:09 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:05:09 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e295 e295: 6 total, 6 up, 6 in
Feb 01 10:05:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:09.874 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v682: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 171 KiB/s wr, 9 op/s
Feb 01 10:05:10 np0005604215.localdomain ceph-mon[298604]: osdmap e295: 6 total, 6 up, 6 in
Feb 01 10:05:11 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:11.411 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 10:05:11 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:11.412 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 10:05:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:11.413 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:11 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:11.414 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 10:05:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:05:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:05:11 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:05:11 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:05:11 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID alice bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:05:11 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:05:11 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:05:11 np0005604215.localdomain ceph-mon[298604]: pgmap v682: 177 pgs: 177 active+clean; 227 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 171 KiB/s wr, 9 op/s
Feb 01 10:05:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:05:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:05:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:05:11 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.alice bob", "caps": ["mds", "allow r path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow r pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:05:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:r, auth_id:alice bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:05:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1281be2a-d93e-437b-9b63-ac349c9cd9d6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1281be2a-d93e-437b-9b63-ac349c9cd9d6, vol_name:cephfs) < ""
Feb 01 10:05:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/1281be2a-d93e-437b-9b63-ac349c9cd9d6/.meta.tmp'
Feb 01 10:05:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/1281be2a-d93e-437b-9b63-ac349c9cd9d6/.meta.tmp' to config b'/volumes/_nogroup/1281be2a-d93e-437b-9b63-ac349c9cd9d6/.meta'
Feb 01 10:05:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:1281be2a-d93e-437b-9b63-ac349c9cd9d6, vol_name:cephfs) < ""
Feb 01 10:05:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1281be2a-d93e-437b-9b63-ac349c9cd9d6", "format": "json"}]: dispatch
Feb 01 10:05:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1281be2a-d93e-437b-9b63-ac349c9cd9d6, vol_name:cephfs) < ""
Feb 01 10:05:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:1281be2a-d93e-437b-9b63-ac349c9cd9d6, vol_name:cephfs) < ""
Feb 01 10:05:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v683: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 172 KiB/s wr, 10 op/s
Feb 01 10:05:12 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "r", "format": "json"}]: dispatch
Feb 01 10:05:12 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "1281be2a-d93e-437b-9b63-ac349c9cd9d6", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:12 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "1281be2a-d93e-437b-9b63-ac349c9cd9d6", "format": "json"}]: dispatch
Feb 01 10:05:12 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:05:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:13.465 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:14 np0005604215.localdomain ceph-mon[298604]: pgmap v683: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 172 KiB/s wr, 10 op/s
Feb 01 10:05:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v684: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 185 KiB/s wr, 11 op/s
Feb 01 10:05:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:14.878 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e295 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.alice bob", "format": "json"} v 0)
Feb 01 10:05:15 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:05:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.alice bob"} v 0)
Feb 01 10:05:15 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:alice bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=alice bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:alice bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7fe5699b-e701-4ec6-8f05-4a9b4e567acc", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7fe5699b-e701-4ec6-8f05-4a9b4e567acc, vol_name:cephfs) < ""
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/7fe5699b-e701-4ec6-8f05-4a9b4e567acc/.meta.tmp'
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/7fe5699b-e701-4ec6-8f05-4a9b4e567acc/.meta.tmp' to config b'/volumes/_nogroup/7fe5699b-e701-4ec6-8f05-4a9b4e567acc/.meta'
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:7fe5699b-e701-4ec6-8f05-4a9b4e567acc, vol_name:cephfs) < ""
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7fe5699b-e701-4ec6-8f05-4a9b4e567acc", "format": "json"}]: dispatch
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7fe5699b-e701-4ec6-8f05-4a9b4e567acc, vol_name:cephfs) < ""
Feb 01 10:05:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:7fe5699b-e701-4ec6-8f05-4a9b4e567acc, vol_name:cephfs) < ""
Feb 01 10:05:16 np0005604215.localdomain ceph-mon[298604]: pgmap v684: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 185 KiB/s wr, 11 op/s
Feb 01 10:05:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:05:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.alice bob", "format": "json"} : dispatch
Feb 01 10:05:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.alice bob"} : dispatch
Feb 01 10:05:16 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.alice bob"}]': finished
Feb 01 10:05:16 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:05:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v685: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 185 KiB/s wr, 11 op/s
Feb 01 10:05:16 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:05:16 np0005604215.localdomain podman[318105]: 2026-02-01 10:05:16.8676924 +0000 UTC m=+0.082952565 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 10:05:16 np0005604215.localdomain podman[318105]: 2026-02-01 10:05:16.881796028 +0000 UTC m=+0.097056193 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:05:16 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:05:17 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e296 e296: 6 total, 6 up, 6 in
Feb 01 10:05:17 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:05:17 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "alice bob", "format": "json"}]: dispatch
Feb 01 10:05:17 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "7fe5699b-e701-4ec6-8f05-4a9b4e567acc", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:17 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "7fe5699b-e701-4ec6-8f05-4a9b4e567acc", "format": "json"}]: dispatch
Feb 01 10:05:17 np0005604215.localdomain ceph-mon[298604]: pgmap v685: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 614 B/s rd, 185 KiB/s wr, 11 op/s
Feb 01 10:05:17 np0005604215.localdomain ceph-mon[298604]: osdmap e296: 6 total, 6 up, 6 in
Feb 01 10:05:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:05:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:05:18 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.bob", "format": "json"} v 0)
Feb 01 10:05:18 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 01 10:05:18 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: Creating meta for ID bob with tenant 2b47af5b1cd441dab5c6c7ba6645e3a3
Feb 01 10:05:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:18.505 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:18 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 01 10:05:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v687: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 345 B/s rd, 184 KiB/s wr, 10 op/s
Feb 01 10:05:18 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} v 0)
Feb 01 10:05:18 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:05:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:bob, format:json, prefix:fs subvolume authorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:05:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:05:18 np0005604215.localdomain podman[318125]: 2026-02-01 10:05:18.867238847 +0000 UTC m=+0.078837107 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 10:05:18 np0005604215.localdomain podman[318125]: 2026-02-01 10:05:18.879705413 +0000 UTC m=+0.091303673 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 10:05:18 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:05:19 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6ea849be-f272-4004-a8b4-e7f86ba4f16e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:6ea849be-f272-4004-a8b4-e7f86ba4f16e, vol_name:cephfs) < ""
Feb 01 10:05:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/6ea849be-f272-4004-a8b4-e7f86ba4f16e/.meta.tmp'
Feb 01 10:05:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/6ea849be-f272-4004-a8b4-e7f86ba4f16e/.meta.tmp' to config b'/volumes/_nogroup/6ea849be-f272-4004-a8b4-e7f86ba4f16e/.meta'
Feb 01 10:05:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:6ea849be-f272-4004-a8b4-e7f86ba4f16e, vol_name:cephfs) < ""
Feb 01 10:05:19 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6ea849be-f272-4004-a8b4-e7f86ba4f16e", "format": "json"}]: dispatch
Feb 01 10:05:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:6ea849be-f272-4004-a8b4-e7f86ba4f16e, vol_name:cephfs) < ""
Feb 01 10:05:19 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:6ea849be-f272-4004-a8b4-e7f86ba4f16e, vol_name:cephfs) < ""
Feb 01 10:05:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:05:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:05:19 np0005604215.localdomain ceph-mon[298604]: pgmap v687: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 345 B/s rd, 184 KiB/s wr, 10 op/s
Feb 01 10:05:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"} : dispatch
Feb 01 10:05:19 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth get-or-create", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88", "mon", "allow r"], "format": "json"}]': finished
Feb 01 10:05:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:05:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:19.906 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:20 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "6ea849be-f272-4004-a8b4-e7f86ba4f16e", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:20 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "6ea849be-f272-4004-a8b4-e7f86ba4f16e", "format": "json"}]: dispatch
Feb 01 10:05:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v688: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 163 KiB/s wr, 9 op/s
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_10:05:21
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['.mgr', 'manila_metadata', 'images', 'backups', 'vms', 'volumes', 'manila_data']
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:05:21 np0005604215.localdomain ceph-mon[298604]: pgmap v688: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 307 B/s rd, 163 KiB/s wr, 9 op/s
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 1.635783082077052e-06 of space, bias 1.0, pg target 0.0003255208333333333 quantized to 32 (current 32)
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.0025597278929369066 of space, bias 4.0, pg target 2.0375434027777777 quantized to 16 (current 16)
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:05:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:05:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v689: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 67 KiB/s wr, 5 op/s
Feb 01 10:05:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:22 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, vol_name:cephfs) < ""
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/.meta.tmp'
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/.meta.tmp' to config b'/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/.meta'
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, vol_name:cephfs) < ""
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "format": "json"}]: dispatch
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, vol_name:cephfs) < ""
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, vol_name:cephfs) < ""
Feb 01 10:05:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6ea849be-f272-4004-a8b4-e7f86ba4f16e", "format": "json"}]: dispatch
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:6ea849be-f272-4004-a8b4-e7f86ba4f16e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:6ea849be-f272-4004-a8b4-e7f86ba4f16e, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:23 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:05:23.191+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '6ea849be-f272-4004-a8b4-e7f86ba4f16e' of type subvolume
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '6ea849be-f272-4004-a8b4-e7f86ba4f16e' of type subvolume
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6ea849be-f272-4004-a8b4-e7f86ba4f16e", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:6ea849be-f272-4004-a8b4-e7f86ba4f16e, vol_name:cephfs) < ""
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/6ea849be-f272-4004-a8b4-e7f86ba4f16e'' moved to trashcan
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:05:23 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:6ea849be-f272-4004-a8b4-e7f86ba4f16e, vol_name:cephfs) < ""
Feb 01 10:05:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:23.507 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:24 np0005604215.localdomain ceph-mon[298604]: pgmap v689: 177 pgs: 177 active+clean; 228 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 67 KiB/s wr, 5 op/s
Feb 01 10:05:24 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:24 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "format": "json"}]: dispatch
Feb 01 10:05:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v690: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 124 KiB/s wr, 6 op/s
Feb 01 10:05:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:24.908 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:25 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "6ea849be-f272-4004-a8b4-e7f86ba4f16e", "format": "json"}]: dispatch
Feb 01 10:05:25 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "6ea849be-f272-4004-a8b4-e7f86ba4f16e", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:26 np0005604215.localdomain ceph-mon[298604]: pgmap v690: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 124 KiB/s wr, 6 op/s
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7fe5699b-e701-4ec6-8f05-4a9b4e567acc", "format": "json"}]: dispatch
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:7fe5699b-e701-4ec6-8f05-4a9b4e567acc, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:7fe5699b-e701-4ec6-8f05-4a9b4e567acc, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:26 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:05:26.491+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7fe5699b-e701-4ec6-8f05-4a9b4e567acc' of type subvolume
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '7fe5699b-e701-4ec6-8f05-4a9b4e567acc' of type subvolume
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7fe5699b-e701-4ec6-8f05-4a9b4e567acc", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7fe5699b-e701-4ec6-8f05-4a9b4e567acc, vol_name:cephfs) < ""
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/7fe5699b-e701-4ec6-8f05-4a9b4e567acc'' moved to trashcan
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:7fe5699b-e701-4ec6-8f05-4a9b4e567acc, vol_name:cephfs) < ""
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "auth_id": "bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_authorize(access_level:rw, auth_id:bob, format:json, prefix:fs subvolume authorize, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:05:26 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.bob", "format": "json"} v 0)
Feb 01 10:05:26 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v691: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 124 KiB/s wr, 6 op/s
Feb 01 10:05:26 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37,allow rw path=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88,allow rw pool=manila_data namespace=fsvolumens_ee5830e4-c3f6-4299-9c44-15480a7cfa4f"]} v 0)
Feb 01 10:05:26 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37,allow rw path=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88,allow rw pool=manila_data namespace=fsvolumens_ee5830e4-c3f6-4299-9c44-15480a7cfa4f"]} : dispatch
Feb 01 10:05:26 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.bob", "format": "json"} v 0)
Feb 01 10:05:26 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 01 10:05:26 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_authorize(access_level:rw, auth_id:bob, format:json, prefix:fs subvolume authorize, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, tenant_id:2b47af5b1cd441dab5c6c7ba6645e3a3, vol_name:cephfs) < ""
Feb 01 10:05:27 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "7fe5699b-e701-4ec6-8f05-4a9b4e567acc", "format": "json"}]: dispatch
Feb 01 10:05:27 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "7fe5699b-e701-4ec6-8f05-4a9b4e567acc", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:27 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume authorize", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "auth_id": "bob", "tenant_id": "2b47af5b1cd441dab5c6c7ba6645e3a3", "access_level": "rw", "format": "json"}]: dispatch
Feb 01 10:05:27 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 01 10:05:27 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37,allow rw path=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88,allow rw pool=manila_data namespace=fsvolumens_ee5830e4-c3f6-4299-9c44-15480a7cfa4f"]} : dispatch
Feb 01 10:05:27 np0005604215.localdomain ceph-mon[298604]: pgmap v691: 177 pgs: 177 active+clean; 229 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 102 B/s rd, 124 KiB/s wr, 6 op/s
Feb 01 10:05:27 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37,allow rw path=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88,allow rw pool=manila_data namespace=fsvolumens_ee5830e4-c3f6-4299-9c44-15480a7cfa4f"]} : dispatch
Feb 01 10:05:27 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mon", "allow r", "mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37,allow rw path=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88,allow rw pool=manila_data namespace=fsvolumens_ee5830e4-c3f6-4299-9c44-15480a7cfa4f"]}]': finished
Feb 01 10:05:27 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 01 10:05:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:28.521 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v692: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 268 B/s rd, 185 KiB/s wr, 8 op/s
Feb 01 10:05:29 np0005604215.localdomain ceph-mon[298604]: pgmap v692: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 268 B/s rd, 185 KiB/s wr, 8 op/s
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "auth_id": "bob", "format": "json"}]: dispatch
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:bob, format:json, prefix:fs subvolume deauthorize, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, vol_name:cephfs) < ""
Feb 01 10:05:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.bob", "format": "json"} v 0)
Feb 01 10:05:29 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 01 10:05:29 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"]} v 0)
Feb 01 10:05:29 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"]} : dispatch
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:bob, format:json, prefix:fs subvolume deauthorize, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, vol_name:cephfs) < ""
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "auth_id": "bob", "format": "json"}]: dispatch
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:bob, format:json, prefix:fs subvolume evict, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, vol_name:cephfs) < ""
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=bob, client_metadata.root=/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f/96ec79df-7282-459b-9f45-01a3f66fbb7e
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:bob, format:json, prefix:fs subvolume evict, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, vol_name:cephfs) < ""
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1281be2a-d93e-437b-9b63-ac349c9cd9d6", "format": "json"}]: dispatch
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:1281be2a-d93e-437b-9b63-ac349c9cd9d6, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:1281be2a-d93e-437b-9b63-ac349c9cd9d6, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:29 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:05:29.871+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1281be2a-d93e-437b-9b63-ac349c9cd9d6' of type subvolume
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1281be2a-d93e-437b-9b63-ac349c9cd9d6' of type subvolume
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1281be2a-d93e-437b-9b63-ac349c9cd9d6", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1281be2a-d93e-437b-9b63-ac349c9cd9d6, vol_name:cephfs) < ""
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/1281be2a-d93e-437b-9b63-ac349c9cd9d6'' moved to trashcan
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:05:29 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1281be2a-d93e-437b-9b63-ac349c9cd9d6, vol_name:cephfs) < ""
Feb 01 10:05:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:29.954 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:30 np0005604215.localdomain podman[236852]: time="2026-02-01T10:05:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:05:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:05:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:05:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:05:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18344 "" "Go-http-client/1.1"
Feb 01 10:05:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v693: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 132 KiB/s wr, 6 op/s
Feb 01 10:05:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "auth_id": "bob", "format": "json"}]: dispatch
Feb 01 10:05:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"]} : dispatch
Feb 01 10:05:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 01 10:05:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"]} : dispatch
Feb 01 10:05:30 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth caps", "entity": "client.bob", "caps": ["mds", "allow rw path=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37", "osd", "allow rw pool=manila_data namespace=fsvolumens_1c2f0941-aab0-42d0-937e-94c942e5fb88"]}]': finished
Feb 01 10:05:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "auth_id": "bob", "format": "json"}]: dispatch
Feb 01 10:05:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1281be2a-d93e-437b-9b63-ac349c9cd9d6", "format": "json"}]: dispatch
Feb 01 10:05:30 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1281be2a-d93e-437b-9b63-ac349c9cd9d6", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:05:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:05:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:05:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:05:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:05:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:05:31 np0005604215.localdomain ceph-mon[298604]: pgmap v693: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 132 KiB/s wr, 6 op/s
Feb 01 10:05:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v694: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 190 KiB/s wr, 9 op/s
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "40dc17a8-88e2-443b-88e0-b305a0120dd3", "format": "json"}]: dispatch
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:40dc17a8-88e2-443b-88e0-b305a0120dd3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:40dc17a8-88e2-443b-88e0-b305a0120dd3, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '40dc17a8-88e2-443b-88e0-b305a0120dd3' of type subvolume
Feb 01 10:05:33 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:05:33.062+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '40dc17a8-88e2-443b-88e0-b305a0120dd3' of type subvolume
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "40dc17a8-88e2-443b-88e0-b305a0120dd3", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:40dc17a8-88e2-443b-88e0-b305a0120dd3, vol_name:cephfs) < ""
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/40dc17a8-88e2-443b-88e0-b305a0120dd3'' moved to trashcan
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:40dc17a8-88e2-443b-88e0-b305a0120dd3, vol_name:cephfs) < ""
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "bob", "format": "json"}]: dispatch
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_deauthorize(auth_id:bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:33.567 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.bob", "format": "json"} v 0)
Feb 01 10:05:33 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 01 10:05:33 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth rm", "entity": "client.bob"} v 0)
Feb 01 10:05:33 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_deauthorize(auth_id:bob, format:json, prefix:fs subvolume deauthorize, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "bob", "format": "json"}]: dispatch
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_evict(auth_id:bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict clients with auth_name=bob, client_metadata.root=/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88/53cdb057-d7f8-43f2-812c-305c99393a37
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_v1] evict: joined all
Feb 01 10:05:33 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_evict(auth_id:bob, format:json, prefix:fs subvolume evict, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:34 np0005604215.localdomain ceph-mon[298604]: pgmap v694: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 597 B/s rd, 190 KiB/s wr, 9 op/s
Feb 01 10:05:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "40dc17a8-88e2-443b-88e0-b305a0120dd3", "format": "json"}]: dispatch
Feb 01 10:05:34 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "40dc17a8-88e2-443b-88e0-b305a0120dd3", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:34 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Feb 01 10:05:34 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.bob", "format": "json"} : dispatch
Feb 01 10:05:34 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth rm", "entity": "client.bob"} : dispatch
Feb 01 10:05:34 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' cmd='[{"prefix": "auth rm", "entity": "client.bob"}]': finished
Feb 01 10:05:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v695: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 189 KiB/s wr, 8 op/s
Feb 01 10:05:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:05:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2584492261' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:05:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:05:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/2584492261' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:05:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:34.957 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume deauthorize", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "bob", "format": "json"}]: dispatch
Feb 01 10:05:35 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume evict", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "auth_id": "bob", "format": "json"}]: dispatch
Feb 01 10:05:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2584492261' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:05:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/2584492261' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:05:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e296 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:36 np0005604215.localdomain ceph-mon[298604]: pgmap v695: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 189 KiB/s wr, 8 op/s
Feb 01 10:05:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v696: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 130 KiB/s wr, 6 op/s
Feb 01 10:05:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:05:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:05:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:05:36 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:05:36 np0005604215.localdomain systemd[1]: tmp-crun.uPZfYD.mount: Deactivated successfully.
Feb 01 10:05:36 np0005604215.localdomain podman[318153]: 2026-02-01 10:05:36.829372763 +0000 UTC m=+0.080993554 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 10:05:36 np0005604215.localdomain podman[318153]: 2026-02-01 10:05:36.840784268 +0000 UTC m=+0.092405089 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 10:05:36 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:05:36 np0005604215.localdomain podman[318150]: 2026-02-01 10:05:36.891305685 +0000 UTC m=+0.147815878 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9/ubi-minimal, vcs-type=git, vendor=Red Hat, Inc., config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, build-date=2026-01-22T05:09:47Z, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, version=9.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, distribution-scope=public, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter)
Feb 01 10:05:36 np0005604215.localdomain podman[318151]: 2026-02-01 10:05:36.937519499 +0000 UTC m=+0.193319290 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Feb 01 10:05:36 np0005604215.localdomain podman[318150]: 2026-02-01 10:05:36.956255251 +0000 UTC m=+0.212765444 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.7, release=1769056855, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2026-01-22T05:09:47Z, vendor=Red Hat, Inc., name=ubi9/ubi-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, config_id=openstack_network_exporter, managed_by=edpm_ansible)
Feb 01 10:05:36 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:05:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "format": "json"}]: dispatch
Feb 01 10:05:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:36 np0005604215.localdomain podman[318151]: 2026-02-01 10:05:36.972643429 +0000 UTC m=+0.228443180 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Feb 01 10:05:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:36 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ee5830e4-c3f6-4299-9c44-15480a7cfa4f' of type subvolume
Feb 01 10:05:36 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:05:36.977+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume 'ee5830e4-c3f6-4299-9c44-15480a7cfa4f' of type subvolume
Feb 01 10:05:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, vol_name:cephfs) < ""
Feb 01 10:05:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/ee5830e4-c3f6-4299-9c44-15480a7cfa4f'' moved to trashcan
Feb 01 10:05:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:05:36 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:ee5830e4-c3f6-4299-9c44-15480a7cfa4f, vol_name:cephfs) < ""
Feb 01 10:05:36 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:05:37 np0005604215.localdomain podman[318152]: 2026-02-01 10:05:37.049155423 +0000 UTC m=+0.300032591 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Feb 01 10:05:37 np0005604215.localdomain podman[318152]: 2026-02-01 10:05:37.088673659 +0000 UTC m=+0.339550827 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:05:37 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:05:38 np0005604215.localdomain ceph-mon[298604]: pgmap v696: 177 pgs: 177 active+clean; 230 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 511 B/s rd, 130 KiB/s wr, 6 op/s
Feb 01 10:05:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "format": "json"}]: dispatch
Feb 01 10:05:38 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "ee5830e4-c3f6-4299-9c44-15480a7cfa4f", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v697: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 192 KiB/s wr, 10 op/s
Feb 01 10:05:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:38.602 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:38 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:05:38.832 259225 INFO neutron.agent.linux.ip_lib [None req-21546086-5e68-41c2-95df-a8701f6241d6 - - - - - -] Device tapd2a0f71b-44 cannot be used as it has no MAC address
Feb 01 10:05:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:38.852 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:38 np0005604215.localdomain kernel: device tapd2a0f71b-44 entered promiscuous mode
Feb 01 10:05:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:38.863 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:38 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:05:38Z|00264|binding|INFO|Claiming lport d2a0f71b-4467-42be-891d-ec006aa4434a for this chassis.
Feb 01 10:05:38 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:05:38Z|00265|binding|INFO|d2a0f71b-4467-42be-891d-ec006aa4434a: Claiming unknown
Feb 01 10:05:38 np0005604215.localdomain NetworkManager[5972]: <info>  [1769940338.8670] manager: (tapd2a0f71b-44): new Generic device (/org/freedesktop/NetworkManager/Devices/46)
Feb 01 10:05:38 np0005604215.localdomain systemd-udevd[318244]: Network interface NamePolicy= disabled on kernel command line.
Feb 01 10:05:38 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:38.878 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/16', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-33acccbf-fdce-4047-a62f-897349b76d78', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33acccbf-fdce-4047-a62f-897349b76d78', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6ca417b497f4e6882e6d3909dae11b9', 'neutron:revision_number': '1', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6efaebd7-2ff5-4535-bb80-29930fba1adb, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=d2a0f71b-4467-42be-891d-ec006aa4434a) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 10:05:38 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:38.879 158655 INFO neutron.agent.ovn.metadata.agent [-] Port d2a0f71b-4467-42be-891d-ec006aa4434a in datapath 33acccbf-fdce-4047-a62f-897349b76d78 bound to our chassis
Feb 01 10:05:38 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:38.880 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Port 874ad3b7-ea03-438f-9071-2da637345e05 IP addresses were not retrieved from the Port_Binding MAC column ['unknown'] _get_port_ips /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:536
Feb 01 10:05:38 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:38.880 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 33acccbf-fdce-4047-a62f-897349b76d78, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 10:05:38 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:38.881 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[5b0cb76b-8d52-4e49-907f-b00602a701f4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 10:05:38 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapd2a0f71b-44: No such device
Feb 01 10:05:38 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:05:38Z|00266|binding|INFO|Setting lport d2a0f71b-4467-42be-891d-ec006aa4434a ovn-installed in OVS
Feb 01 10:05:38 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:05:38Z|00267|binding|INFO|Setting lport d2a0f71b-4467-42be-891d-ec006aa4434a up in Southbound
Feb 01 10:05:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:38.902 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:38 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapd2a0f71b-44: No such device
Feb 01 10:05:38 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapd2a0f71b-44: No such device
Feb 01 10:05:38 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapd2a0f71b-44: No such device
Feb 01 10:05:38 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapd2a0f71b-44: No such device
Feb 01 10:05:38 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapd2a0f71b-44: No such device
Feb 01 10:05:38 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapd2a0f71b-44: No such device
Feb 01 10:05:38 np0005604215.localdomain virtnodedevd[224955]: ethtool ioctl error on tapd2a0f71b-44: No such device
Feb 01 10:05:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:38.940 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:38 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:38.968 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:39 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e297 e297: 6 total, 6 up, 6 in
Feb 01 10:05:39 np0005604215.localdomain ceph-mon[298604]: pgmap v697: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 1023 B/s rd, 192 KiB/s wr, 10 op/s
Feb 01 10:05:39 np0005604215.localdomain podman[318315]: 
Feb 01 10:05:39 np0005604215.localdomain podman[318315]: 2026-02-01 10:05:39.870431097 +0000 UTC m=+0.089301532 container create 4b4ed587ee645ca6a596ea803f2a53c3fea9006817704916ebc64addea7f4314 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33acccbf-fdce-4047-a62f-897349b76d78, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Feb 01 10:05:39 np0005604215.localdomain systemd[1]: Started libpod-conmon-4b4ed587ee645ca6a596ea803f2a53c3fea9006817704916ebc64addea7f4314.scope.
Feb 01 10:05:39 np0005604215.localdomain podman[318315]: 2026-02-01 10:05:39.825481783 +0000 UTC m=+0.044352268 image pull  quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified
Feb 01 10:05:39 np0005604215.localdomain systemd[1]: Started libcrun container.
Feb 01 10:05:39 np0005604215.localdomain kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c09f618ddecd5f6752fa51b48a7ea5a7239cb1ba93f57b916b2d542bccd407c1/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Feb 01 10:05:39 np0005604215.localdomain podman[318315]: 2026-02-01 10:05:39.949003915 +0000 UTC m=+0.167874360 container init 4b4ed587ee645ca6a596ea803f2a53c3fea9006817704916ebc64addea7f4314 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33acccbf-fdce-4047-a62f-897349b76d78, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 10:05:39 np0005604215.localdomain dnsmasq[318334]: started, version 2.85 cachesize 150
Feb 01 10:05:39 np0005604215.localdomain dnsmasq[318334]: DNS service limited to local subnets
Feb 01 10:05:39 np0005604215.localdomain dnsmasq[318334]: compile time options: IPv6 GNU-getopt DBus no-UBus no-i18n IDN2 DHCP DHCPv6 no-Lua TFTP no-conntrack ipset auth cryptohash DNSSEC loop-detect inotify dumpfile
Feb 01 10:05:39 np0005604215.localdomain dnsmasq[318334]: warning: no upstream servers configured
Feb 01 10:05:39 np0005604215.localdomain dnsmasq-dhcp[318334]: DHCP, static leases only on 10.100.0.0, lease time 1d
Feb 01 10:05:39 np0005604215.localdomain dnsmasq[318334]: read /var/lib/neutron/dhcp/33acccbf-fdce-4047-a62f-897349b76d78/addn_hosts - 0 addresses
Feb 01 10:05:39 np0005604215.localdomain dnsmasq-dhcp[318334]: read /var/lib/neutron/dhcp/33acccbf-fdce-4047-a62f-897349b76d78/host
Feb 01 10:05:39 np0005604215.localdomain dnsmasq-dhcp[318334]: read /var/lib/neutron/dhcp/33acccbf-fdce-4047-a62f-897349b76d78/opts
Feb 01 10:05:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:39.984 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:39 np0005604215.localdomain podman[318315]: 2026-02-01 10:05:39.986602223 +0000 UTC m=+0.205472668 container start 4b4ed587ee645ca6a596ea803f2a53c3fea9006817704916ebc64addea7f4314 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33acccbf-fdce-4047-a62f-897349b76d78, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Feb 01 10:05:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e297 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e298 e298: 6 total, 6 up, 6 in
Feb 01 10:05:40 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:05:40.230 259225 INFO neutron.agent.dhcp.agent [None req-42a447f9-d25e-479d-8ce2-2b8bd0236d90 - - - - - -] DHCP configuration for ports {'9614388b-eb52-4cd9-ad36-6e1d8e7f685b'} is completed
Feb 01 10:05:40 np0005604215.localdomain ceph-mon[298604]: osdmap e297: 6 total, 6 up, 6 in
Feb 01 10:05:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "format": "json"}]: dispatch
Feb 01 10:05:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:40 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:05:40.318+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1c2f0941-aab0-42d0-937e-94c942e5fb88' of type subvolume
Feb 01 10:05:40 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '1c2f0941-aab0-42d0-937e-94c942e5fb88' of type subvolume
Feb 01 10:05:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/1c2f0941-aab0-42d0-937e-94c942e5fb88'' moved to trashcan
Feb 01 10:05:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:05:40 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:1c2f0941-aab0-42d0-937e-94c942e5fb88, vol_name:cephfs) < ""
Feb 01 10:05:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v700: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 92 KiB/s wr, 5 op/s
Feb 01 10:05:40 np0005604215.localdomain systemd[1]: tmp-crun.c3EJHh.mount: Deactivated successfully.
Feb 01 10:05:41 np0005604215.localdomain ceph-mon[298604]: osdmap e298: 6 total, 6 up, 6 in
Feb 01 10:05:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "format": "json"}]: dispatch
Feb 01 10:05:41 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "1c2f0941-aab0-42d0-937e-94c942e5fb88", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:41 np0005604215.localdomain ceph-mon[298604]: pgmap v700: 177 pgs: 177 active+clean; 231 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 767 B/s rd, 92 KiB/s wr, 5 op/s
Feb 01 10:05:41 np0005604215.localdomain dnsmasq[318334]: exiting on receipt of SIGTERM
Feb 01 10:05:41 np0005604215.localdomain systemd[1]: libpod-4b4ed587ee645ca6a596ea803f2a53c3fea9006817704916ebc64addea7f4314.scope: Deactivated successfully.
Feb 01 10:05:41 np0005604215.localdomain podman[318350]: 2026-02-01 10:05:41.333140236 +0000 UTC m=+0.067541927 container kill 4b4ed587ee645ca6a596ea803f2a53c3fea9006817704916ebc64addea7f4314 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33acccbf-fdce-4047-a62f-897349b76d78, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:05:41 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:05:41Z|00268|binding|INFO|Removing iface tapd2a0f71b-44 ovn-installed in OVS
Feb 01 10:05:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:41.363 158655 WARNING neutron.agent.ovn.metadata.agent [-] Removing non-external type port 874ad3b7-ea03-438f-9071-2da637345e05 with type ""
Feb 01 10:05:41 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:05:41Z|00269|binding|INFO|Removing lport d2a0f71b-4467-42be-891d-ec006aa4434a ovn-installed in OVS
Feb 01 10:05:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:41.365 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched DELETE: PortBindingDeletedEvent(events=('delete',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'np0005604215.localdomain'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/16', 'neutron:device_id': 'dhcpd3c7262e-bf25-53c6-bfa9-f11e8686eb9b-33acccbf-fdce-4047-a62f-897349b76d78', 'neutron:device_owner': 'network:dhcp', 'neutron:mtu': '', 'neutron:network_name': 'neutron-33acccbf-fdce-4047-a62f-897349b76d78', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'b6ca417b497f4e6882e6d3909dae11b9', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'np0005604215.localdomain'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6efaebd7-2ff5-4535-bb80-29930fba1adb, chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f7fd10a29a0>], logical_port=d2a0f71b-4467-42be-891d-ec006aa4434a) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 10:05:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:41.367 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:41.369 158655 INFO neutron.agent.ovn.metadata.agent [-] Port d2a0f71b-4467-42be-891d-ec006aa4434a in datapath 33acccbf-fdce-4047-a62f-897349b76d78 unbound from our chassis
Feb 01 10:05:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:41.373 158655 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 33acccbf-fdce-4047-a62f-897349b76d78, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Feb 01 10:05:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:41.374 303130 DEBUG oslo.privsep.daemon [-] privsep: reply[a9ef3f44-1af5-46da-b151-9346c1f1aa9c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Feb 01 10:05:41 np0005604215.localdomain podman[318363]: 2026-02-01 10:05:41.40546254 +0000 UTC m=+0.059357023 container died 4b4ed587ee645ca6a596ea803f2a53c3fea9006817704916ebc64addea7f4314 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33acccbf-fdce-4047-a62f-897349b76d78, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:05:41 np0005604215.localdomain podman[318363]: 2026-02-01 10:05:41.561389809 +0000 UTC m=+0.215284232 container cleanup 4b4ed587ee645ca6a596ea803f2a53c3fea9006817704916ebc64addea7f4314 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33acccbf-fdce-4047-a62f-897349b76d78, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3)
Feb 01 10:05:41 np0005604215.localdomain systemd[1]: libpod-conmon-4b4ed587ee645ca6a596ea803f2a53c3fea9006817704916ebc64addea7f4314.scope: Deactivated successfully.
Feb 01 10:05:41 np0005604215.localdomain podman[318370]: 2026-02-01 10:05:41.604410604 +0000 UTC m=+0.247765869 container remove 4b4ed587ee645ca6a596ea803f2a53c3fea9006817704916ebc64addea7f4314 (image=quay.io/podified-antelope-centos9/openstack-neutron-dhcp-agent:current-podified, name=neutron-dnsmasq-qdhcp-33acccbf-fdce-4047-a62f-897349b76d78, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127)
Feb 01 10:05:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:41.615 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:41 np0005604215.localdomain kernel: device tapd2a0f71b-44 left promiscuous mode
Feb 01 10:05:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:41.633 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:41 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:05:41.653 259225 INFO neutron.agent.dhcp.agent [None req-13ccfa7a-2ea3-4c70-afdb-0f7b98556939 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 10:05:41 np0005604215.localdomain neutron_dhcp_agent[259221]: 2026-02-01 10:05:41.653 259225 INFO neutron.agent.dhcp.agent [None req-13ccfa7a-2ea3-4c70-afdb-0f7b98556939 - - - - - -] Network not present, action: clean_devices, action_kwargs: {}
Feb 01 10:05:41 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:41.676 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:41.783 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:05:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:41.784 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:05:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:05:41.784 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:05:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay-c09f618ddecd5f6752fa51b48a7ea5a7239cb1ba93f57b916b2d542bccd407c1-merged.mount: Deactivated successfully.
Feb 01 10:05:41 np0005604215.localdomain systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4b4ed587ee645ca6a596ea803f2a53c3fea9006817704916ebc64addea7f4314-userdata-shm.mount: Deactivated successfully.
Feb 01 10:05:41 np0005604215.localdomain systemd[1]: run-netns-qdhcp\x2d33acccbf\x2dfdce\x2d4047\x2da62f\x2d897349b76d78.mount: Deactivated successfully.
Feb 01 10:05:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v701: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 2.7 MiB/s wr, 62 op/s
Feb 01 10:05:43 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:43.629 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:44 np0005604215.localdomain ceph-mon[298604]: pgmap v701: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 2.7 MiB/s wr, 62 op/s
Feb 01 10:05:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v702: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 2.7 MiB/s wr, 62 op/s
Feb 01 10:05:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:44.986 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e298 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "29ed0300-6f83-4d5e-934d-f9bed65972ad", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:45 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:29ed0300-6f83-4d5e-934d-f9bed65972ad, vol_name:cephfs) < ""
Feb 01 10:05:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/29ed0300-6f83-4d5e-934d-f9bed65972ad/.meta.tmp'
Feb 01 10:05:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/29ed0300-6f83-4d5e-934d-f9bed65972ad/.meta.tmp' to config b'/volumes/_nogroup/29ed0300-6f83-4d5e-934d-f9bed65972ad/.meta'
Feb 01 10:05:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:29ed0300-6f83-4d5e-934d-f9bed65972ad, vol_name:cephfs) < ""
Feb 01 10:05:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "29ed0300-6f83-4d5e-934d-f9bed65972ad", "format": "json"}]: dispatch
Feb 01 10:05:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:29ed0300-6f83-4d5e-934d-f9bed65972ad, vol_name:cephfs) < ""
Feb 01 10:05:46 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:29ed0300-6f83-4d5e-934d-f9bed65972ad, vol_name:cephfs) < ""
Feb 01 10:05:46 np0005604215.localdomain ceph-mon[298604]: pgmap v702: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 40 KiB/s rd, 2.7 MiB/s wr, 62 op/s
Feb 01 10:05:46 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:05:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v703: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 2.6 MiB/s wr, 56 op/s
Feb 01 10:05:47 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 e299: 6 total, 6 up, 6 in
Feb 01 10:05:47 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "29ed0300-6f83-4d5e-934d-f9bed65972ad", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:47 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "29ed0300-6f83-4d5e-934d-f9bed65972ad", "format": "json"}]: dispatch
Feb 01 10:05:47 np0005604215.localdomain ceph-mon[298604]: osdmap e299: 6 total, 6 up, 6 in
Feb 01 10:05:47 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:05:47 np0005604215.localdomain podman[318399]: 2026-02-01 10:05:47.865335271 +0000 UTC m=+0.079202968 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ceilometer_agent_compute)
Feb 01 10:05:47 np0005604215.localdomain podman[318399]: 2026-02-01 10:05:47.880959447 +0000 UTC m=+0.094827124 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, managed_by=edpm_ansible, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2)
Feb 01 10:05:47 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:05:48 np0005604215.localdomain ceph-mon[298604]: pgmap v703: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 39 KiB/s rd, 2.6 MiB/s wr, 56 op/s
Feb 01 10:05:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v705: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 2.5 MiB/s wr, 56 op/s
Feb 01 10:05:48 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:48.668 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "065a1ec8-780b-4355-aa78-092fe74d1d95", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:065a1ec8-780b-4355-aa78-092fe74d1d95, vol_name:cephfs) < ""
Feb 01 10:05:49 np0005604215.localdomain ceph-mon[298604]: pgmap v705: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 38 KiB/s rd, 2.5 MiB/s wr, 56 op/s
Feb 01 10:05:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/065a1ec8-780b-4355-aa78-092fe74d1d95/.meta.tmp'
Feb 01 10:05:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/065a1ec8-780b-4355-aa78-092fe74d1d95/.meta.tmp' to config b'/volumes/_nogroup/065a1ec8-780b-4355-aa78-092fe74d1d95/.meta'
Feb 01 10:05:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:065a1ec8-780b-4355-aa78-092fe74d1d95, vol_name:cephfs) < ""
Feb 01 10:05:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "065a1ec8-780b-4355-aa78-092fe74d1d95", "format": "json"}]: dispatch
Feb 01 10:05:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:065a1ec8-780b-4355-aa78-092fe74d1d95, vol_name:cephfs) < ""
Feb 01 10:05:49 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:065a1ec8-780b-4355-aa78-092fe74d1d95, vol_name:cephfs) < ""
Feb 01 10:05:49 np0005604215.localdomain sudo[318420]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 10:05:49 np0005604215.localdomain sudo[318420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:05:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:05:49 np0005604215.localdomain sudo[318420]: pam_unix(sudo:session): session closed for user root
Feb 01 10:05:49 np0005604215.localdomain sudo[318443]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 check-host
Feb 01 10:05:49 np0005604215.localdomain sudo[318443]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:05:49 np0005604215.localdomain podman[318438]: 2026-02-01 10:05:49.675688047 +0000 UTC m=+0.078733124 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:05:49 np0005604215.localdomain podman[318438]: 2026-02-01 10:05:49.68576494 +0000 UTC m=+0.088810017 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:05:49 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:05:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:49.991 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain.devices.0}] v 0)
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604213.localdomain}] v 0)
Feb 01 10:05:50 np0005604215.localdomain sudo[318443]: pam_unix(sudo:session): session closed for user root
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain.devices.0}] v 0)
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain.devices.0}] v 0)
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604215.localdomain}] v 0)
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/host.np0005604212.localdomain}] v 0)
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "065a1ec8-780b-4355-aa78-092fe74d1d95", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "065a1ec8-780b-4355-aa78-092fe74d1d95", "format": "json"}]: dispatch
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:05:50 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:05:50 np0005604215.localdomain sudo[318499]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 10:05:50 np0005604215.localdomain sudo[318499]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:05:50 np0005604215.localdomain sudo[318499]: pam_unix(sudo:session): session closed for user root
Feb 01 10:05:50 np0005604215.localdomain sudo[318517]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 10:05:50 np0005604215.localdomain sudo[318517]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:05:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v706: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 2.1 MiB/s wr, 47 op/s
Feb 01 10:05:51 np0005604215.localdomain sudo[318517]: pam_unix(sudo:session): session closed for user root
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 10:05:51 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev c95f1f35-5299-4082-b400-b5fd30318b3b (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:05:51 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev c95f1f35-5299-4082-b400-b5fd30318b3b (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:05:51 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event c95f1f35-5299-4082-b400-b5fd30318b3b (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: pgmap v706: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 31 KiB/s rd, 2.1 MiB/s wr, 47 op/s
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:05:51 np0005604215.localdomain sudo[318568]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 10:05:51 np0005604215.localdomain sudo[318568]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:05:51 np0005604215.localdomain sudo[318568]: pam_unix(sudo:session): session closed for user root
Feb 01 10:05:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:05:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:05:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:05:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:05:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:05:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:05:51 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 10:05:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 10:05:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "065a1ec8-780b-4355-aa78-092fe74d1d95", "format": "json"}]: dispatch
Feb 01 10:05:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:065a1ec8-780b-4355-aa78-092fe74d1d95, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:065a1ec8-780b-4355-aa78-092fe74d1d95, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:52 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:05:52.545+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '065a1ec8-780b-4355-aa78-092fe74d1d95' of type subvolume
Feb 01 10:05:52 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '065a1ec8-780b-4355-aa78-092fe74d1d95' of type subvolume
Feb 01 10:05:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "065a1ec8-780b-4355-aa78-092fe74d1d95", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:065a1ec8-780b-4355-aa78-092fe74d1d95, vol_name:cephfs) < ""
Feb 01 10:05:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/065a1ec8-780b-4355-aa78-092fe74d1d95'' moved to trashcan
Feb 01 10:05:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:05:52 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:065a1ec8-780b-4355-aa78-092fe74d1d95, vol_name:cephfs) < ""
Feb 01 10:05:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v707: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 54 KiB/s wr, 3 op/s
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #52. Immutable memtables: 0.
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:52.912910) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 29] Flushing memtable with next log file: 52
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940352912952, "job": 29, "event": "flush_started", "num_memtables": 1, "num_entries": 1493, "num_deletes": 253, "total_data_size": 1788539, "memory_usage": 1823576, "flush_reason": "Manual Compaction"}
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 29] Level-0 flush table #53: started
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940352923347, "cf_name": "default", "job": 29, "event": "table_file_creation", "file_number": 53, "file_size": 1146304, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 32953, "largest_seqno": 34441, "table_properties": {"data_size": 1140040, "index_size": 3350, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 1861, "raw_key_size": 15788, "raw_average_key_size": 21, "raw_value_size": 1126669, "raw_average_value_size": 1560, "num_data_blocks": 141, "num_entries": 722, "num_filter_entries": 722, "num_deletions": 253, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769940287, "oldest_key_time": 1769940287, "file_creation_time": 1769940352, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 53, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 29] Flush lasted 10484 microseconds, and 4446 cpu microseconds.
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:52.923393) [db/flush_job.cc:967] [default] [JOB 29] Level-0 flush table #53: 1146304 bytes OK
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:52.923417) [db/memtable_list.cc:519] [default] Level-0 commit table #53 started
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:52.926040) [db/memtable_list.cc:722] [default] Level-0 commit table #53: memtable #1 done
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:52.926061) EVENT_LOG_v1 {"time_micros": 1769940352926055, "job": 29, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:52.926083) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 29] Try to delete WAL files size 1781147, prev total WAL file size 1781147, number of live WAL files 2.
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000049.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:52.926956) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003132383031' seq:72057594037927935, type:22 .. '7061786F73003133303533' seq:0, type:0; will stop at (end)
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 30] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 29 Base level 0, inputs: [53(1119KB)], [51(21MB)]
Feb 01 10:05:52 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940352927038, "job": 30, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [53], "files_L6": [51], "score": -1, "input_data_size": 23967731, "oldest_snapshot_seqno": -1}
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 30] Generated table #54: 14587 keys, 22436462 bytes, temperature: kUnknown
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940353051459, "cf_name": "default", "job": 30, "event": "table_file_creation", "file_number": 54, "file_size": 22436462, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 22352342, "index_size": 46579, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 36485, "raw_key_size": 392401, "raw_average_key_size": 26, "raw_value_size": 22103756, "raw_average_value_size": 1515, "num_data_blocks": 1727, "num_entries": 14587, "num_filter_entries": 14587, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769940352, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 54, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:53.051959) [db/compaction/compaction_job.cc:1663] [default] [JOB 30] Compacted 1@0 + 1@6 files to L6 => 22436462 bytes
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:53.054972) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 192.3 rd, 180.0 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(1.1, 21.8 +0.0 blob) out(21.4 +0.0 blob), read-write-amplify(40.5) write-amplify(19.6) OK, records in: 15123, records dropped: 536 output_compression: NoCompression
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:53.055003) EVENT_LOG_v1 {"time_micros": 1769940353054990, "job": 30, "event": "compaction_finished", "compaction_time_micros": 124668, "compaction_time_cpu_micros": 58286, "output_level": 6, "num_output_files": 1, "total_output_size": 22436462, "num_input_records": 15123, "num_output_records": 14587, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000053.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940353055812, "job": 30, "event": "table_file_deletion", "file_number": 53}
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000051.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940353059360, "job": 30, "event": "table_file_deletion", "file_number": 51}
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:52.926809) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:53.059523) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:53.059532) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:53.059535) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:53.059538) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:05:53 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:05:53.059542) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:05:53 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:53.710 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:54 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "065a1ec8-780b-4355-aa78-092fe74d1d95", "format": "json"}]: dispatch
Feb 01 10:05:54 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "065a1ec8-780b-4355-aa78-092fe74d1d95", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:54 np0005604215.localdomain ceph-mon[298604]: pgmap v707: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 54 KiB/s wr, 3 op/s
Feb 01 10:05:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v708: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 54 KiB/s wr, 3 op/s
Feb 01 10:05:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:54.992 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:05:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "29ed0300-6f83-4d5e-934d-f9bed65972ad", "format": "json"}]: dispatch
Feb 01 10:05:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:29ed0300-6f83-4d5e-934d-f9bed65972ad, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:29ed0300-6f83-4d5e-934d-f9bed65972ad, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:05:55 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:05:55.766+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '29ed0300-6f83-4d5e-934d-f9bed65972ad' of type subvolume
Feb 01 10:05:55 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '29ed0300-6f83-4d5e-934d-f9bed65972ad' of type subvolume
Feb 01 10:05:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "29ed0300-6f83-4d5e-934d-f9bed65972ad", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:29ed0300-6f83-4d5e-934d-f9bed65972ad, vol_name:cephfs) < ""
Feb 01 10:05:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/29ed0300-6f83-4d5e-934d-f9bed65972ad'' moved to trashcan
Feb 01 10:05:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:05:55 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:29ed0300-6f83-4d5e-934d-f9bed65972ad, vol_name:cephfs) < ""
Feb 01 10:05:56 np0005604215.localdomain ceph-mon[298604]: pgmap v708: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 54 KiB/s wr, 3 op/s
Feb 01 10:05:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v709: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 54 KiB/s wr, 3 op/s
Feb 01 10:05:57 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "29ed0300-6f83-4d5e-934d-f9bed65972ad", "format": "json"}]: dispatch
Feb 01 10:05:57 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "29ed0300-6f83-4d5e-934d-f9bed65972ad", "force": true, "format": "json"}]: dispatch
Feb 01 10:05:57 np0005604215.localdomain ceph-mon[298604]: pgmap v709: 177 pgs: 177 active+clean; 232 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 204 B/s rd, 54 KiB/s wr, 3 op/s
Feb 01 10:05:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:58.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:05:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:58.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 10:05:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:58.102 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 10:05:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:58.120 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 10:05:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v710: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 358 B/s rd, 98 KiB/s wr, 4 op/s
Feb 01 10:05:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:58.747 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:05:59 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "ac2fcae5-84f4-4aa2-b0db-8c8658f5c9d2", "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:05:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolumegroup_create(format:json, group_name:ac2fcae5-84f4-4aa2-b0db-8c8658f5c9d2, mode:0755, prefix:fs subvolumegroup create, vol_name:cephfs) < ""
Feb 01 10:05:59 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolumegroup_create(format:json, group_name:ac2fcae5-84f4-4aa2-b0db-8c8658f5c9d2, mode:0755, prefix:fs subvolumegroup create, vol_name:cephfs) < ""
Feb 01 10:05:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:05:59.116 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:05:59 np0005604215.localdomain ceph-mon[298604]: pgmap v710: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 358 B/s rd, 98 KiB/s wr, 4 op/s
Feb 01 10:05:59 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "ac2fcae5-84f4-4aa2-b0db-8c8658f5c9d2", "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:06:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:00.027 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:00 np0005604215.localdomain podman[236852]: time="2026-02-01T10:06:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:06:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:06:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:06:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:06:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18348 "" "Go-http-client/1.1"
Feb 01 10:06:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:00.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v711: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 80 KiB/s wr, 2 op/s
Feb 01 10:06:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3499613286' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.119 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.119 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.120 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.120 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.120 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:06:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:06:01 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1457138988' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:06:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:06:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:06:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:06:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:06:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:06:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.580 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.459s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:06:01 np0005604215.localdomain ceph-mon[298604]: pgmap v711: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 80 KiB/s wr, 2 op/s
Feb 01 10:06:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2322596810' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:06:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1457138988' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.792 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.794 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11481MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.794 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.795 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.860 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.861 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 10:06:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:01.982 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:06:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "ac2fcae5-84f4-4aa2-b0db-8c8658f5c9d2", "force": true, "format": "json"}]: dispatch
Feb 01 10:06:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolumegroup_rm(force:True, format:json, group_name:ac2fcae5-84f4-4aa2-b0db-8c8658f5c9d2, prefix:fs subvolumegroup rm, vol_name:cephfs) < ""
Feb 01 10:06:02 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolumegroup_rm(force:True, format:json, group_name:ac2fcae5-84f4-4aa2-b0db-8c8658f5c9d2, prefix:fs subvolumegroup rm, vol_name:cephfs) < ""
Feb 01 10:06:02 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:06:02 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/758051834' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:06:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:02.506 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.525s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:06:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:02.512 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 10:06:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:02.533 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 10:06:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:02.535 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 10:06:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:02.535 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.741s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:06:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v712: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 100 KiB/s wr, 3 op/s
Feb 01 10:06:03 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "ac2fcae5-84f4-4aa2-b0db-8c8658f5c9d2", "force": true, "format": "json"}]: dispatch
Feb 01 10:06:03 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/758051834' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:06:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:03.802 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:04 np0005604215.localdomain ceph-mon[298604]: pgmap v712: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 100 KiB/s wr, 3 op/s
Feb 01 10:06:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:04.535 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:04.536 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:04.536 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:04.536 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 10:06:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v713: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 68 KiB/s wr, 2 op/s
Feb 01 10:06:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:05.034 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:05 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "a4d9ba32-76b0-4058-9528-d3cb71806502", "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:06:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolumegroup_create(format:json, group_name:a4d9ba32-76b0-4058-9528-d3cb71806502, mode:0755, prefix:fs subvolumegroup create, vol_name:cephfs) < ""
Feb 01 10:06:05 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolumegroup_create(format:json, group_name:a4d9ba32-76b0-4058-9528-d3cb71806502, mode:0755, prefix:fs subvolumegroup create, vol_name:cephfs) < ""
Feb 01 10:06:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:06.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:06 np0005604215.localdomain ceph-mon[298604]: pgmap v713: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 68 KiB/s wr, 2 op/s
Feb 01 10:06:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v714: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 68 KiB/s wr, 2 op/s
Feb 01 10:06:07 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup create", "vol_name": "cephfs", "group_name": "a4d9ba32-76b0-4058-9528-d3cb71806502", "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:06:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:06:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:06:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:06:07 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:06:07 np0005604215.localdomain systemd[1]: tmp-crun.pQQdLQ.mount: Deactivated successfully.
Feb 01 10:06:07 np0005604215.localdomain podman[318635]: 2026-02-01 10:06:07.892660079 +0000 UTC m=+0.098236129 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 10:06:07 np0005604215.localdomain podman[318631]: 2026-02-01 10:06:07.977134251 +0000 UTC m=+0.185549379 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Feb 01 10:06:07 np0005604215.localdomain podman[318635]: 2026-02-01 10:06:07.999772283 +0000 UTC m=+0.205348343 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:06:08 np0005604215.localdomain podman[318630]: 2026-02-01 10:06:07.853517845 +0000 UTC m=+0.069324292 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, org.opencontainers.image.created=2026-01-22T05:09:47Z, config_id=openstack_network_exporter, name=ubi9/ubi-minimal, release=1769056855, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, managed_by=edpm_ansible, version=9.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=)
Feb 01 10:06:08 np0005604215.localdomain podman[318631]: 2026-02-01 10:06:08.00964283 +0000 UTC m=+0.218057938 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:06:08 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:06:08 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:06:08 np0005604215.localdomain podman[318630]: 2026-02-01 10:06:08.090734536 +0000 UTC m=+0.306540993 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, container_name=openstack_network_exporter, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, name=ubi9/ubi-minimal, io.buildah.version=1.33.7, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=openstack_network_exporter, io.openshift.tags=minimal rhel9, architecture=x86_64, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 10:06:08 np0005604215.localdomain podman[318638]: 2026-02-01 10:06:08.098474456 +0000 UTC m=+0.297821413 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter)
Feb 01 10:06:08 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:06:08 np0005604215.localdomain podman[318638]: 2026-02-01 10:06:08.135629689 +0000 UTC m=+0.334976636 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 10:06:08 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:06:08 np0005604215.localdomain ceph-mon[298604]: pgmap v714: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 68 KiB/s wr, 2 op/s
Feb 01 10:06:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "a4d9ba32-76b0-4058-9528-d3cb71806502", "force": true, "format": "json"}]: dispatch
Feb 01 10:06:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolumegroup_rm(force:True, format:json, group_name:a4d9ba32-76b0-4058-9528-d3cb71806502, prefix:fs subvolumegroup rm, vol_name:cephfs) < ""
Feb 01 10:06:08 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolumegroup_rm(force:True, format:json, group_name:a4d9ba32-76b0-4058-9528-d3cb71806502, prefix:fs subvolumegroup rm, vol_name:cephfs) < ""
Feb 01 10:06:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v715: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 84 KiB/s wr, 3 op/s
Feb 01 10:06:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:08.804 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:09.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:09 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2063868740' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:06:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:10.068 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:10 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolumegroup rm", "vol_name": "cephfs", "group_name": "a4d9ba32-76b0-4058-9528-d3cb71806502", "force": true, "format": "json"}]: dispatch
Feb 01 10:06:10 np0005604215.localdomain ceph-mon[298604]: pgmap v715: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 426 B/s rd, 84 KiB/s wr, 3 op/s
Feb 01 10:06:10 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2000731624' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:06:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v716: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 36 KiB/s wr, 1 op/s
Feb 01 10:06:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:11.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:11 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:06:11.786 158655 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, ssl=[], options={'arp_ns_explicit_output': 'true', 'fdb_removal_limit': '0', 'ignore_lsp_down': 'false', 'mac_binding_removal_limit': '0', 'mac_prefix': '62:f5:f4', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '7a:c8:7b:0d:61:da'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Feb 01 10:06:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:11.787 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:11 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:06:11.788 158655 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Feb 01 10:06:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "35634281-ee3f-4e5c-8cf1-1c19c4d823ce", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:06:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:35634281-ee3f-4e5c-8cf1-1c19c4d823ce, vol_name:cephfs) < ""
Feb 01 10:06:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/35634281-ee3f-4e5c-8cf1-1c19c4d823ce/.meta.tmp'
Feb 01 10:06:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/35634281-ee3f-4e5c-8cf1-1c19c4d823ce/.meta.tmp' to config b'/volumes/_nogroup/35634281-ee3f-4e5c-8cf1-1c19c4d823ce/.meta'
Feb 01 10:06:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:35634281-ee3f-4e5c-8cf1-1c19c4d823ce, vol_name:cephfs) < ""
Feb 01 10:06:11 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "35634281-ee3f-4e5c-8cf1-1c19c4d823ce", "format": "json"}]: dispatch
Feb 01 10:06:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:35634281-ee3f-4e5c-8cf1-1c19c4d823ce, vol_name:cephfs) < ""
Feb 01 10:06:11 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:35634281-ee3f-4e5c-8cf1-1c19c4d823ce, vol_name:cephfs) < ""
Feb 01 10:06:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:12.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:12.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Feb 01 10:06:12 np0005604215.localdomain ceph-mon[298604]: pgmap v716: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 36 KiB/s wr, 1 op/s
Feb 01 10:06:12 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "35634281-ee3f-4e5c-8cf1-1c19c4d823ce", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:06:12 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "35634281-ee3f-4e5c-8cf1-1c19c4d823ce", "format": "json"}]: dispatch
Feb 01 10:06:12 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:06:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v717: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 53 KiB/s wr, 2 op/s
Feb 01 10:06:13 np0005604215.localdomain ceph-mon[298604]: pgmap v717: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 53 KiB/s wr, 2 op/s
Feb 01 10:06:13 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:13.838 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v718: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 33 KiB/s wr, 1 op/s
Feb 01 10:06:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:15.069 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "35634281-ee3f-4e5c-8cf1-1c19c4d823ce", "format": "json"}]: dispatch
Feb 01 10:06:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:35634281-ee3f-4e5c-8cf1-1c19c4d823ce, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:06:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:35634281-ee3f-4e5c-8cf1-1c19c4d823ce, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:06:15 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:06:15.242+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '35634281-ee3f-4e5c-8cf1-1c19c4d823ce' of type subvolume
Feb 01 10:06:15 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '35634281-ee3f-4e5c-8cf1-1c19c4d823ce' of type subvolume
Feb 01 10:06:15 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "35634281-ee3f-4e5c-8cf1-1c19c4d823ce", "force": true, "format": "json"}]: dispatch
Feb 01 10:06:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:35634281-ee3f-4e5c-8cf1-1c19c4d823ce, vol_name:cephfs) < ""
Feb 01 10:06:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/35634281-ee3f-4e5c-8cf1-1c19c4d823ce'' moved to trashcan
Feb 01 10:06:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:06:15 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:35634281-ee3f-4e5c-8cf1-1c19c4d823ce, vol_name:cephfs) < ""
Feb 01 10:06:15 np0005604215.localdomain ceph-mon[298604]: pgmap v718: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 33 KiB/s wr, 1 op/s
Feb 01 10:06:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "35634281-ee3f-4e5c-8cf1-1c19c4d823ce", "format": "json"}]: dispatch
Feb 01 10:06:15 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "35634281-ee3f-4e5c-8cf1-1c19c4d823ce", "force": true, "format": "json"}]: dispatch
Feb 01 10:06:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v719: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 33 KiB/s wr, 1 op/s
Feb 01 10:06:17 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:06:17.791 158655 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=f18e6148-4a7e-452d-80cb-72c86b59e439, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Feb 01 10:06:17 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:06:17Z|00270|memory_trim|INFO|Detected inactivity (last active 30001 ms ago): trimming memory
Feb 01 10:06:18 np0005604215.localdomain ceph-mon[298604]: pgmap v719: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 33 KiB/s wr, 1 op/s
Feb 01 10:06:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:06:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb, vol_name:cephfs) < ""
Feb 01 10:06:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] wrote 155 bytes to config b'/volumes/_nogroup/5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb/.meta.tmp'
Feb 01 10:06:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.metadata_manager] Renamed b'/volumes/_nogroup/5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb/.meta.tmp' to config b'/volumes/_nogroup/5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb/.meta'
Feb 01 10:06:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_create(format:json, mode:0755, namespace_isolated:True, prefix:fs subvolume create, size:1073741824, sub_name:5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb, vol_name:cephfs) < ""
Feb 01 10:06:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb", "format": "json"}]: dispatch
Feb 01 10:06:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb, vol_name:cephfs) < ""
Feb 01 10:06:18 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_getpath(format:json, prefix:fs subvolume getpath, sub_name:5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb, vol_name:cephfs) < ""
Feb 01 10:06:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v720: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 54 KiB/s wr, 2 op/s
Feb 01 10:06:18 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:06:18 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:18.841 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:18 np0005604215.localdomain podman[318717]: 2026-02-01 10:06:18.867798879 +0000 UTC m=+0.083668237 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:06:18 np0005604215.localdomain podman[318717]: 2026-02-01 10:06:18.880253226 +0000 UTC m=+0.096122614 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Feb 01 10:06:18 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:06:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:19.113 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:19.113 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Feb 01 10:06:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:19.129 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Feb 01 10:06:19 np0005604215.localdomain ceph-mon[298604]: from='client.15654 172.18.0.34:0/4105879027' entity='client.openstack' cmd={"prefix": "mon dump", "format": "json"} : dispatch
Feb 01 10:06:19 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:06:19 np0005604215.localdomain systemd[1]: tmp-crun.SxjE92.mount: Deactivated successfully.
Feb 01 10:06:19 np0005604215.localdomain podman[318736]: 2026-02-01 10:06:19.862304119 +0000 UTC m=+0.078429025 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 10:06:19 np0005604215.localdomain podman[318736]: 2026-02-01 10:06:19.900963039 +0000 UTC m=+0.117087985 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 10:06:19 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:06:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:20.092 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:20.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:20 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume create", "vol_name": "cephfs", "sub_name": "5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb", "size": 1073741824, "namespace_isolated": true, "mode": "0755", "format": "json"}]: dispatch
Feb 01 10:06:20 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume getpath", "vol_name": "cephfs", "sub_name": "5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb", "format": "json"}]: dispatch
Feb 01 10:06:20 np0005604215.localdomain ceph-mon[298604]: pgmap v720: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 54 KiB/s wr, 2 op/s
Feb 01 10:06:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v721: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 38 KiB/s wr, 1 op/s
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_10:06:21
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['backups', 'volumes', 'images', 'manila_metadata', 'vms', '.mgr', 'manila_data']
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb", "format": "json"}]: dispatch
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_clone_status(clone_name:5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_clone_status(clone_name:5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb, format:json, prefix:fs clone status, vol_name:cephfs) < ""
Feb 01 10:06:21 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:06:21.842+0000 7f93ec23e640 -1 mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb' of type subvolume
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported operation 'clone-status' is not allowed on subvolume '5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb' of type subvolume
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb", "force": true, "format": "json"}]: dispatch
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Starting _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb, vol_name:cephfs) < ""
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.00291850964893914 of space, bias 4.0, pg target 2.3231336805555554 quantized to 16 (current 16)
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.operations.versions.subvolume_base] subvolume path 'b'/volumes/_nogroup/5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb'' moved to trashcan
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.fs.async_job] queuing job for volume 'cephfs'
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO volumes.module] Finishing _cmd_fs_subvolume_rm(force:True, format:json, prefix:fs subvolume rm, sub_name:5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb, vol_name:cephfs) < ""
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:06:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:06:22 np0005604215.localdomain ceph-mon[298604]: pgmap v721: 177 pgs: 177 active+clean; 233 MiB data, 1.3 GiB used, 41 GiB / 42 GiB avail; 85 B/s rd, 38 KiB/s wr, 1 op/s
Feb 01 10:06:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v722: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 78 KiB/s wr, 3 op/s
Feb 01 10:06:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs clone status", "vol_name": "cephfs", "clone_name": "5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb", "format": "json"}]: dispatch
Feb 01 10:06:23 np0005604215.localdomain ceph-mon[298604]: from='client.15654 -' entity='client.openstack' cmd=[{"prefix": "fs subvolume rm", "vol_name": "cephfs", "sub_name": "5e0c14be-c6b0-4cd5-a1d1-18f3baab03fb", "force": true, "format": "json"}]: dispatch
Feb 01 10:06:23 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:23.885 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:24 np0005604215.localdomain ceph-mon[298604]: pgmap v722: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 255 B/s rd, 78 KiB/s wr, 3 op/s
Feb 01 10:06:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v723: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 61 KiB/s wr, 2 op/s
Feb 01 10:06:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:25.096 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:26 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:26.001 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:26 np0005604215.localdomain ceph-mon[298604]: pgmap v723: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 61 KiB/s wr, 2 op/s
Feb 01 10:06:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v724: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 61 KiB/s wr, 2 op/s
Feb 01 10:06:28 np0005604215.localdomain ceph-mon[298604]: pgmap v724: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 61 KiB/s wr, 2 op/s
Feb 01 10:06:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v725: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 75 KiB/s wr, 3 op/s
Feb 01 10:06:28 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:28.918 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:30 np0005604215.localdomain podman[236852]: time="2026-02-01T10:06:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:06:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:06:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:06:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:06:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18337 "" "Go-http-client/1.1"
Feb 01 10:06:30 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:30.125 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:30 np0005604215.localdomain ceph-mon[298604]: pgmap v725: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 75 KiB/s wr, 3 op/s
Feb 01 10:06:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v726: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 54 KiB/s wr, 2 op/s
Feb 01 10:06:31 np0005604215.localdomain ceph-mon[298604]: pgmap v726: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 54 KiB/s wr, 2 op/s
Feb 01 10:06:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:06:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:06:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:06:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:06:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:06:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:06:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v727: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 58 KiB/s wr, 3 op/s
Feb 01 10:06:33 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:33.976 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:34 np0005604215.localdomain ceph-mon[298604]: pgmap v727: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 341 B/s rd, 58 KiB/s wr, 3 op/s
Feb 01 10:06:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v728: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 18 KiB/s wr, 1 op/s
Feb 01 10:06:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"df", "format":"json"} v 0)
Feb 01 10:06:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3217422935' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:06:34 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} v 0)
Feb 01 10:06:34 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.32:0/3217422935' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:06:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:35.126 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3217422935' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:06:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/3217422935' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:06:36 np0005604215.localdomain ceph-mon[298604]: pgmap v728: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 18 KiB/s wr, 1 op/s
Feb 01 10:06:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v729: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 18 KiB/s wr, 1 op/s
Feb 01 10:06:38 np0005604215.localdomain ceph-mon[298604]: pgmap v729: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 18 KiB/s wr, 1 op/s
Feb 01 10:06:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v730: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 18 KiB/s wr, 1 op/s
Feb 01 10:06:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:06:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:06:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:06:38 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:06:38 np0005604215.localdomain podman[318758]: 2026-02-01 10:06:38.879486291 +0000 UTC m=+0.086237616 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, maintainer=Red Hat, Inc., cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1769056855, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, version=9.7, architecture=x86_64, managed_by=edpm_ansible, distribution-scope=public, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 10:06:38 np0005604215.localdomain podman[318758]: 2026-02-01 10:06:38.89073739 +0000 UTC m=+0.097488725 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.7, maintainer=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, name=ubi9/ubi-minimal, config_id=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, vcs-type=git, release=1769056855, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., container_name=openstack_network_exporter, cpe=cpe:/a:redhat:enterprise_linux:9::appstream)
Feb 01 10:06:38 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:06:38 np0005604215.localdomain podman[318760]: 2026-02-01 10:06:38.93873779 +0000 UTC m=+0.137980643 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Feb 01 10:06:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:39.003 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:39 np0005604215.localdomain podman[318759]: 2026-02-01 10:06:39.021471547 +0000 UTC m=+0.225122316 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 10:06:39 np0005604215.localdomain podman[318760]: 2026-02-01 10:06:39.045687369 +0000 UTC m=+0.244930212 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Feb 01 10:06:39 np0005604215.localdomain podman[318766]: 2026-02-01 10:06:39.056713391 +0000 UTC m=+0.251735642 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 10:06:39 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:06:39 np0005604215.localdomain podman[318766]: 2026-02-01 10:06:39.067741643 +0000 UTC m=+0.262763904 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 10:06:39 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:06:39 np0005604215.localdomain podman[318759]: 2026-02-01 10:06:39.150548002 +0000 UTC m=+0.354198761 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Feb 01 10:06:39 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:06:39 np0005604215.localdomain systemd[1]: tmp-crun.gV4HZP.mount: Deactivated successfully.
Feb 01 10:06:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:40.161 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:40 np0005604215.localdomain ceph-mon[298604]: pgmap v730: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 170 B/s rd, 18 KiB/s wr, 1 op/s
Feb 01 10:06:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v731: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 4.2 KiB/s wr, 0 op/s
Feb 01 10:06:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:06:41.784 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:06:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:06:41.784 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:06:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:06:41.785 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:06:42 np0005604215.localdomain ceph-mon[298604]: pgmap v731: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 4.2 KiB/s wr, 0 op/s
Feb 01 10:06:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v732: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 4.2 KiB/s wr, 0 op/s
Feb 01 10:06:43 np0005604215.localdomain ceph-mon[298604]: pgmap v732: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 4.2 KiB/s wr, 0 op/s
Feb 01 10:06:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:44.041 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v733: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:45.163 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:45 np0005604215.localdomain ceph-mon[298604]: pgmap v733: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v734: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:48 np0005604215.localdomain ceph-mon[298604]: pgmap v734: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v735: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:49.045 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:49 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:06:49 np0005604215.localdomain podman[318838]: 2026-02-01 10:06:49.867480319 +0000 UTC m=+0.084858255 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_id=ceilometer_agent_compute, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Feb 01 10:06:49 np0005604215.localdomain podman[318838]: 2026-02-01 10:06:49.877891391 +0000 UTC m=+0.095269317 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute)
Feb 01 10:06:49 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:06:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:50.195 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:50 np0005604215.localdomain ceph-mon[298604]: pgmap v735: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v736: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:06:50 np0005604215.localdomain podman[318857]: 2026-02-01 10:06:50.861799153 +0000 UTC m=+0.077885558 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:06:50 np0005604215.localdomain podman[318857]: 2026-02-01 10:06:50.899922966 +0000 UTC m=+0.116009331 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 10:06:50 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:06:51 np0005604215.localdomain sudo[318881]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 10:06:51 np0005604215.localdomain sudo[318881]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:06:51 np0005604215.localdomain sudo[318881]: pam_unix(sudo:session): session closed for user root
Feb 01 10:06:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:06:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:06:51 np0005604215.localdomain sudo[318899]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 10:06:51 np0005604215.localdomain sudo[318899]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:06:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:06:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:06:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:06:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:06:52 np0005604215.localdomain sudo[318899]: pam_unix(sudo:session): session closed for user root
Feb 01 10:06:52 np0005604215.localdomain ceph-mon[298604]: pgmap v736: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 10:06:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:06:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 10:06:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:06:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 10:06:52 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev b3f6b11c-ac50-459f-a02b-938037b5c87f (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:06:52 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev b3f6b11c-ac50-459f-a02b-938037b5c87f (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:06:52 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event b3f6b11c-ac50-459f-a02b-938037b5c87f (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 10:06:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 10:06:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:06:52 np0005604215.localdomain sudo[318949]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 10:06:52 np0005604215.localdomain sudo[318949]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:06:52 np0005604215.localdomain sudo[318949]: pam_unix(sudo:session): session closed for user root
Feb 01 10:06:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v737: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:53 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:06:53 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:06:53 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:06:53 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:06:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:54.086 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:54 np0005604215.localdomain ceph-mon[298604]: pgmap v737: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v738: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:55.215 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:06:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v739: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:56 np0005604215.localdomain ceph-mon[298604]: pgmap v738: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:56 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 10:06:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 10:06:57 np0005604215.localdomain ceph-mon[298604]: pgmap v739: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:06:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:58.102 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:06:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:58.103 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 10:06:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:58.103 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 10:06:58 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:58.123 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 10:06:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v740: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:06:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:06:59.135 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:06:59 np0005604215.localdomain ceph-mon[298604]: pgmap v740: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:00 np0005604215.localdomain podman[236852]: time="2026-02-01T10:07:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:07:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:07:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:07:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:07:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18346 "" "Go-http-client/1.1"
Feb 01 10:07:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:00.262 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v741: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:01.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:07:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:01.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:07:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:07:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:07:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:07:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:07:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:07:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:07:01 np0005604215.localdomain ceph-mon[298604]: pgmap v741: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:02.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:07:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:02.125 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:07:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:02.126 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:07:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:02.127 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:07:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:02.127 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 10:07:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:02.128 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:07:02 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:07:02 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3163310464' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:07:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:02.585 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.458s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:07:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v742: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:02.795 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 10:07:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:02.797 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11467MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": 
null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 10:07:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:02.797 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:07:02 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:02.798 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:07:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:03.106 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 10:07:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:03.107 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 10:07:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:03.124 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:07:03 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/951164908' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:07:03 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3163310464' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.410 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.414 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.414 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.414 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.414 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.414 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.415 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.415 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.415 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.415 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.415 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.416 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.416 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.416 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:07:03.416 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:07:03 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:07:03 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2249660806' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:07:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:03.526 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.401s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:07:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:03.532 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 10:07:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:03.546 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 10:07:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:03.549 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 10:07:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:03.549 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.751s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:07:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:04.184 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:04 np0005604215.localdomain ceph-mon[298604]: pgmap v742: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:04 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2249660806' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:07:04 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3994439579' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:07:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v743: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:05.306 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:05.550 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:07:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:05.551 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:07:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:06.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:07:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:06.100 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 10:07:06 np0005604215.localdomain ceph-mon[298604]: pgmap v743: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v744: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:07.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:07:08 np0005604215.localdomain ceph-mon[298604]: pgmap v744: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v745: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:09.219 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:09 np0005604215.localdomain ceph-mon[298604]: pgmap v745: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:07:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:07:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:07:09 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:07:09 np0005604215.localdomain podman[319011]: 2026-02-01 10:07:09.884914 +0000 UTC m=+0.091086757 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, vcs-type=git, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2026-01-22T05:09:47Z, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, io.buildah.version=1.33.7, config_id=openstack_network_exporter, version=9.7, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9/ubi-minimal, release=1769056855, com.redhat.component=ubi9-minimal-container)
Feb 01 10:07:09 np0005604215.localdomain podman[319012]: 2026-02-01 10:07:09.92649983 +0000 UTC m=+0.131744249 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Feb 01 10:07:09 np0005604215.localdomain podman[319012]: 2026-02-01 10:07:09.963694904 +0000 UTC m=+0.168939283 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Feb 01 10:07:09 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:07:09 np0005604215.localdomain podman[319011]: 2026-02-01 10:07:09.983398656 +0000 UTC m=+0.189571443 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, config_id=openstack_network_exporter, managed_by=edpm_ansible, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, distribution-scope=public, name=ubi9/ubi-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.created=2026-01-22T05:09:47Z, version=9.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, release=1769056855, vendor=Red Hat, Inc., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Feb 01 10:07:10 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:07:10 np0005604215.localdomain podman[319013]: 2026-02-01 10:07:10.050343113 +0000 UTC m=+0.252558598 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Feb 01 10:07:10 np0005604215.localdomain podman[319014]: 2026-02-01 10:07:10.098232149 +0000 UTC m=+0.295978965 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 10:07:10 np0005604215.localdomain podman[319013]: 2026-02-01 10:07:10.118469368 +0000 UTC m=+0.320684883 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20260127, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Feb 01 10:07:10 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:07:10 np0005604215.localdomain podman[319014]: 2026-02-01 10:07:10.134664789 +0000 UTC m=+0.332411575 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 10:07:10 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:07:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:10.336 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v746: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:10 np0005604215.localdomain systemd[1]: tmp-crun.PxHjc8.mount: Deactivated successfully.
Feb 01 10:07:11 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:11.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:07:11 np0005604215.localdomain ceph-mon[298604]: pgmap v746: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:11 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3664023745' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:07:11 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1623814643' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:07:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v747: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:14.262 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:14 np0005604215.localdomain ceph-mon[298604]: pgmap v747: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v748: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:15.375 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:16 np0005604215.localdomain ceph-mon[298604]: pgmap v748: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v749: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:18 np0005604215.localdomain ceph-mon[298604]: pgmap v749: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v750: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:19.297 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:19 np0005604215.localdomain ceph-mon[298604]: pgmap v750: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:19.972 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:07:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:20.377 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v751: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:20 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:07:20 np0005604215.localdomain podman[319095]: 2026-02-01 10:07:20.865486534 +0000 UTC m=+0.080837827 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, config_id=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.schema-version=1.0)
Feb 01 10:07:20 np0005604215.localdomain podman[319095]: 2026-02-01 10:07:20.872901695 +0000 UTC m=+0.088252968 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:07:20 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_10:07:21
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['.mgr', 'volumes', 'vms', 'backups', 'manila_metadata', 'images', 'manila_data']
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:07:21 np0005604215.localdomain ceph-mon[298604]: pgmap v751: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 10:07:21 np0005604215.localdomain podman[319113]: 2026-02-01 10:07:21.898326702 +0000 UTC m=+0.078129213 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter)
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.002965129466778336 of space, bias 4.0, pg target 2.3602430555555554 quantized to 16 (current 16)
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 10:07:21 np0005604215.localdomain podman[319113]: 2026-02-01 10:07:21.9137087 +0000 UTC m=+0.093511211 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:07:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:07:21 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:07:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v752: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:23 np0005604215.localdomain sshd[319136]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 10:07:23 np0005604215.localdomain sshd[319136]: Accepted publickey for zuul from 38.102.83.114 port 50728 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 10:07:23 np0005604215.localdomain systemd-logind[761]: New session 78 of user zuul.
Feb 01 10:07:23 np0005604215.localdomain systemd[1]: Started Session 78 of User zuul.
Feb 01 10:07:23 np0005604215.localdomain sshd[319136]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 10:07:23 np0005604215.localdomain sudo[319156]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/sh -c echo BECOME-SUCCESS-rpxrcuewuxwqqofjittmgjhurlixvnwe ; /usr/bin/python3
Feb 01 10:07:23 np0005604215.localdomain sudo[319156]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 10:07:23 np0005604215.localdomain python3[319158]: ansible-ansible.legacy.command Invoked with _raw_params=subscription-manager unregister _uses_shell=True zuul_log_id=fa163ef9-e89a-3279-acd3-00000000000c-1-overcloudnovacompute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Feb 01 10:07:24 np0005604215.localdomain sudo[319156]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:24 np0005604215.localdomain ceph-mon[298604]: pgmap v752: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:24.326 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v753: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:25.413 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:26 np0005604215.localdomain ceph-mon[298604]: pgmap v753: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v754: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:28 np0005604215.localdomain ceph-mon[298604]: pgmap v754: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:28 np0005604215.localdomain sshd[319136]: pam_unix(sshd:session): session closed for user zuul
Feb 01 10:07:28 np0005604215.localdomain systemd[1]: session-78.scope: Deactivated successfully.
Feb 01 10:07:28 np0005604215.localdomain systemd-logind[761]: Session 78 logged out. Waiting for processes to exit.
Feb 01 10:07:28 np0005604215.localdomain systemd-logind[761]: Removed session 78.
Feb 01 10:07:28 np0005604215.localdomain ovn_controller[152787]: 2026-02-01T10:07:28Z|00271|memory_trim|INFO|Detected inactivity (last active 30008 ms ago): trimming memory
Feb 01 10:07:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v755: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:29.374 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:30 np0005604215.localdomain podman[236852]: time="2026-02-01T10:07:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:07:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:07:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:07:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:07:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18350 "" "Go-http-client/1.1"
Feb 01 10:07:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:30 np0005604215.localdomain ceph-mon[298604]: pgmap v755: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:30 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:30.449 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v756: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:07:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:07:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:07:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:07:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:07:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:07:32 np0005604215.localdomain ceph-mon[298604]: pgmap v756: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v757: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:34 np0005604215.localdomain ceph-mon[298604]: pgmap v757: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:34.404 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v758: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:35 np0005604215.localdomain ceph-mon[298604]: pgmap v758: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/246910684' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:07:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/246910684' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:07:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:35.485 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v759: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:38 np0005604215.localdomain ceph-mon[298604]: pgmap v759: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v760: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:39.440 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:40 np0005604215.localdomain ceph-mon[298604]: pgmap v760: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:40.522 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v761: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:07:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:07:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:07:40 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:07:40 np0005604215.localdomain systemd[1]: tmp-crun.uqqYQS.mount: Deactivated successfully.
Feb 01 10:07:40 np0005604215.localdomain podman[319168]: 2026-02-01 10:07:40.885201663 +0000 UTC m=+0.084536602 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image)
Feb 01 10:07:40 np0005604215.localdomain podman[319161]: 2026-02-01 10:07:40.859177474 +0000 UTC m=+0.069249227 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, release=1769056855, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, vendor=Red Hat, Inc., managed_by=edpm_ansible, version=9.7, io.openshift.expose-services=, distribution-scope=public, io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2026-01-22T05:09:47Z, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, name=ubi9/ubi-minimal, io.buildah.version=1.33.7)
Feb 01 10:07:40 np0005604215.localdomain podman[319169]: 2026-02-01 10:07:40.927674175 +0000 UTC m=+0.127375015 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 10:07:40 np0005604215.localdomain podman[319169]: 2026-02-01 10:07:40.933219558 +0000 UTC m=+0.132920328 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Feb 01 10:07:40 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:07:40 np0005604215.localdomain podman[319161]: 2026-02-01 10:07:40.94771913 +0000 UTC m=+0.157790923 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, name=ubi9/ubi-minimal, build-date=2026-01-22T05:09:47Z, io.openshift.expose-services=, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, org.opencontainers.image.created=2026-01-22T05:09:47Z, release=1769056855, version=9.7, vcs-type=git, distribution-scope=public, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., config_id=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 10:07:40 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:07:41 np0005604215.localdomain podman[319162]: 2026-02-01 10:07:41.028718191 +0000 UTC m=+0.231944811 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20260127, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true)
Feb 01 10:07:41 np0005604215.localdomain podman[319162]: 2026-02-01 10:07:41.037840105 +0000 UTC m=+0.241066805 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team)
Feb 01 10:07:41 np0005604215.localdomain podman[319168]: 2026-02-01 10:07:41.050127327 +0000 UTC m=+0.249462326 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Feb 01 10:07:41 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:07:41 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:07:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:07:41.785 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:07:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:07:41.786 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:07:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:07:41.787 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:07:42 np0005604215.localdomain ceph-mon[298604]: pgmap v761: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v762: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:44 np0005604215.localdomain ceph-mon[298604]: pgmap v762: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:44.481 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v763: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:45 np0005604215.localdomain ceph-mon[298604]: pgmap v763: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:45.567 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v764: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:46 np0005604215.localdomain sshd[319242]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 10:07:46 np0005604215.localdomain sshd[319242]: Accepted publickey for zuul from 38.102.83.114 port 34760 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 10:07:46 np0005604215.localdomain systemd-logind[761]: New session 79 of user zuul.
Feb 01 10:07:46 np0005604215.localdomain systemd[1]: Started Session 79 of User zuul.
Feb 01 10:07:46 np0005604215.localdomain sshd[319242]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 10:07:46 np0005604215.localdomain sudo[319246]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /var/log
Feb 01 10:07:46 np0005604215.localdomain sudo[319246]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 10:07:47 np0005604215.localdomain sudo[319246]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:47 np0005604215.localdomain sshd[319245]: Received disconnect from 38.102.83.114 port 34760:11: disconnected by user
Feb 01 10:07:47 np0005604215.localdomain sshd[319245]: Disconnected from user zuul 38.102.83.114 port 34760
Feb 01 10:07:47 np0005604215.localdomain sshd[319242]: pam_unix(sshd:session): session closed for user zuul
Feb 01 10:07:47 np0005604215.localdomain systemd-logind[761]: Session 79 logged out. Waiting for processes to exit.
Feb 01 10:07:47 np0005604215.localdomain systemd[1]: session-79.scope: Deactivated successfully.
Feb 01 10:07:47 np0005604215.localdomain systemd-logind[761]: Removed session 79.
Feb 01 10:07:48 np0005604215.localdomain sshd[319264]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 10:07:48 np0005604215.localdomain sshd[319264]: Accepted publickey for zuul from 38.102.83.114 port 34768 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 10:07:48 np0005604215.localdomain systemd-logind[761]: New session 80 of user zuul.
Feb 01 10:07:48 np0005604215.localdomain systemd[1]: Started Session 80 of User zuul.
Feb 01 10:07:48 np0005604215.localdomain sshd[319264]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 10:07:48 np0005604215.localdomain ceph-mon[298604]: pgmap v764: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:48 np0005604215.localdomain sudo[319268]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/networks
Feb 01 10:07:48 np0005604215.localdomain sudo[319268]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 10:07:48 np0005604215.localdomain sudo[319268]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:48 np0005604215.localdomain sshd[319267]: Received disconnect from 38.102.83.114 port 34768:11: disconnected by user
Feb 01 10:07:48 np0005604215.localdomain sshd[319267]: Disconnected from user zuul 38.102.83.114 port 34768
Feb 01 10:07:48 np0005604215.localdomain sshd[319264]: pam_unix(sshd:session): session closed for user zuul
Feb 01 10:07:48 np0005604215.localdomain systemd[1]: session-80.scope: Deactivated successfully.
Feb 01 10:07:48 np0005604215.localdomain systemd-logind[761]: Session 80 logged out. Waiting for processes to exit.
Feb 01 10:07:48 np0005604215.localdomain systemd-logind[761]: Removed session 80.
Feb 01 10:07:48 np0005604215.localdomain sshd[319286]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 10:07:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v765: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:48 np0005604215.localdomain sshd[319286]: Accepted publickey for zuul from 38.102.83.114 port 34772 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 10:07:48 np0005604215.localdomain systemd-logind[761]: New session 81 of user zuul.
Feb 01 10:07:48 np0005604215.localdomain systemd[1]: Started Session 81 of User zuul.
Feb 01 10:07:48 np0005604215.localdomain sshd[319286]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 10:07:48 np0005604215.localdomain sudo[319290]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/containers/containers.conf
Feb 01 10:07:48 np0005604215.localdomain sudo[319290]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 10:07:48 np0005604215.localdomain sudo[319290]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:48 np0005604215.localdomain sshd[319289]: Received disconnect from 38.102.83.114 port 34772:11: disconnected by user
Feb 01 10:07:48 np0005604215.localdomain sshd[319289]: Disconnected from user zuul 38.102.83.114 port 34772
Feb 01 10:07:48 np0005604215.localdomain sshd[319286]: pam_unix(sshd:session): session closed for user zuul
Feb 01 10:07:48 np0005604215.localdomain systemd[1]: session-81.scope: Deactivated successfully.
Feb 01 10:07:48 np0005604215.localdomain systemd-logind[761]: Session 81 logged out. Waiting for processes to exit.
Feb 01 10:07:48 np0005604215.localdomain systemd-logind[761]: Removed session 81.
Feb 01 10:07:49 np0005604215.localdomain sshd[319308]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 10:07:49 np0005604215.localdomain sshd[319308]: Accepted publickey for zuul from 38.102.83.114 port 34782 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 10:07:49 np0005604215.localdomain systemd-logind[761]: New session 82 of user zuul.
Feb 01 10:07:49 np0005604215.localdomain systemd[1]: Started Session 82 of User zuul.
Feb 01 10:07:49 np0005604215.localdomain sshd[319308]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 10:07:49 np0005604215.localdomain sudo[319312]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ceph
Feb 01 10:07:49 np0005604215.localdomain sudo[319312]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 10:07:49 np0005604215.localdomain sudo[319312]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:49 np0005604215.localdomain sshd[319311]: Received disconnect from 38.102.83.114 port 34782:11: disconnected by user
Feb 01 10:07:49 np0005604215.localdomain sshd[319311]: Disconnected from user zuul 38.102.83.114 port 34782
Feb 01 10:07:49 np0005604215.localdomain sshd[319308]: pam_unix(sshd:session): session closed for user zuul
Feb 01 10:07:49 np0005604215.localdomain systemd[1]: session-82.scope: Deactivated successfully.
Feb 01 10:07:49 np0005604215.localdomain systemd-logind[761]: Session 82 logged out. Waiting for processes to exit.
Feb 01 10:07:49 np0005604215.localdomain systemd-logind[761]: Removed session 82.
Feb 01 10:07:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:49.529 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:49 np0005604215.localdomain sshd[319330]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 10:07:49 np0005604215.localdomain sshd[319330]: Accepted publickey for zuul from 38.102.83.114 port 34790 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 10:07:49 np0005604215.localdomain systemd-logind[761]: New session 83 of user zuul.
Feb 01 10:07:49 np0005604215.localdomain systemd[1]: Started Session 83 of User zuul.
Feb 01 10:07:49 np0005604215.localdomain sshd[319330]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 10:07:49 np0005604215.localdomain sudo[319334]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/ci
Feb 01 10:07:49 np0005604215.localdomain sudo[319334]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 10:07:49 np0005604215.localdomain sudo[319334]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:49 np0005604215.localdomain sshd[319333]: Received disconnect from 38.102.83.114 port 34790:11: disconnected by user
Feb 01 10:07:49 np0005604215.localdomain sshd[319333]: Disconnected from user zuul 38.102.83.114 port 34790
Feb 01 10:07:49 np0005604215.localdomain sshd[319330]: pam_unix(sshd:session): session closed for user zuul
Feb 01 10:07:49 np0005604215.localdomain systemd[1]: session-83.scope: Deactivated successfully.
Feb 01 10:07:49 np0005604215.localdomain systemd-logind[761]: Session 83 logged out. Waiting for processes to exit.
Feb 01 10:07:49 np0005604215.localdomain systemd-logind[761]: Removed session 83.
Feb 01 10:07:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 10:07:50 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                           ** DB Stats **
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Cumulative writes: 4723 writes, 35K keys, 4723 commit groups, 1.0 writes per commit group, ingest: 0.06 GB, 0.05 MB/s
                                                           Cumulative WAL: 4723 writes, 4723 syncs, 1.00 writes per sync, written: 0.06 GB, 0.05 MB/s
                                                           Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           Interval writes: 2498 writes, 13K keys, 2498 commit groups, 1.0 writes per commit group, ingest: 18.92 MB, 0.03 MB/s
                                                           Interval WAL: 2498 writes, 2498 syncs, 1.00 writes per sync, written: 0.02 GB, 0.03 MB/s
                                                           Interval stall: 00:00:0.000 H:M:S, 0.0 percent
                                                           
                                                           ** Compaction Stats [default] **
                                                           Level    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                             L0      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   1.0      0.0    170.2      0.26              0.11        15    0.017       0      0       0.0       0.0
                                                             L6      1/0   21.40 MB   0.0      0.3     0.0      0.3       0.3      0.0       0.0   6.7    165.5    153.8      1.88              0.75        14    0.135    190K   7188       0.0       0.0
                                                            Sum      1/0   21.40 MB   0.0      0.3     0.0      0.3       0.3      0.1       0.0   7.7    145.8    155.8      2.14              0.86        29    0.074    190K   7188       0.0       0.0
                                                            Int      0/0    0.00 KB   0.0      0.2     0.0      0.1       0.2      0.0       0.0  14.2    149.2    152.0      1.04              0.43        14    0.075    101K   3753       0.0       0.0
                                                           
                                                           ** Compaction Stats [default] **
                                                           Priority    Files   Size     Score Read(GB)  Rn(GB) Rnp1(GB) Write(GB) Wnew(GB) Moved(GB) W-Amp Rd(MB/s) Wr(MB/s) Comp(sec) CompMergeCPU(sec) Comp(cnt) Avg(sec) KeyIn KeyDrop Rblob(GB) Wblob(GB)
                                                           ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
                                                            Low      0/0    0.00 KB   0.0      0.3     0.0      0.3       0.3      0.0       0.0   0.0    165.5    153.8      1.88              0.75        14    0.135    190K   7188       0.0       0.0
                                                           High      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0    171.8      0.25              0.11        14    0.018       0      0       0.0       0.0
                                                           User      0/0    0.00 KB   0.0      0.0     0.0      0.0       0.0      0.0       0.0   0.0      0.0      0.7      0.00              0.00         1    0.002       0      0       0.0       0.0
                                                           
                                                           Blob file count: 0, total size: 0.0 GB, garbage size: 0.0 GB, space amp: 0.0
                                                           
                                                           Uptime(secs): 1200.0 total, 600.0 interval
                                                           Flush(GB): cumulative 0.042, interval 0.011
                                                           AddFile(GB): cumulative 0.000, interval 0.000
                                                           AddFile(Total Files): cumulative 0, interval 0
                                                           AddFile(L0 Files): cumulative 0, interval 0
                                                           AddFile(Keys): cumulative 0, interval 0
                                                           Cumulative compaction: 0.33 GB write, 0.28 MB/s write, 0.30 GB read, 0.26 MB/s read, 2.1 seconds
                                                           Interval compaction: 0.15 GB write, 0.26 MB/s write, 0.15 GB read, 0.26 MB/s read, 1.0 seconds
                                                           Stalls(count): 0 level0_slowdown, 0 level0_slowdown_with_compaction, 0 level0_numfiles, 0 level0_numfiles_with_compaction, 0 stop for pending_compaction_bytes, 0 slowdown for pending_compaction_bytes, 0 memtable_compaction, 0 memtable_slowdown, interval 0 total count
                                                           Block cache BinnedLRUCache@0x562ae85ff1f0#2 capacity: 304.00 MB usage: 24.62 MB table_size: 0 occupancy: 18446744073709551615 collections: 3 last_copies: 0 last_secs: 0.000135 secs_since: 0
                                                           Block cache entry stats(count,size,portion): DataBlock(1314,23.37 MB,7.68725%) FilterBlock(29,562.73 KB,0.180771%) IndexBlock(29,718.48 KB,0.230804%) Misc(1,0.00 KB,0%)
                                                           
                                                           ** File Read Latency Histogram By Level [default] **
Feb 01 10:07:50 np0005604215.localdomain sshd[319352]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 10:07:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:50 np0005604215.localdomain ceph-mon[298604]: pgmap v765: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:50 np0005604215.localdomain sshd[319352]: Accepted publickey for zuul from 38.102.83.114 port 34804 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 10:07:50 np0005604215.localdomain systemd-logind[761]: New session 84 of user zuul.
Feb 01 10:07:50 np0005604215.localdomain systemd[1]: Started Session 84 of User zuul.
Feb 01 10:07:50 np0005604215.localdomain sshd[319352]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 10:07:50 np0005604215.localdomain sudo[319356]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.conf
Feb 01 10:07:50 np0005604215.localdomain sudo[319356]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 10:07:50 np0005604215.localdomain sudo[319356]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:50 np0005604215.localdomain sshd[319355]: Received disconnect from 38.102.83.114 port 34804:11: disconnected by user
Feb 01 10:07:50 np0005604215.localdomain sshd[319355]: Disconnected from user zuul 38.102.83.114 port 34804
Feb 01 10:07:50 np0005604215.localdomain sshd[319352]: pam_unix(sshd:session): session closed for user zuul
Feb 01 10:07:50 np0005604215.localdomain systemd[1]: session-84.scope: Deactivated successfully.
Feb 01 10:07:50 np0005604215.localdomain systemd-logind[761]: Session 84 logged out. Waiting for processes to exit.
Feb 01 10:07:50 np0005604215.localdomain systemd-logind[761]: Removed session 84.
Feb 01 10:07:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:50.599 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v766: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:50 np0005604215.localdomain sshd[319374]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 10:07:50 np0005604215.localdomain sshd[319374]: Accepted publickey for zuul from 38.102.83.114 port 34814 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 10:07:50 np0005604215.localdomain systemd-logind[761]: New session 85 of user zuul.
Feb 01 10:07:50 np0005604215.localdomain systemd[1]: Started Session 85 of User zuul.
Feb 01 10:07:50 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:07:50 np0005604215.localdomain sshd[319374]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 10:07:50 np0005604215.localdomain podman[319377]: 2026-02-01 10:07:50.994491947 +0000 UTC m=+0.092340475 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Feb 01 10:07:51 np0005604215.localdomain sudo[319390]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/yum.repos.d
Feb 01 10:07:51 np0005604215.localdomain sudo[319390]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 10:07:51 np0005604215.localdomain podman[319377]: 2026-02-01 10:07:51.034703308 +0000 UTC m=+0.132551756 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_id=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, 
org.label-schema.schema-version=1.0)
Feb 01 10:07:51 np0005604215.localdomain sudo[319390]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:51 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:07:51 np0005604215.localdomain sshd[319378]: Received disconnect from 38.102.83.114 port 34814:11: disconnected by user
Feb 01 10:07:51 np0005604215.localdomain sshd[319378]: Disconnected from user zuul 38.102.83.114 port 34814
Feb 01 10:07:51 np0005604215.localdomain sshd[319374]: pam_unix(sshd:session): session closed for user zuul
Feb 01 10:07:51 np0005604215.localdomain systemd-logind[761]: Session 85 logged out. Waiting for processes to exit.
Feb 01 10:07:51 np0005604215.localdomain systemd[1]: session-85.scope: Deactivated successfully.
Feb 01 10:07:51 np0005604215.localdomain systemd-logind[761]: Removed session 85.
Feb 01 10:07:51 np0005604215.localdomain sshd[319416]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 10:07:51 np0005604215.localdomain sshd[319416]: Accepted publickey for zuul from 38.102.83.114 port 34822 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 10:07:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:07:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7f93d0879d60>)]
Feb 01 10:07:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Feb 01 10:07:51 np0005604215.localdomain systemd-logind[761]: New session 86 of user zuul.
Feb 01 10:07:51 np0005604215.localdomain systemd[1]: Started Session 86 of User zuul.
Feb 01 10:07:51 np0005604215.localdomain sshd[319416]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 10:07:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:07:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: [('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7f9407318940>), ('cephfs', <mgr_util.CephfsConnectionPool.Connection object at 0x7f93d0879100>)]
Feb 01 10:07:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Feb 01 10:07:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] disconnecting from cephfs 'cephfs'
Feb 01 10:07:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:07:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:07:51 np0005604215.localdomain sudo[319420]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /etc/os-net-config
Feb 01 10:07:51 np0005604215.localdomain sudo[319420]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 10:07:51 np0005604215.localdomain sudo[319420]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:51 np0005604215.localdomain sshd[319419]: Received disconnect from 38.102.83.114 port 34822:11: disconnected by user
Feb 01 10:07:51 np0005604215.localdomain sshd[319419]: Disconnected from user zuul 38.102.83.114 port 34822
Feb 01 10:07:51 np0005604215.localdomain sshd[319416]: pam_unix(sshd:session): session closed for user zuul
Feb 01 10:07:51 np0005604215.localdomain systemd[1]: session-86.scope: Deactivated successfully.
Feb 01 10:07:51 np0005604215.localdomain systemd-logind[761]: Session 86 logged out. Waiting for processes to exit.
Feb 01 10:07:51 np0005604215.localdomain systemd-logind[761]: Removed session 86.
Feb 01 10:07:51 np0005604215.localdomain sshd[319438]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 10:07:51 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:07:52 np0005604215.localdomain sshd[319438]: Accepted publickey for zuul from 38.102.83.114 port 34838 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 10:07:52 np0005604215.localdomain systemd-logind[761]: New session 87 of user zuul.
Feb 01 10:07:52 np0005604215.localdomain systemd[1]: Started Session 87 of User zuul.
Feb 01 10:07:52 np0005604215.localdomain systemd[1]: tmp-crun.1qaQBS.mount: Deactivated successfully.
Feb 01 10:07:52 np0005604215.localdomain sshd[319438]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 10:07:52 np0005604215.localdomain podman[319440]: 2026-02-01 10:07:52.085521945 +0000 UTC m=+0.087578536 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Feb 01 10:07:52 np0005604215.localdomain podman[319440]: 2026-02-01 10:07:52.101678168 +0000 UTC m=+0.103734719 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Feb 01 10:07:52 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:07:52 np0005604215.localdomain sudo[319464]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/rsync --server --sender -lLogDtprze.LsfxC . /home/zuul/ansible_hostname
Feb 01 10:07:52 np0005604215.localdomain sudo[319464]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 10:07:52 np0005604215.localdomain sudo[319464]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:52 np0005604215.localdomain sshd[319458]: Received disconnect from 38.102.83.114 port 34838:11: disconnected by user
Feb 01 10:07:52 np0005604215.localdomain sshd[319458]: Disconnected from user zuul 38.102.83.114 port 34838
Feb 01 10:07:52 np0005604215.localdomain sshd[319438]: pam_unix(sshd:session): session closed for user zuul
Feb 01 10:07:52 np0005604215.localdomain systemd-logind[761]: Session 87 logged out. Waiting for processes to exit.
Feb 01 10:07:52 np0005604215.localdomain systemd[1]: session-87.scope: Deactivated successfully.
Feb 01 10:07:52 np0005604215.localdomain systemd-logind[761]: Removed session 87.
Feb 01 10:07:52 np0005604215.localdomain ceph-mon[298604]: pgmap v766: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:07:52 np0005604215.localdomain sudo[319483]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 10:07:52 np0005604215.localdomain sudo[319483]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:07:52 np0005604215.localdomain sudo[319483]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v767: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:07:52 np0005604215.localdomain sudo[319501]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 10:07:52 np0005604215.localdomain sudo[319501]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:07:53 np0005604215.localdomain sudo[319501]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:53 np0005604215.localdomain ceph-mon[298604]: mgrmap e64: np0005604215.uhhqtv(active, since 18m), standbys: np0005604212.oynhpm, np0005604209.isqrps, np0005604213.caiaeh
Feb 01 10:07:53 np0005604215.localdomain ceph-mon[298604]: pgmap v767: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:07:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 10:07:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:07:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 10:07:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:07:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 10:07:53 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 0fb73ffa-67eb-472b-9d95-63d611a4bb0a (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:07:53 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 0fb73ffa-67eb-472b-9d95-63d611a4bb0a (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:07:53 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 0fb73ffa-67eb-472b-9d95-63d611a4bb0a (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 10:07:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 10:07:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:07:53 np0005604215.localdomain sudo[319551]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 10:07:53 np0005604215.localdomain sudo[319551]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:07:53 np0005604215.localdomain sudo[319551]: pam_unix(sudo:session): session closed for user root
Feb 01 10:07:54 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:07:54 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:07:54 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:07:54 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:07:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:54.560 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v768: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:07:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:07:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:55.635 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:56 np0005604215.localdomain ceph-mon[298604]: pgmap v768: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:07:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v769: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:07:56 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 10:07:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 10:07:57 np0005604215.localdomain ceph-mon[298604]: pgmap v769: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:07:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:07:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v770: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:07:59 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:07:59.600 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:07:59 np0005604215.localdomain ceph-mon[298604]: pgmap v770: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:08:00 np0005604215.localdomain podman[236852]: time="2026-02-01T10:08:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:08:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:08:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:08:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:08:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18331 "" "Go-http-client/1.1"
Feb 01 10:08:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:00.122 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:08:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:00.123 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 10:08:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:00.123 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 10:08:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:00.151 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 10:08:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v771: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:08:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:00.673 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:01.124 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:08:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:08:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:08:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:08:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:08:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:08:01 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:08:01 np0005604215.localdomain ceph-mon[298604]: pgmap v771: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:08:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v772: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.103 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.104 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.129 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.129 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.130 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.130 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.130 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:08:03 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:08:03 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1449967358' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.576 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.781 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.783 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11472MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.784 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.784 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.867 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.868 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 10:08:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:03.986 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing inventories for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Feb 01 10:08:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:04.011 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating ProviderTree inventory for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Feb 01 10:08:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:04.011 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Updating inventory in ProviderTree for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Feb 01 10:08:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:04.038 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing aggregate associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Feb 01 10:08:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:04.071 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Refreshing trait associations for resource provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590, traits: COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_STORAGE_BUS_IDE,COMPUTE_STORAGE_BUS_SCSI,HW_CPU_X86_SSE,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_IMAGE_TYPE_QCOW2,HW_CPU_X86_AVX2,HW_CPU_X86_SSE2,HW_CPU_X86_AESNI,COMPUTE_NET_VIF_MODEL_RTL8139,HW_CPU_X86_AMD_SVM,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,HW_CPU_X86_SSE42,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_NET_VIF_MODEL_NE2K_PCI,HW_CPU_X86_MMX,COMPUTE_SECURITY_TPM_1_2,COMPUTE_VIOMMU_MODEL_INTEL,HW_CPU_X86_ABM,COMPUTE_STORAGE_BUS_SATA,COMPUTE_TRUSTED_CERTS,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_ACCELERATORS,COMPUTE_IMAGE_TYPE_RAW,HW_CPU_X86_AVX,COMPUTE_IMAGE_TYPE_AKI,HW_CPU_X86_SVM,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_VIOMMU_MODEL_AUTO,HW_CPU_X86_SHA,HW_CPU_X86_SSE41,COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_NET_VIF_MODEL_E1000E,HW_CPU_X86_BMI,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_VOLUME_ATTACH_WITH_TAG,HW_CPU_X86_FMA3,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VOLUME_EXTEND,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_STORAGE_BUS_FDC,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_VIF_MODEL_E1000,HW_CPU_X86_SSSE3,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE4A,HW_CPU_X86_F16C,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_VOLUME_MULTI_ATTACH,HW_CPU_X86_CLMUL,COMPUTE_RESCUE_BFV,HW_CPU_X86_BMI2,COMPUTE_VIOMMU_MODEL_VIRTIO _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Feb 01 10:08:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:04.102 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:08:04 np0005604215.localdomain ceph-mon[298604]: pgmap v772: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 426 B/s wr, 0 op/s
Feb 01 10:08:04 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1449967358' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:08:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:08:04 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3302170123' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:08:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:04.551 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.449s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:08:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:04.557 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 10:08:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v773: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Feb 01 10:08:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:04.633 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 10:08:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:04.635 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 10:08:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:04.636 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.852s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:08:04 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:04.650 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3893328463' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:08:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3302170123' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:08:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:05.633 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:08:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:05.677 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:06.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:08:06 np0005604215.localdomain ceph-mon[298604]: pgmap v773: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Feb 01 10:08:06 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/60058546' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:08:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v774: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Feb 01 10:08:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:07.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:08:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:08.101 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:08:08 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:08.102 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Feb 01 10:08:08 np0005604215.localdomain ceph-mon[298604]: pgmap v774: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Feb 01 10:08:08 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v775: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Feb 01 10:08:09 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:09.683 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:10 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:10 np0005604215.localdomain ceph-mon[298604]: pgmap v775: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail; 0 B/s wr, 0 op/s
Feb 01 10:08:10 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v776: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:10 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:10.679 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:08:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:08:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:08:11 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:08:11 np0005604215.localdomain podman[319613]: 2026-02-01 10:08:11.876673381 +0000 UTC m=+0.090078404 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, com.redhat.component=ubi9-minimal-container, version=9.7, container_name=openstack_network_exporter, config_id=openstack_network_exporter, architecture=x86_64, io.openshift.tags=minimal rhel9, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., build-date=2026-01-22T05:09:47Z, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, managed_by=edpm_ansible, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, org.opencontainers.image.created=2026-01-22T05:09:47Z, name=ubi9/ubi-minimal, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Feb 01 10:08:11 np0005604215.localdomain podman[319613]: 2026-02-01 10:08:11.892253527 +0000 UTC m=+0.105658560 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, managed_by=edpm_ansible, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, org.opencontainers.image.created=2026-01-22T05:09:47Z, io.buildah.version=1.33.7, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, distribution-scope=public, name=ubi9/ubi-minimal, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, release=1769056855, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.7, build-date=2026-01-22T05:09:47Z)
Feb 01 10:08:11 np0005604215.localdomain podman[319615]: 2026-02-01 10:08:11.929891628 +0000 UTC m=+0.137472900 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_managed=true)
Feb 01 10:08:11 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:08:11 np0005604215.localdomain podman[319616]: 2026-02-01 10:08:11.992143515 +0000 UTC m=+0.193949557 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter)
Feb 01 10:08:12 np0005604215.localdomain podman[319615]: 2026-02-01 10:08:12.023353067 +0000 UTC m=+0.230934349 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, org.label-schema.license=GPLv2)
Feb 01 10:08:12 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:08:12 np0005604215.localdomain podman[319616]: 2026-02-01 10:08:12.078754661 +0000 UTC m=+0.280560693 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 10:08:12 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:08:12 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:12.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:08:12 np0005604215.localdomain podman[319614]: 2026-02-01 10:08:12.083449027 +0000 UTC m=+0.294998352 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Feb 01 10:08:12 np0005604215.localdomain podman[319614]: 2026-02-01 10:08:12.167534595 +0000 UTC m=+0.379083910 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:08:12 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:08:12 np0005604215.localdomain ceph-mon[298604]: pgmap v776: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:12 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1109428422' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:08:12 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v777: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:13 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3571253492' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:08:14 np0005604215.localdomain ceph-mon[298604]: pgmap v777: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:14 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v778: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:14 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:14.727 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:15.096 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:08:15 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:15 np0005604215.localdomain ceph-mon[298604]: pgmap v778: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:15 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:15.682 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:16 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v779: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:18 np0005604215.localdomain ceph-mon[298604]: pgmap v779: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:18 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v780: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:19 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:19.778 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:20 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:20 np0005604215.localdomain ceph-mon[298604]: pgmap v780: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:20 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v781: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:20 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:20.686 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Optimize plan auto_2026-02-01_10:08:21
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] Mode upmap, max misplaced 0.050000
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] do_upmap
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] pools ['images', 'vms', 'manila_metadata', 'backups', 'volumes', '.mgr', 'manila_data']
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [balancer INFO root] prepared 0/10 changes
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:08:21 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:08:21 np0005604215.localdomain podman[319697]: 2026-02-01 10:08:21.861998188 +0000 UTC m=+0.076290706 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_id=ceilometer_agent_compute, managed_by=edpm_ansible)
Feb 01 10:08:21 np0005604215.localdomain podman[319697]: 2026-02-01 10:08:21.875115135 +0000 UTC m=+0.089407693 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, container_name=ceilometer_agent_compute, org.label-schema.build-date=20260127, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Feb 01 10:08:21 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] _maybe_adjust
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool '.mgr' root_id -1 using 3.080724804578448e-05 of space, bias 1.0, pg target 0.006161449609156895 quantized to 1 (current 1)
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'vms' root_id -1 using 0.003325274375348967 of space, bias 1.0, pg target 0.6650548750697934 quantized to 32 (current 32)
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'volumes' root_id -1 using 0.0014861089300670016 of space, bias 1.0, pg target 0.29672641637004465 quantized to 32 (current 32)
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'images' root_id -1 using 0.004299383200725851 of space, bias 1.0, pg target 0.8584435124115949 quantized to 32 (current 32)
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'backups' root_id -1 using 2.7263051367950866e-07 of space, bias 1.0, pg target 5.425347222222222e-05 quantized to 32 (current 32)
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_data' root_id -1 using 0.0 of space, bias 1.0, pg target 0.0 quantized to 32 (current 32)
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] effective_target_ratio 0.0 0.0 0 45071990784
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [pg_autoscaler INFO root] Pool 'manila_metadata' root_id -1 using 0.002965129466778336 of space, bias 4.0, pg target 2.3602430555555554 quantized to 16 (current 16)
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] TrashPurgeScheduleHandler: load_schedules
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] MirrorSnapshotScheduleHandler: load_schedules
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: vms, start_after=
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: volumes, start_after=
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: images, start_after=
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:08:21 np0005604215.localdomain ceph-mgr[278126]: [rbd_support INFO root] load_schedules: backups, start_after=
Feb 01 10:08:22 np0005604215.localdomain ceph-mon[298604]: pgmap v781: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:22 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v782: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:22 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:08:22 np0005604215.localdomain systemd[1]: tmp-crun.TXNKFM.mount: Deactivated successfully.
Feb 01 10:08:22 np0005604215.localdomain podman[319714]: 2026-02-01 10:08:22.848048929 +0000 UTC m=+0.069785793 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:08:22 np0005604215.localdomain podman[319714]: 2026-02-01 10:08:22.884766212 +0000 UTC m=+0.106503106 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Feb 01 10:08:22 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:08:24 np0005604215.localdomain ceph-mon[298604]: pgmap v782: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:24 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v783: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:24 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:24.818 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:25 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:25 np0005604215.localdomain ceph-mon[298604]: pgmap v783: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:25 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:25.685 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:26 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v784: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:28 np0005604215.localdomain ceph-mon[298604]: pgmap v784: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:28 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v785: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:29 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:29.842 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:30 np0005604215.localdomain podman[236852]: time="2026-02-01T10:08:30Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:08:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:08:30 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:08:30 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:08:30 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18350 "" "Go-http-client/1.1"
Feb 01 10:08:30 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:30 np0005604215.localdomain ceph-mon[298604]: pgmap v785: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:30 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v786: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:30 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:30.687 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:08:31 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:08:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:08:31 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:08:31 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:08:31 np0005604215.localdomain openstack_network_exporter[239388]: 
Feb 01 10:08:32 np0005604215.localdomain ceph-mon[298604]: pgmap v786: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:32 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v787: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:34 np0005604215.localdomain ceph-mon[298604]: pgmap v787: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:34 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v788: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:34 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:34.845 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:35 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:35 np0005604215.localdomain ceph-mon[298604]: pgmap v788: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/354795917' entity='client.openstack' cmd={"prefix":"df", "format":"json"} : dispatch
Feb 01 10:08:35 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.32:0/354795917' entity='client.openstack' cmd={"prefix":"osd pool get-quota", "pool": "volumes", "format":"json"} : dispatch
Feb 01 10:08:35 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:35.731 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:36 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v789: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:38 np0005604215.localdomain ceph-mon[298604]: pgmap v789: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:38 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v790: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:39 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:39.876 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:40 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:40 np0005604215.localdomain ceph-mon[298604]: pgmap v790: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:40 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v791: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:40 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:40.735 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:41 np0005604215.localdomain sshd[319739]: main: sshd: ssh-rsa algorithm is disabled
Feb 01 10:08:41 np0005604215.localdomain sshd[319739]: Accepted publickey for zuul from 192.168.122.10 port 39100 ssh2: RSA SHA256:FaiiiQaEkJGWa0aviTZljfSthXoqY/a5WeXCnGz5d3s
Feb 01 10:08:41 np0005604215.localdomain systemd-logind[761]: New session 88 of user zuul.
Feb 01 10:08:41 np0005604215.localdomain systemd[1]: Started Session 88 of User zuul.
Feb 01 10:08:41 np0005604215.localdomain sshd[319739]: pam_unix(sshd:session): session opened for user zuul(uid=1000) by (uid=0)
Feb 01 10:08:41 np0005604215.localdomain sudo[319743]:     zuul : PWD=/home/zuul ; USER=root ; COMMAND=/bin/bash -c rm -rf /var/tmp/sos-osp && mkdir /var/tmp/sos-osp && sos report --batch --all-logs --tmp-dir=/var/tmp/sos-osp  -p container,openstack_edpm,system,storage,virt
Feb 01 10:08:41 np0005604215.localdomain sudo[319743]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1000)
Feb 01 10:08:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:08:41.786 158655 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:08:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:08:41.787 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:08:41 np0005604215.localdomain ovn_metadata_agent[158650]: 2026-02-01 10:08:41.787 158655 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:08:42 np0005604215.localdomain ceph-mon[298604]: pgmap v791: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.
Feb 01 10:08:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.
Feb 01 10:08:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.
Feb 01 10:08:42 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.
Feb 01 10:08:42 np0005604215.localdomain systemd[1]: tmp-crun.HcStrC.mount: Deactivated successfully.
Feb 01 10:08:42 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v792: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:42 np0005604215.localdomain podman[319770]: 2026-02-01 10:08:42.646384897 +0000 UTC m=+0.092612013 container health_status 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, health_status=healthy, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, name=ubi9/ubi-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, maintainer=Red Hat, Inc., org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, version=9.7, build-date=2026-01-22T05:09:47Z, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., vcs-type=git, architecture=x86_64, config_id=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, release=1769056855, container_name=openstack_network_exporter, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc)
Feb 01 10:08:42 np0005604215.localdomain podman[319772]: 2026-02-01 10:08:42.663120118 +0000 UTC m=+0.100047085 container health_status c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, container_name=ovn_controller, org.label-schema.build-date=20260127, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4)
Feb 01 10:08:42 np0005604215.localdomain podman[319772]: 2026-02-01 10:08:42.754552013 +0000 UTC m=+0.191478980 container exec_died c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20260127)
Feb 01 10:08:42 np0005604215.localdomain systemd[1]: c2ff29759d63e369dfc47cb104999aeff3b911c38bef981d5b2e8d25093f2835.service: Deactivated successfully.
Feb 01 10:08:42 np0005604215.localdomain podman[319770]: 2026-02-01 10:08:42.781717249 +0000 UTC m=+0.227944415 container exec_died 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc (image=quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d, name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9/ubi-minimal, architecture=x86_64, vendor=Red Hat, Inc., release=1769056855, version=9.7, io.openshift.tags=minimal rhel9, cpe=cpe:/a:redhat:enterprise_linux:9::appstream, com.redhat.component=ubi9-minimal-container, config_data={'command': [], 'environment': {'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter', 'test': '/openstack/healthcheck openstack-netwo'}, 'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter@sha256:95e30a9e9ee5f2452d2dfa3855f9e8fdaf49020bb364c6875c2cdae8579f6e5d', 'net': 'host', 'ports': ['9105:9105'], 'privileged': True, 'recreate': True, 'restart': 'always', 'volumes': ['/var/lib/openstack/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., org.opencontainers.image.revision=812a20485e9d8d728e95b468c2886da21352b9fc, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.openshift.expose-services=, vcs-ref=812a20485e9d8d728e95b468c2886da21352b9fc, container_name=openstack_network_exporter, build-date=2026-01-22T05:09:47Z, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, org.opencontainers.image.created=2026-01-22T05:09:47Z, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Feb 01 10:08:42 np0005604215.localdomain podman[319771]: 2026-02-01 10:08:42.746328768 +0000 UTC m=+0.185514296 container health_status 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2)
Feb 01 10:08:42 np0005604215.localdomain systemd[1]: 1e81c1b86182cdec2675267a0409f7c3101cf714634d5a55e5ebe8324ebb45fc.service: Deactivated successfully.
Feb 01 10:08:42 np0005604215.localdomain podman[319771]: 2026-02-01 10:08:42.826236395 +0000 UTC m=+0.265421923 container exec_died 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df122b180261157f1de1391083b3d8abac306e2f12893ac7b9291feafc874311'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/openstack/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Feb 01 10:08:42 np0005604215.localdomain systemd[1]: 412d8177703a1df6d718b121467f1d2144461403048e54909e20ac7820fdf5d5.service: Deactivated successfully.
Feb 01 10:08:42 np0005604215.localdomain podman[319776]: 2026-02-01 10:08:42.870177423 +0000 UTC m=+0.304311233 container health_status c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, health_status=healthy, managed_by=edpm_ansible, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Feb 01 10:08:42 np0005604215.localdomain podman[319776]: 2026-02-01 10:08:42.880108651 +0000 UTC m=+0.314242521 container exec_died c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603 (image=quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c, name=node_exporter, config_data={'command': ['--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/node_exporter', 'test': '/openstack/healthcheck node_exporter'}, 'image': 'quay.io/prometheus/node-exporter@sha256:39c642b2b337e38c18e80266fb14383754178202f40103646337722a594d984c', 'net': 'host', 'ports': ['9100:9100'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Feb 01 10:08:42 np0005604215.localdomain systemd[1]: c52b82d6176d722c60ca710eaef0fb255b9c854186667ee27d01230a5b524603.service: Deactivated successfully.
Feb 01 10:08:43 np0005604215.localdomain ceph-mon[298604]: pgmap v792: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.59713 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49197 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69389 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v793: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:44 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:44.919 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:44 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.59722 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49203 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:45 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69401 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "status"} v 0)
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3655989950' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 01 10:08:45 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:45.736 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: from='client.59713 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: from='client.49197 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: from='client.69389 -' entity='client.admin' cmd=[{"prefix": "orch status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: pgmap v793: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: from='client.59722 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: from='client.49203 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: from='client.69401 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2930303390' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3030695088' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 01 10:08:45 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3655989950' entity='client.admin' cmd={"prefix": "status"} : dispatch
Feb 01 10:08:46 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v794: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:48 np0005604215.localdomain ovs-vsctl[320075]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Feb 01 10:08:48 np0005604215.localdomain ceph-mon[298604]: pgmap v794: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:48 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v795: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:48 np0005604215.localdomain virtqemud[224673]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Feb 01 10:08:48 np0005604215.localdomain virtqemud[224673]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Feb 01 10:08:48 np0005604215.localdomain virtqemud[224673]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Feb 01 10:08:49 np0005604215.localdomain systemd[1]: efi.automount: Got automount request for /efi, triggered by 320226 (lsinitrd)
Feb 01 10:08:49 np0005604215.localdomain systemd[1]: Mounting EFI System Partition Automount...
Feb 01 10:08:49 np0005604215.localdomain systemd[1]: Mounted EFI System Partition Automount.
Feb 01 10:08:49 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: cache status {prefix=cache status} (starting...)
Feb 01 10:08:49 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:49 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: client ls {prefix=client ls} (starting...)
Feb 01 10:08:49 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:49 np0005604215.localdomain lvm[320312]: PV /dev/loop4 online, VG ceph_vg1 is complete.
Feb 01 10:08:49 np0005604215.localdomain lvm[320312]: VG ceph_vg1 finished
Feb 01 10:08:49 np0005604215.localdomain lvm[320318]: PV /dev/loop3 online, VG ceph_vg0 is complete.
Feb 01 10:08:49 np0005604215.localdomain lvm[320318]: VG ceph_vg0 finished
Feb 01 10:08:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69416 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.59746 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49218 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:49 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:49.954 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:49 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69425 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: damage ls {prefix=damage ls} (starting...)
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.59758 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: dump loads {prefix=dump loads} (starting...)
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49227 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: dump tree {prefix=dump tree,root=/} (starting...)
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:50 np0005604215.localdomain ceph-mon[298604]: pgmap v795: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: dump_blocked_ops {prefix=dump_blocked_ops} (starting...)
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "report"} v 0)
Feb 01 10:08:50 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3671674022' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: dump_historic_ops {prefix=dump_historic_ops} (starting...)
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v796: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69452 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:50 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:50 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:08:50.693+0000 7f941d3ee640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:50 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:50.736 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: dump_historic_ops_by_duration {prefix=dump_historic_ops_by_duration} (starting...)
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:50 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 10:08:50 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1557184720' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:08:50 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.59794 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:50 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:08:50.976+0000 7f941d3ee640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:50 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: dump_ops_in_flight {prefix=dump_ops_in_flight} (starting...)
Feb 01 10:08:50 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "log last", "channel": "cephadm"} v 0)
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/815896363' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49245 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:08:51.099+0000 7f941d3ee640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:51 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:51 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: get subtrees {prefix=get subtrees} (starting...)
Feb 01 10:08:51 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config log"} v 0)
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2400779285' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: ops {prefix=ops} (starting...)
Feb 01 10:08:51 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.69416 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.59746 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.49218 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.69425 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.59758 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.49227 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3671674022' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: pgmap v796: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.? ' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/4225870902' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.69452 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2543722143' entity='client.admin' cmd={"prefix": "report"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1557184720' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.59794 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3476630521' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/815896363' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.49245 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3435019968' entity='client.admin' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2400779285' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/177687973' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:08:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:08:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:08:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:08:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] scanning for idle connections..
Feb 01 10:08:51 np0005604215.localdomain ceph-mgr[278126]: [volumes INFO mgr_util] cleaning up connections: []
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config-key dump"} v 0)
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2087664243' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb 01 10:08:51 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3438570122' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: session ls {prefix=session ls} (starting...)
Feb 01 10:08:52 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg Can't run that command on an inactive MDS!
Feb 01 10:08:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69509 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mds[276952]: mds.mds.np0005604215.rwvxvg asok_command: status {prefix=status} (starting...)
Feb 01 10:08:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.59860 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3818264438' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2407799247' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/228446044' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/177687973' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2417881894' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/713414899' entity='client.admin' cmd={"prefix": "config log"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2087664243' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/355594869' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3438570122' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/625421462' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1649893447' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2532873274' entity='client.admin' cmd={"prefix": "config-key dump"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.69509 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.59860 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3818264438' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49302 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69527 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.59878 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v797: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1160796356' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 01 10:08:52 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.
Feb 01 10:08:52 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49308 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:52 np0005604215.localdomain podman[320773]: 2026-02-01 10:08:52.868619627 +0000 UTC m=+0.084119920 container health_status 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20260127)
Feb 01 10:08:52 np0005604215.localdomain podman[320773]: 2026-02-01 10:08:52.880503276 +0000 UTC m=+0.096003569 container exec_died 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6 (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b85d0548925081ae8c6bdd697658cec4, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20260127, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'command': 'kolla_start', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal', 'EDPM_CONFIG_HASH': 'b489452d0fbfc60248928d346c9d0d46319800de5f37757197e2af2ded932519-df76d86ebef250bd697134497aae1383ef1748f5be4a1e05ee2fab91e8becbf8'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute', 'test': '/openstack/healthcheck compute'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'net': 'host', 'restart': 'always', 'security_opt': 'label:type:ceilometer_polling_t', 'user': 'ceilometer', 'volumes': ['/var/lib/openstack/telemetry:/var/lib/kolla/config_files/src:z', '/var/lib/kolla/config_files/ceilometer_agent_compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=ceilometer_agent_compute, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Feb 01 10:08:52 np0005604215.localdomain systemd[1]: 3bbf98158f72b9f2627ab6262c9747e51e0453ba4964c8b85046158976642de6.service: Deactivated successfully.
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "features"} v 0)
Feb 01 10:08:52 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1730585574' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr stat"} v 0)
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3008078364' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/416931476' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.49302 -' entity='client.admin' cmd=[{"prefix": "crash ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/343376414' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.69527 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.59878 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: pgmap v797: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1160796356' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.49308 -' entity='client.admin' cmd=[{"prefix": "crash stat", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3554343423' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2621346718' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1730585574' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? ' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/664413012' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3008078364' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3790036772' entity='client.admin' cmd={"prefix": "features"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1339406215' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1804523666' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "health", "detail": "detail"} v 0)
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2033218375' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb 01 10:08:53 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/923184260' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 01 10:08:53 np0005604215.localdomain sudo[320900]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/which python3
Feb 01 10:08:53 np0005604215.localdomain sudo[320900]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:08:53 np0005604215.localdomain systemd[1]: Started /usr/bin/podman healthcheck run a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.
Feb 01 10:08:53 np0005604215.localdomain sudo[320900]: pam_unix(sudo:session): session closed for user root
Feb 01 10:08:53 np0005604215.localdomain sudo[320940]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/python3 /var/lib/ceph/33fac0b9-80c7-560f-918a-c92d3021ca1e/cephadm.a31fbded8455ec58c0f5dde8c3d5c4a1f59de5789b672b4f3ea9cd8d29a0d1c3 --timeout 895 gather-facts
Feb 01 10:08:53 np0005604215.localdomain sudo[320940]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:08:53 np0005604215.localdomain systemd[1]: tmp-crun.Vi3qVP.mount: Deactivated successfully.
Feb 01 10:08:53 np0005604215.localdomain podman[320938]: 2026-02-01 10:08:53.858572919 +0000 UTC m=+0.082232260 container health_status a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, health_status=healthy, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:08:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69596 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Feb 01 10:08:53 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:08:53.866+0000 7f941d3ee640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Feb 01 10:08:53 np0005604215.localdomain podman[320938]: 2026-02-01 10:08:53.879631865 +0000 UTC m=+0.103291256 container exec_died a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d (image=quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'environment': {'CONTAINER_HOST': 'unix:///run/podman/podman.sock', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/podman_exporter', 'test': '/openstack/healthcheck podman_exporter'}, 'image': 'quay.io/navidys/prometheus-podman-exporter@sha256:d339ba049bbd1adccb795962bf163f5b22fd84dea865d88b9eb525e46247d6bd', 'net': 'host', 'ports': ['9882:9882'], 'privileged': True, 'recreate': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=podman_exporter, container_name=podman_exporter)
Feb 01 10:08:53 np0005604215.localdomain systemd[1]: a1727fb04d4959d73005ec0bc3202e0bb0e547dc9cc4d8bed82bddac8cdc638d.service: Deactivated successfully.
Feb 01 10:08:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.59947 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:53 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Feb 01 10:08:53 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:08:53.919+0000 7f941d3ee640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Feb 01 10:08:53 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69605 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49356 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Feb 01 10:08:54 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:08:54.095+0000 7f941d3ee640 -1 mgr.server reply reply (95) Operation not supported Module 'insights' is not enabled/loaded (required by command 'insights'): use `ceph mgr module enable insights` to enable it
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.107:0/1313299484' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} v 0)
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/593729190' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69623 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain sudo[320940]: pam_unix(sudo:session): session closed for user root
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2033218375' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1409580558' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/923184260' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2149349104' entity='client.admin' cmd={"prefix": "health", "detail": "detail"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/4108111134' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/4237289556' entity='client.admin' cmd={"prefix": "mgr stat"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.69596 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.59947 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.69605 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.49356 -' entity='client.admin' cmd=[{"prefix": "insights", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2205205682' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1313299484' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/593729190' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.69623 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1889768941' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config generate-minimal-conf"} v 0)
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "auth get", "entity": "client.admin"} v 0)
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [INF] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/cephadm/osd_remove_queue}] v 0)
Feb 01 10:08:54 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] update: starting ev 2ca004ce-1bb0-4311-b29d-02d3704ef94b (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:08:54 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] complete: finished ev 2ca004ce-1bb0-4311-b29d-02d3704ef94b (Updating node-proxy deployment (+3 -> 3))
Feb 01 10:08:54 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Completed event 2ca004ce-1bb0-4311-b29d-02d3704ef94b (Updating node-proxy deployment (+3 -> 3)) in 0 seconds
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "states": ["destroyed"], "format": "json"} v 0)
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.59965 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49374 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v798: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69629 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain sudo[321123]: ceph-admin : PWD=/home/ceph-admin ; USER=root ; COMMAND=/bin/ls /etc/sysctl.d
Feb 01 10:08:54 np0005604215.localdomain sudo[321123]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=1002)
Feb 01 10:08:54 np0005604215.localdomain sudo[321123]: pam_unix(sudo:session): session closed for user root
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} v 0)
Feb 01 10:08:54 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3277300199' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 01 10:08:54 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.59980 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:54 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:54.955 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69647 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49389 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr dump"} v 0)
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3667816937' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69665 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69671 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49401 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "config generate-minimal-conf"} : dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "auth get", "entity": "client.admin"} : dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3518407237' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "audit"} : dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 172.18.0.108:0/3988766652' entity='mgr.np0005604215.uhhqtv' cmd={"prefix": "osd tree", "states": ["destroyed"], "format": "json"} : dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.59965 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.49374 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: pgmap v798: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.69629 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3277300199' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/4032511149' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.59980 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1626695449' entity='client.admin' cmd={"prefix": "log last", "num": 10000, "level": "debug", "channel": "cluster"} : dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.69647 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.49389 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3667816937' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.69665 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/263854055' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:49.300890+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 729088 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 88 ms_handle_refused con 0x557982421000 session 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:50.301082+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 88 heartbeat osd_stat(store_statfs(0x1ba08e000/0x0/0x1bfc00000, data 0x197e922/0x19ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 729088 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:51.301228+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 729088 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:52.301428+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 729088 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:53.301612+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 729088 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 824743 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 45
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now 
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc reconnect Terminating session with v2:172.18.0.105:6800/155238379
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc reconnect No active mgr available yet
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 72.262588501s of 72.291702271s, submitted: 6
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:54.301744+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c9800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19810b0/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94978048 unmapped: 778240 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19810b0/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:55.301894+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 46
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.200:6800/2795591711,v1:172.18.0.200:6801/2795591711]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc reconnect Starting new session with [v2:172.18.0.200:6800/2795591711,v1:172.18.0.200:6801/2795591711]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: get_auth_request con 0x557982dd2800 auth_method 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_configure stats_period=5
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95100928 unmapped: 655360 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:56.302032+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95100928 unmapped: 655360 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:57.302189+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 47
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.200:6800/2795591711,v1:172.18.0.200:6801/2795591711]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 663552 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:58.302347+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 663552 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 827567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:59.302473+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 663552 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:00.302587+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 663552 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_monmap mon_map magic: 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient:  got monmap 14 from mon.np0005604215 (according to old e14)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: dump:
                                                          epoch 14
                                                          fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
                                                          last_changed 2026-02-01T09:47:31.128772+0000
                                                          created 2026-02-01T07:37:52.883666+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: mon.np0005604215 at [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] went away
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _reopen_session rank -1
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _add_conns ranks=[0,1]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): picked mon.np0005604212 con 0x557982bbf800 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): picked mon.np0005604213 con 0x557982d29400 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): start opening mon connection
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): start opening mon connection
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): _finish_auth 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): get_auth_request con 0x557982bbf800 auth_method 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): _init_auth method 2
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): _init_auth already have auth, reseting
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): get_auth_request con 0x557982d29400 auth_method 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): _init_auth method 2
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): _init_auth already have auth, reseting
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more payload 9
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more payload 9
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_done global_id 24224 payload 293
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _finish_hunting 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: found mon.np0005604213
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604213 at v2:172.18.0.104:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _finish_auth 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:01.142783+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604213 at v2:172.18.0.104:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: ms_handle_reset current mon [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _reopen_session rank -1
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _add_conns ranks=[0,1]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): picked mon.np0005604212 con 0x557982bbf800 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): picked mon.np0005604213 con 0x557982d28000 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): start opening mon connection
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): start opening mon connection
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 ms_handle_reset con 0x557982d29400 session 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): get_auth_request con 0x557982d28000 auth_method 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): _init_auth method 2
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): _init_auth already have auth, reseting
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more payload 9
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): get_auth_request con 0x557982bbf800 auth_method 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): _init_auth method 2
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): _init_auth already have auth, reseting
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more payload 9
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient(hunting): handle_auth_done global_id 24224 payload 293
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _finish_hunting 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: found mon.np0005604212
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _finish_auth 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:01.151425+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_monmap mon_map magic: 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient:  got monmap 14 from mon.np0005604212 (according to old e14)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: dump:
                                                          epoch 14
                                                          fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
                                                          last_changed 2026-02-01T09:47:31.128772+0000
                                                          created 2026-02-01T07:37:52.883666+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_config config(7 keys)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: set_mon_vals no callback set
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 47
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.200:6800/2795591711,v1:172.18.0.200:6801/2795591711]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:01.302696+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:02.302825+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:03.303014+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 827567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:04.303196+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:05.303407+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:06.303573+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:07.303791+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:08.303973+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 827567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:09.304217+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:10.304403+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:11.304574+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:12.304749+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:13.304997+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 827567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:14.305119+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:15.305239+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:16.305371+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:17.305544+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:18.305691+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 827567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:19.305846+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:20.305987+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_monmap mon_map magic: 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient:  got monmap 15 from mon.np0005604212 (according to old e15)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: dump:
                                                          epoch 15
                                                          fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
                                                          last_changed 2026-02-01T09:47:50.388496+0000
                                                          created 2026-02-01T07:37:52.883666+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
                                                          2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604215
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:21.306145+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:22.306323+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:23.306514+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 827567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:24.306695+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:25.306853+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:26.306995+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:27.307193+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:28.307363+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 827567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:29.307495+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:30.307684+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 heartbeat osd_stat(store_statfs(0x1ba08a000/0x0/0x1bfc00000, data 0x19813fa/0x1a03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:31.307856+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:32.308012+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:33.308188+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 737280 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:34.308358+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 827567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 48
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now 
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc reconnect Terminating session with v2:172.18.0.200:6800/2795591711
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc reconnect No active mgr available yet
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 40.252487183s of 40.276077271s, submitted: 6
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 ms_handle_reset con 0x5579842c9800 session 0x5579829d7860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55797ff34400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95232000 unmapped: 524288 heap: 95756288 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:35.308496+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 49
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/1840881422,v1:172.18.0.107:6811/1840881422]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc reconnect Starting new session with [v2:172.18.0.107:6810/1840881422,v1:172.18.0.107:6811/1840881422]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: get_auth_request con 0x557982d29400 auth_method 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_configure stats_period=5
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95379456 unmapped: 1425408 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:36.308656+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95379456 unmapped: 1425408 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 50
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/1840881422,v1:172.18.0.107:6811/1840881422]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:37.308863+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95232000 unmapped: 1572864 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:38.309013+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95232000 unmapped: 1572864 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 51
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/1840881422,v1:172.18.0.107:6811/1840881422]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:39.309153+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95379456 unmapped: 1425408 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:40.309323+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 52
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/1840881422,v1:172.18.0.107:6811/1840881422]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:41.309458+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:42.309791+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:43.309976+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:44.310147+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:45.310354+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:46.310612+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:47.310790+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:48.310950+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:49.311116+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:50.311283+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:51.311667+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:52.311876+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:53.312139+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:54.312381+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:55.312711+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:56.313347+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:57.313915+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:58.314359+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:59.314714+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:00.314983+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:01.315211+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:02.315382+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:03.315750+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:04.316078+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:05.316368+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:06.316604+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:07.316895+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:08.317119+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:09.317362+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:10.317527+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:11.317763+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:12.318789+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:13.319229+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:14.319769+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:15.320413+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:16.320698+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:17.321211+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:18.321396+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:19.322153+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 301989888 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:20.322806+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:21.323158+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:22.323346+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:23.323519+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:24.323715+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:25.324037+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:26.324352+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:27.324590+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:28.324742+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:29.324936+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:30.325409+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:31.325630+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:32.325779+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:33.325941+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:34.326135+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:35.326414+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:36.326633+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:37.326915+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:38.327163+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:39.327678+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:40.328048+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:41.328334+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:42.328690+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:43.328904+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:44.329726+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:45.330181+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:46.330965+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:47.331548+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:48.331823+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:49.332025+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 830567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:50.332454+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95092736 unmapped: 1712128 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 53
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now 
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/1840881422
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc reconnect No active mgr available yet
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 90 handle_osd_map epochs [91,91], i have 90, src has [1,91]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 76.688491821s of 76.714271545s, submitted: 7
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 ms_handle_reset con 0x55797ff34400 session 0x55798240cb40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:51.332598+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d28000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95232000 unmapped: 1572864 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba086000/0x0/0x1bfc00000, data 0x198402a/0x1a07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 54
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: get_auth_request con 0x557982d64800 auth_method 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_configure stats_period=5
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:52.332740+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94699520 unmapped: 2105344 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:53.332908+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 55
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94699520 unmapped: 2105344 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:54.333069+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94699520 unmapped: 2105344 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:55.333233+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94699520 unmapped: 2105344 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 56
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:56.333386+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94699520 unmapped: 2105344 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 57
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:57.333562+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:58.333793+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:59.333985+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:00.334189+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:01.334382+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:02.334543+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:03.334753+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:04.334935+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:05.335203+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:06.335361+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:07.335594+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:08.335774+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:09.336017+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:10.336262+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:11.336470+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:12.336637+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:13.336833+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:14.337060+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:15.337356+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:16.337532+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:17.337744+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:18.337982+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:19.338175+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:20.338359+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:21.338510+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:22.338664+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:23.338856+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:24.339033+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:25.339227+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:26.339474+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:27.339685+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:28.339874+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:29.340115+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:30.340357+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:31.340566+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:32.340755+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:33.341096+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:34.341270+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:35.341538+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:36.341737+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:37.341909+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:38.342020+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:39.342250+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:40.342430+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:41.342655+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:42.342812+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:43.343005+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:44.343215+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:45.343434+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:46.343637+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:47.343822+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:48.344021+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:49.344262+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:50.344444+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:51.344621+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:52.344781+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:53.344961+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:54.345135+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:55.345373+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:56.345573+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:57.345810+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:58.346024+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:59.346170+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:00.346344+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:01.346500+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:02.346650+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:03.346844+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:04.347021+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:05.347162+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:06.347361+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:07.347596+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:08.347753+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:09.347934+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:10.348112+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:11.348255+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:12.348430+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:13.348637+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:14.348841+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:15.349037+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:16.349185+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:17.349365+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:18.349508+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986db2/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:19.349688+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 833567 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5426 writes, 23K keys, 5426 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5426 writes, 740 syncs, 7.33 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 109 writes, 398 keys, 109 commit groups, 1.0 writes per commit group, ingest: 0.49 MB, 0.00 MB/s
                                                          Interval WAL: 109 writes, 47 syncs, 2.32 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:20.349831+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:21.349993+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 90.121688843s of 90.145111084s, submitted: 7
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba083000/0x0/0x1bfc00000, data 0x1986ecc/0x1a0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:22.350118+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:23.350261+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 58
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94617600 unmapped: 2187264 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:24.350438+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 94625792 unmapped: 2179072 heap: 96804864 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 834220 data_alloc: 285212672 data_used: 13291520
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:25.350601+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 heartbeat osd_stat(store_statfs(0x1ba082000/0x0/0x1bfc00000, data 0x1986eea/0x1a0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 91 handle_osd_map epochs [91,92], i have 91, src has [1,92]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 92 handle_osd_map epochs [92,92], i have 92, src has [1,92]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95223808 unmapped: 17317888 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:26.350767+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95223808 unmapped: 17317888 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:27.350934+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95223808 unmapped: 17317888 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:28.351092+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95223808 unmapped: 17317888 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:29.351256+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95019008 unmapped: 17522688 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 92 handle_osd_map epochs [93,93], i have 92, src has [1,93]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:30.351452+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:31.351610+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:32.373492+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:33.373650+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:34.373895+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:35.374085+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:36.374266+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:37.374482+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:38.374602+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:39.374748+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:40.374881+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:41.375034+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:42.375171+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:43.375337+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:44.375538+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:45.375716+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:46.375910+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:47.376097+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:48.376326+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:49.376521+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:50.376747+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:51.376952+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:52.377147+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:53.377617+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:54.377874+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:55.378083+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:56.378269+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:57.378528+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:58.378725+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:59.378936+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:00.379130+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:01.379398+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:02.379639+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:03.379840+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:04.379995+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:05.380201+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:06.380374+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:07.380571+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:08.380736+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:09.380930+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:10.381125+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:11.381273+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:12.381469+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:13.381734+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:14.382001+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:15.382198+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:16.382354+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:17.382592+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:18.382769+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:19.382963+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:20.383082+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x557982cd0400 session 0x5579827a6d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccec00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:21.383214+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:22.383344+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:23.383471+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:24.383628+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:25.383813+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:26.383974+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:27.384189+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:28.384361+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:29.384526+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:30.384761+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:31.384940+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:32.385175+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:33.385335+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:34.385472+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:35.385705+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:36.385853+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:37.386091+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:38.386344+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:39.386544+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:40.386714+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:41.386900+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:42.387087+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b9078000/0x0/0x1bfc00000, data 0x298c14e/0x2a15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:43.387248+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x557982a10400 session 0x5579811f8780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:44.387488+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x5579810c0000 session 0x5579823f81e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a12000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x557982a12000 session 0x5579811f9a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:45.387651+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 952764 data_alloc: 285212672 data_used: 13316096
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 95027200 unmapped: 17514496 heap: 112541696 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55797ff34400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x55797ff34400 session 0x5579811f81e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 84.842269897s of 84.957778931s, submitted: 18
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:46.387833+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x5579810c0000 session 0x557983d56960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b80b2000/0x0/0x1bfc00000, data 0x364f1bf/0x36dc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x557982a10400 session 0x557983d56f00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d29400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x557982d29400 session 0x55798240c960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a11400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x557982a11400 session 0x5579823db4a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55797ff34400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x55797ff34400 session 0x5579823a1e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x5579810c0000 session 0x557983d561e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 96436224 unmapped: 31932416 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:47.388011+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cd1000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x557982cd1000 session 0x557983d56d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 96452608 unmapped: 31916032 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:48.388199+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844db000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x5579844db000 session 0x55798048ab40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982bbec00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x557982bbec00 session 0x55798048b860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55797ff34400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x55797ff34400 session 0x5579823a1c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 96280576 unmapped: 32088064 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:49.389051+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 96403456 unmapped: 31965184 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:50.389256+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b6f2a000/0x0/0x1bfc00000, data 0x4ad7231/0x4b64000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cd1000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1233082 data_alloc: 285212672 data_used: 13774848
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 105037824 unmapped: 23330816 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:51.389396+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b6f05000/0x0/0x1bfc00000, data 0x4afb254/0x4b89000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 109617152 unmapped: 18751488 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:52.389534+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 110256128 unmapped: 18112512 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:53.389712+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 110256128 unmapped: 18112512 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:54.389942+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 110256128 unmapped: 18112512 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:55.390130+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1348574 data_alloc: 301989888 data_used: 27795456
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b6f05000/0x0/0x1bfc00000, data 0x4afb254/0x4b89000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 110338048 unmapped: 18030592 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:56.390315+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 110338048 unmapped: 18030592 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:57.390510+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b6f05000/0x0/0x1bfc00000, data 0x4afb254/0x4b89000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 110354432 unmapped: 18014208 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:58.390956+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 12.335810661s of 12.898536682s, submitted: 112
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 112050176 unmapped: 16318464 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:59.391130+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122732544 unmapped: 5636096 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:00.391379+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1539398 data_alloc: 301989888 data_used: 29134848
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122380288 unmapped: 5988352 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:01.391479+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122658816 unmapped: 5709824 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:02.391647+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cce400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b5232000/0x0/0x1bfc00000, data 0x67c0254/0x684e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 ms_handle_reset con 0x557982cd1000 session 0x557983d574a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:03.391781+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122839040 unmapped: 5529600 heap: 128368640 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 heartbeat osd_stat(store_statfs(0x1b51a1000/0x0/0x1bfc00000, data 0x684f287/0x68df000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 93 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 94 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 94 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 94 ms_handle_reset con 0x557982421c00 session 0x5579813325a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 94 heartbeat osd_stat(store_statfs(0x1b51a1000/0x0/0x1bfc00000, data 0x684f287/0x68df000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 94 ms_handle_reset con 0x5579810c0000 session 0x55798137a960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:04.391982+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122642432 unmapped: 6774784 heap: 129417216 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:05.392133+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122650624 unmapped: 6766592 heap: 129417216 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1598031 data_alloc: 301989888 data_used: 29900800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 94 handle_osd_map epochs [95,95], i have 94, src has [1,95]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:06.392346+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124002304 unmapped: 5414912 heap: 129417216 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 95 ms_handle_reset con 0x557982389c00 session 0x5579823a1c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 95 ms_handle_reset con 0x557982a10400 session 0x557980b95860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55797ff34400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:07.392509+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 132259840 unmapped: 14770176 heap: 147030016 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 96 ms_handle_reset con 0x55797ff34400 session 0x5579823e83c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 96 handle_osd_map epochs [96,96], i have 96, src has [1,96]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:08.392674+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 132284416 unmapped: 14745600 heap: 147030016 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 96 heartbeat osd_stat(store_statfs(0x1b43c1000/0x0/0x1bfc00000, data 0x7639fc4/0x76cc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 96 handle_osd_map epochs [97,97], i have 96, src has [1,97]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.539793968s of 10.022453308s, submitted: 439
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:09.392825+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 132390912 unmapped: 14639104 heap: 147030016 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 97 handle_osd_map epochs [97,97], i have 97, src has [1,97]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 97 ms_handle_reset con 0x557982cce400 session 0x557980b963c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b439b000/0x0/0x1bfc00000, data 0x765d91a/0x76f1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d9000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:10.392947+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120242176 unmapped: 26787840 heap: 147030016 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1123107 data_alloc: 285212672 data_used: 13352960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 97 ms_handle_reset con 0x557982420c00 session 0x5579823e9c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 97 ms_handle_reset con 0x5579844d9000 session 0x5579823f8f00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 97 heartbeat osd_stat(store_statfs(0x1b8287000/0x0/0x1bfc00000, data 0x3776837/0x3806000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:11.393108+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 112566272 unmapped: 34463744 heap: 147030016 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 97 handle_osd_map epochs [98,98], i have 97, src has [1,98]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:12.393415+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 112566272 unmapped: 34463744 heap: 147030016 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:13.393569+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 112566272 unmapped: 34463744 heap: 147030016 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c17c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 98 ms_handle_reset con 0x557982c17c00 session 0x557983f41860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 98 heartbeat osd_stat(store_statfs(0x1b82ca000/0x0/0x1bfc00000, data 0x3731034/0x37c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d65000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 98 ms_handle_reset con 0x557982d65000 session 0x557983f41c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557981272000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 98 ms_handle_reset con 0x557981272000 session 0x557983f41a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c8000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:14.393715+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 98 ms_handle_reset con 0x5579842c8000 session 0x557983f40d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 112574464 unmapped: 34455552 heap: 147030016 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d65800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 98 heartbeat osd_stat(store_statfs(0x1b82cc000/0x0/0x1bfc00000, data 0x3731044/0x37c2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:15.393844+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 114843648 unmapped: 38313984 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 98 ms_handle_reset con 0x557982d65800 session 0x557983f40b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1342992 data_alloc: 285212672 data_used: 13381632
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557981272000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 98 ms_handle_reset con 0x557981272000 session 0x557980b97a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:16.394128+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 114900992 unmapped: 38256640 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 98 handle_osd_map epochs [99,99], i have 98, src has [1,99]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c17c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 99 ms_handle_reset con 0x557982c17c00 session 0x557981612b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d65000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 99 ms_handle_reset con 0x557982d65000 session 0x5579827beb40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c8000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 99 ms_handle_reset con 0x5579842c8000 session 0x557983f410e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:17.394366+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 114941952 unmapped: 38215680 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 99 handle_osd_map epochs [99,99], i have 99, src has [1,99]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c8800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 99 heartbeat osd_stat(store_statfs(0x1b6759000/0x0/0x1bfc00000, data 0x52a1848/0x5334000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c16400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 99 handle_osd_map epochs [100,100], i have 99, src has [1,100]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:18.394525+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 115007488 unmapped: 38150144 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 100 ms_handle_reset con 0x5579842c8800 session 0x557983df8f00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:19.394707+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 114597888 unmapped: 38559744 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:20.394826+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 114597888 unmapped: 38559744 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1288091 data_alloc: 301989888 data_used: 19722240
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:21.394975+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 114597888 unmapped: 38559744 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 100 heartbeat osd_stat(store_statfs(0x1b74ee000/0x0/0x1bfc00000, data 0x450c1ba/0x45a0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.919137001s of 13.022070885s, submitted: 243
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:22.395116+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55797ff34400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122388480 unmapped: 30769152 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 100 ms_handle_reset con 0x55797ff34400 session 0x557981612000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:23.395270+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 117276672 unmapped: 35880960 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:24.395424+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 117284864 unmapped: 35872768 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c16000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 100 ms_handle_reset con 0x557982c16000 session 0x5579816e21e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ce2000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 100 ms_handle_reset con 0x557982ce2000 session 0x5579816e30e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:25.395572+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 117284864 unmapped: 35872768 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1420739 data_alloc: 301989888 data_used: 19726336
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982dd2000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 100 ms_handle_reset con 0x557982dd2000 session 0x5579816e2b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c17400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 100 ms_handle_reset con 0x557982c17400 session 0x557983f7a5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55797ff34400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c16000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:26.395730+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 117342208 unmapped: 35815424 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 100 handle_osd_map epochs [101,101], i have 100, src has [1,101]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:27.395895+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 101 heartbeat osd_stat(store_statfs(0x1b648e000/0x0/0x1bfc00000, data 0x55679de/0x55ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 117415936 unmapped: 35741696 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:28.396111+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 117743616 unmapped: 35414016 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 101 heartbeat osd_stat(store_statfs(0x1b648e000/0x0/0x1bfc00000, data 0x55679de/0x55ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:29.396276+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 117743616 unmapped: 35414016 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:30.396477+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 117735424 unmapped: 35422208 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1438904 data_alloc: 301989888 data_used: 20779008
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:31.396670+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 101 heartbeat osd_stat(store_statfs(0x1b648b000/0x0/0x1bfc00000, data 0x556b9de/0x5603000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 117751808 unmapped: 35405824 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 101 ms_handle_reset con 0x557982c16400 session 0x55798136c780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 101 ms_handle_reset con 0x5579842c9400 session 0x557983f403c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:32.396859+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 117760000 unmapped: 35397632 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.501235008s of 10.836386681s, submitted: 70
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:33.397043+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 101 handle_osd_map epochs [101,102], i have 101, src has [1,102]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a13400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 102 ms_handle_reset con 0x557982a13400 session 0x5579823a45a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 117784576 unmapped: 35373056 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 102 ms_handle_reset con 0x557982389800 session 0x5579823a4000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d29c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d65000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a11000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:34.397179+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 102 ms_handle_reset con 0x55797ff34400 session 0x557981333e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 102 ms_handle_reset con 0x557982c16000 session 0x557983f7a780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 102 ms_handle_reset con 0x557982d29c00 session 0x5579816e23c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121479168 unmapped: 31678464 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:35.397370+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 103 heartbeat osd_stat(store_statfs(0x1b541e000/0x0/0x1bfc00000, data 0x65d72fc/0x6670000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 103 ms_handle_reset con 0x557982389800 session 0x557983f7a5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121593856 unmapped: 31563776 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1574782 data_alloc: 301989888 data_used: 24174592
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:36.397553+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 103 handle_osd_map epochs [104,104], i have 103, src has [1,104]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121667584 unmapped: 31490048 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:37.397748+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130711552 unmapped: 22446080 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982dd3000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:38.397916+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 104 ms_handle_reset con 0x557982dd3000 session 0x5579816e30e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121864192 unmapped: 31293440 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:39.398070+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122470400 unmapped: 30687232 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:40.398219+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122470400 unmapped: 30687232 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 104 ms_handle_reset con 0x557982d65000 session 0x557983f7be00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 104 ms_handle_reset con 0x557982a11000 session 0x5579811381e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1469332 data_alloc: 285212672 data_used: 13590528
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 104 heartbeat osd_stat(store_statfs(0x1b5823000/0x0/0x1bfc00000, data 0x5d6b5fa/0x5e07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 104 heartbeat osd_stat(store_statfs(0x1b5823000/0x0/0x1bfc00000, data 0x5d6b5fa/0x5e07000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:41.398443+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 104 ms_handle_reset con 0x557982389800 session 0x5579827a7e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121372672 unmapped: 31784960 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c16000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 104 handle_osd_map epochs [105,105], i have 104, src has [1,105]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.10] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 ms_handle_reset con 0x557982c16000 session 0x5579853ce000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d29c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 ms_handle_reset con 0x557982d29c00 session 0x5579853ce1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982dd3000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 ms_handle_reset con 0x557982dd3000 session 0x5579853ce3c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[2.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 ms_handle_reset con 0x557982389800 session 0x5579853ce5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:42.398585+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121421824 unmapped: 31735808 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a12800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.639682770s of 10.038187027s, submitted: 362
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 ms_handle_reset con 0x557982a12800 session 0x5579853ce780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:43.399483+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 123297792 unmapped: 29859840 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 heartbeat osd_stat(store_statfs(0x1b7fdc000/0x0/0x1bfc00000, data 0x3a13e50/0x3ab1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e5c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 ms_handle_reset con 0x5579823e5c00 session 0x557983f405a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c8c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:44.399624+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 ms_handle_reset con 0x5579842c8c00 session 0x55798277f0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 123297792 unmapped: 29859840 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 ms_handle_reset con 0x557982421c00 session 0x5579823a1e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 ms_handle_reset con 0x557982389800 session 0x5579853ceb40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e5c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:45.399803+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122814464 unmapped: 30343168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1368421 data_alloc: 285212672 data_used: 13430784
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a12800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:46.400017+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122839040 unmapped: 30318592 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 106 ms_handle_reset con 0x557982421c00 session 0x5579823e83c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:47.400269+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122060800 unmapped: 31096832 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 106 heartbeat osd_stat(store_statfs(0x1b7d31000/0x0/0x1bfc00000, data 0x3cbe783/0x3d5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:48.400513+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122060800 unmapped: 31096832 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 106 ms_handle_reset con 0x557982a12800 session 0x55798528c780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 106 ms_handle_reset con 0x5579823e5c00 session 0x5579853ced20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 106 heartbeat osd_stat(store_statfs(0x1b7d31000/0x0/0x1bfc00000, data 0x3cbe783/0x3d5d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982dd3400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:49.400659+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120741888 unmapped: 32415744 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 106 ms_handle_reset con 0x557982dd3400 session 0x5579853cf0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 106 heartbeat osd_stat(store_statfs(0x1b901f000/0x0/0x1bfc00000, data 0x29ad711/0x2a4a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:50.400851+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120766464 unmapped: 32391168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1105687 data_alloc: 285212672 data_used: 13438976
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:51.401083+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120766464 unmapped: 32391168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:52.401339+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120766464 unmapped: 32391168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:53.401525+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b903f000/0x0/0x1bfc00000, data 0x29afef2/0x2a4d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120766464 unmapped: 32391168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:54.401759+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120766464 unmapped: 32391168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:55.402068+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120766464 unmapped: 32391168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1110369 data_alloc: 285212672 data_used: 13463552
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:56.402277+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120766464 unmapped: 32391168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:57.402510+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b903f000/0x0/0x1bfc00000, data 0x29afef2/0x2a4d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120766464 unmapped: 32391168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:58.402731+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120766464 unmapped: 32391168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:59.402929+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120766464 unmapped: 32391168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:00.403147+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 107 heartbeat osd_stat(store_statfs(0x1b903f000/0x0/0x1bfc00000, data 0x29afef2/0x2a4d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120766464 unmapped: 32391168 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1110369 data_alloc: 285212672 data_used: 13463552
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 17.558017731s of 17.956705093s, submitted: 98
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ce2000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 107 handle_osd_map epochs [107,108], i have 107, src has [1,108]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 108 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:01.403281+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 108 ms_handle_reset con 0x557982ce2000 session 0x5579853cf860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120537088 unmapped: 32620544 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:02.403428+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120537088 unmapped: 32620544 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:03.403622+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 108 handle_osd_map epochs [109,109], i have 108, src has [1,109]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120569856 unmapped: 32587776 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 109 ms_handle_reset con 0x557982e36800 session 0x5579853cfa40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:04.403780+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120569856 unmapped: 32587776 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 109 heartbeat osd_stat(store_statfs(0x1b8c38000/0x0/0x1bfc00000, data 0x29b5182/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:05.403965+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120569856 unmapped: 32587776 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1119587 data_alloc: 285212672 data_used: 13488128
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 109 heartbeat osd_stat(store_statfs(0x1b8c38000/0x0/0x1bfc00000, data 0x29b5182/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:06.404191+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120569856 unmapped: 32587776 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 109 heartbeat osd_stat(store_statfs(0x1b8c38000/0x0/0x1bfc00000, data 0x29b5182/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 109 heartbeat osd_stat(store_statfs(0x1b8c38000/0x0/0x1bfc00000, data 0x29b5182/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:07.404410+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120569856 unmapped: 32587776 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:08.404663+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120569856 unmapped: 32587776 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:09.404853+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120569856 unmapped: 32587776 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:10.405056+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120569856 unmapped: 32587776 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1119587 data_alloc: 285212672 data_used: 13488128
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 109 heartbeat osd_stat(store_statfs(0x1b8c38000/0x0/0x1bfc00000, data 0x29b5182/0x2a55000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:11.405246+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.216014862s of 10.422396660s, submitted: 55
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120496128 unmapped: 32661504 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:12.405414+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120496128 unmapped: 32661504 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:13.405577+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120504320 unmapped: 32653312 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:14.405756+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120520704 unmapped: 32636928 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 110 heartbeat osd_stat(store_statfs(0x1b8c34000/0x0/0x1bfc00000, data 0x29b7986/0x2a59000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:15.405939+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120520704 unmapped: 32636928 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1122589 data_alloc: 285212672 data_used: 13488128
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:16.406139+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120520704 unmapped: 32636928 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:17.406362+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120520704 unmapped: 32636928 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:18.406577+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120520704 unmapped: 32636928 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 110 heartbeat osd_stat(store_statfs(0x1b8c34000/0x0/0x1bfc00000, data 0x29b7986/0x2a59000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:19.406744+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120520704 unmapped: 32636928 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:20.406928+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120528896 unmapped: 32628736 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1122589 data_alloc: 285212672 data_used: 13488128
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:21.407075+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120537088 unmapped: 32620544 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 110 heartbeat osd_stat(store_statfs(0x1b8c34000/0x0/0x1bfc00000, data 0x29b7986/0x2a59000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:22.407245+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccdc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.947183609s of 10.974117279s, submitted: 14
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120537088 unmapped: 32620544 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:23.407453+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 110 heartbeat osd_stat(store_statfs(0x1b8c34000/0x0/0x1bfc00000, data 0x29b7986/0x2a59000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 110 handle_osd_map epochs [111,111], i have 110, src has [1,111]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 110 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 111 ms_handle_reset con 0x557982ccdc00 session 0x5579853cfe00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120545280 unmapped: 32612352 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:24.407604+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120545280 unmapped: 32612352 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:25.407758+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120545280 unmapped: 32612352 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1127539 data_alloc: 285212672 data_used: 13500416
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:26.407908+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 111 heartbeat osd_stat(store_statfs(0x1b8c2f000/0x0/0x1bfc00000, data 0x29ba6a4/0x2a5e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120553472 unmapped: 32604160 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:27.408132+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120553472 unmapped: 32604160 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccf800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 111 handle_osd_map epochs [112,112], i have 111, src has [1,112]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 112 ms_handle_reset con 0x557982ccf800 session 0x557983f40960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:28.408277+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120504320 unmapped: 32653312 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 112 heartbeat osd_stat(store_statfs(0x1b8c2c000/0x0/0x1bfc00000, data 0x29bcc16/0x2a61000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:29.408502+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120504320 unmapped: 32653312 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:30.408696+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120504320 unmapped: 32653312 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1129833 data_alloc: 285212672 data_used: 13500416
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 112 heartbeat osd_stat(store_statfs(0x1b8c2c000/0x0/0x1bfc00000, data 0x29bcc16/0x2a61000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:31.408834+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120504320 unmapped: 32653312 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:32.409033+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 112 heartbeat osd_stat(store_statfs(0x1b8c2c000/0x0/0x1bfc00000, data 0x29bcc16/0x2a61000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120504320 unmapped: 32653312 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:33.409212+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120504320 unmapped: 32653312 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:34.409360+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120504320 unmapped: 32653312 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:35.409598+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120504320 unmapped: 32653312 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1129833 data_alloc: 285212672 data_used: 13500416
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:36.410024+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 112 handle_osd_map epochs [113,113], i have 112, src has [1,113]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 13.972243309s of 14.050683022s, submitted: 59
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120520704 unmapped: 32636928 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:37.410525+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120520704 unmapped: 32636928 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 113 heartbeat osd_stat(store_statfs(0x1b8c28000/0x0/0x1bfc00000, data 0x29bf41a/0x2a65000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:38.410994+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120520704 unmapped: 32636928 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:39.411418+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120528896 unmapped: 32628736 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:40.411690+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120528896 unmapped: 32628736 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1132835 data_alloc: 285212672 data_used: 13500416
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:41.411931+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120528896 unmapped: 32628736 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:42.412383+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 113 heartbeat osd_stat(store_statfs(0x1b8c28000/0x0/0x1bfc00000, data 0x29bf41a/0x2a65000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120528896 unmapped: 32628736 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:43.412673+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120528896 unmapped: 32628736 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982388c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:44.412960+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120537088 unmapped: 32620544 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 113 handle_osd_map epochs [114,114], i have 113, src has [1,114]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 114 ms_handle_reset con 0x557982388c00 session 0x557983f410e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:45.413202+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120561664 unmapped: 32595968 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1139777 data_alloc: 285212672 data_used: 13512704
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:46.413426+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982388c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.998906136s of 10.088939667s, submitted: 25
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120569856 unmapped: 32587776 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 114 heartbeat osd_stat(store_statfs(0x1b8c20000/0x0/0x1bfc00000, data 0x29c218e/0x2a6d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:47.413649+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 114 handle_osd_map epochs [115,115], i have 114, src has [1,115]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 115 ms_handle_reset con 0x557982388c00 session 0x55798236b2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120569856 unmapped: 32587776 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 115 ms_handle_reset con 0x557982420800 session 0x55798137b680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557980b5a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 115 ms_handle_reset con 0x557980b5a000 session 0x5579823db0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:48.414002+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 115 heartbeat osd_stat(store_statfs(0x1b8c1c000/0x0/0x1bfc00000, data 0x29c4aac/0x2a71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 115 handle_osd_map epochs [116,116], i have 115, src has [1,116]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120659968 unmapped: 32497664 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 116 ms_handle_reset con 0x5579842c9400 session 0x55798528cd20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982dd2800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:49.414243+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 116 handle_osd_map epochs [117,117], i have 116, src has [1,117]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120684544 unmapped: 32473088 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 117 ms_handle_reset con 0x557982dd2800 session 0x55798528c000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:50.414458+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120709120 unmapped: 32448512 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1148868 data_alloc: 285212672 data_used: 13512704
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:51.414629+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 117 heartbeat osd_stat(store_statfs(0x1b8c19000/0x0/0x1bfc00000, data 0x29c993a/0x2a75000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 120709120 unmapped: 32448512 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557980b5a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 117 ms_handle_reset con 0x557980b5a000 session 0x55798528c1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:52.414763+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 31277056 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:53.415030+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 31277056 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 117 heartbeat osd_stat(store_statfs(0x1b8228000/0x0/0x1bfc00000, data 0x33b994a/0x3466000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982dd2000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:54.415335+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121888768 unmapped: 31268864 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 117 handle_osd_map epochs [117,118], i have 117, src has [1,118]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:55.415465+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 118 ms_handle_reset con 0x557982dd2000 session 0x557980b974a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121896960 unmapped: 31260672 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844da000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1240366 data_alloc: 285212672 data_used: 13524992
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 118 ms_handle_reset con 0x5579844da000 session 0x557983df8d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:56.415605+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 118 handle_osd_map epochs [119,119], i have 118, src has [1,119]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121905152 unmapped: 31252480 heap: 153157632 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:57.415756+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.449087143s of 10.956945419s, submitted: 129
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d28400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 131014656 unmapped: 30539776 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:58.415913+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121937920 unmapped: 39616512 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 119 heartbeat osd_stat(store_statfs(0x1b7a19000/0x0/0x1bfc00000, data 0x3bbef12/0x3c74000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:59.416096+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121208832 unmapped: 40345600 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:00.416354+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121233408 unmapped: 40321024 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1355571 data_alloc: 285212672 data_used: 13541376
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844dac00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:01.416506+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121167872 unmapped: 40386560 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:02.416676+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 119 handle_osd_map epochs [120,120], i have 119, src has [1,120]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 120 ms_handle_reset con 0x557982d28400 session 0x557980b97e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121257984 unmapped: 40296448 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:03.416818+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 120 heartbeat osd_stat(store_statfs(0x1b6a15000/0x0/0x1bfc00000, data 0x4bc1832/0x4c78000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121257984 unmapped: 40296448 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982bbe400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:04.416960+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 120 handle_osd_map epochs [121,121], i have 120, src has [1,121]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121274368 unmapped: 40280064 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 121 ms_handle_reset con 0x557982bbe400 session 0x557981332000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557980b5a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:05.417111+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 121 handle_osd_map epochs [121,122], i have 121, src has [1,122]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 122 handle_osd_map epochs [122,122], i have 122, src has [1,122]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121331712 unmapped: 40222720 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1268171 data_alloc: 285212672 data_used: 15204352
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 122 ms_handle_reset con 0x557980b5a000 session 0x55798236a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:06.417277+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121348096 unmapped: 40206336 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:07.417776+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 122 heartbeat osd_stat(store_statfs(0x1b820f000/0x0/0x1bfc00000, data 0x33c66ad/0x347b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121348096 unmapped: 40206336 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:08.418357+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121348096 unmapped: 40206336 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:09.418735+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 121348096 unmapped: 40206336 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:10.419095+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 122 heartbeat osd_stat(store_statfs(0x1b820f000/0x0/0x1bfc00000, data 0x33c66ad/0x347b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 12.750495911s of 13.236248016s, submitted: 90
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 122798080 unmapped: 38756352 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1280708 data_alloc: 285212672 data_used: 15200256
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:11.419314+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126181376 unmapped: 35373056 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:12.419467+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126066688 unmapped: 35487744 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:13.419737+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b77cb000/0x0/0x1bfc00000, data 0x3dfceb1/0x3eb3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126066688 unmapped: 35487744 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:14.419917+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126287872 unmapped: 35266560 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:15.420062+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126287872 unmapped: 35266560 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1365628 data_alloc: 285212672 data_used: 15642624
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:16.420590+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126296064 unmapped: 35258368 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:17.420763+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126296064 unmapped: 35258368 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:18.420924+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b77cb000/0x0/0x1bfc00000, data 0x3dfceb1/0x3eb3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126296064 unmapped: 35258368 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:19.421129+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126296064 unmapped: 35258368 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:20.421326+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126296064 unmapped: 35258368 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1365628 data_alloc: 285212672 data_used: 15642624
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:21.421844+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b77cb000/0x0/0x1bfc00000, data 0x3dfceb1/0x3eb3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 ms_handle_reset con 0x5579844dac00 session 0x557980b963c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126296064 unmapped: 35258368 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a13c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.065101624s of 11.456677437s, submitted: 103
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:22.421980+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 ms_handle_reset con 0x557982a13c00 session 0x557983df9e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124395520 unmapped: 37158912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:23.422274+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124395520 unmapped: 37158912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:24.422549+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124395520 unmapped: 37158912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:25.422705+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124395520 unmapped: 37158912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1190766 data_alloc: 285212672 data_used: 13574144
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:26.422854+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124395520 unmapped: 37158912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b8c00000/0x0/0x1bfc00000, data 0x29d8e7e/0x2a8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:27.423088+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124395520 unmapped: 37158912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:28.423251+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124395520 unmapped: 37158912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:29.423360+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124395520 unmapped: 37158912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b8c00000/0x0/0x1bfc00000, data 0x29d8e7e/0x2a8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:30.423509+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124395520 unmapped: 37158912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1190766 data_alloc: 285212672 data_used: 13574144
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:31.423658+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b8c00000/0x0/0x1bfc00000, data 0x29d8e7e/0x2a8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124403712 unmapped: 37150720 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:32.423889+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124403712 unmapped: 37150720 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:33.424070+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124403712 unmapped: 37150720 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:34.424379+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124403712 unmapped: 37150720 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:35.424531+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b8c00000/0x0/0x1bfc00000, data 0x29d8e7e/0x2a8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124403712 unmapped: 37150720 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1190766 data_alloc: 285212672 data_used: 13574144
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b8c00000/0x0/0x1bfc00000, data 0x29d8e7e/0x2a8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:36.424719+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124403712 unmapped: 37150720 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:37.424881+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124403712 unmapped: 37150720 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:38.425046+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124403712 unmapped: 37150720 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:39.425264+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124411904 unmapped: 37142528 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:40.425420+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b8c00000/0x0/0x1bfc00000, data 0x29d8e7e/0x2a8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b8c00000/0x0/0x1bfc00000, data 0x29d8e7e/0x2a8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124411904 unmapped: 37142528 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1190766 data_alloc: 285212672 data_used: 13574144
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:41.425597+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124411904 unmapped: 37142528 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:42.425766+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124411904 unmapped: 37142528 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:43.425906+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124411904 unmapped: 37142528 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:44.426056+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124411904 unmapped: 37142528 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:45.426235+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124411904 unmapped: 37142528 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1190766 data_alloc: 285212672 data_used: 13574144
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:46.426428+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b8c00000/0x0/0x1bfc00000, data 0x29d8e7e/0x2a8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124411904 unmapped: 37142528 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:47.426660+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124411904 unmapped: 37142528 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:48.426880+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124420096 unmapped: 37134336 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:49.427041+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124428288 unmapped: 37126144 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:50.427209+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b8c00000/0x0/0x1bfc00000, data 0x29d8e7e/0x2a8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124428288 unmapped: 37126144 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1190766 data_alloc: 285212672 data_used: 13574144
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:51.427376+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124428288 unmapped: 37126144 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:52.427516+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124428288 unmapped: 37126144 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 heartbeat osd_stat(store_statfs(0x1b8c00000/0x0/0x1bfc00000, data 0x29d8e7e/0x2a8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:53.427677+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124428288 unmapped: 37126144 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:54.427839+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccf000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124428288 unmapped: 37126144 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 123 handle_osd_map epochs [124,124], i have 123, src has [1,124]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 32.914691925s of 33.030731201s, submitted: 33
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:55.428016+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124436480 unmapped: 37117952 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1194792 data_alloc: 285212672 data_used: 13586432
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:56.428147+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 125 ms_handle_reset con 0x557982ccf000 session 0x557982bc30e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124379136 unmapped: 37175296 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:57.428333+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124379136 unmapped: 37175296 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:58.428473+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 125 handle_osd_map epochs [126,126], i have 125, src has [1,126]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 126 heartbeat osd_stat(store_statfs(0x1b8bf1000/0x0/0x1bfc00000, data 0x29e0aba/0x2a9b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124395520 unmapped: 37158912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:59.428683+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c1c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 126 ms_handle_reset con 0x5579810c1c00 session 0x557982bc2f00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124395520 unmapped: 37158912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:00.429992+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124461056 unmapped: 37093376 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1208845 data_alloc: 285212672 data_used: 13598720
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:01.430155+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557980b5a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124502016 unmapped: 37052416 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 128 handle_osd_map epochs [128,128], i have 128, src has [1,128]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:02.430319+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 128 ms_handle_reset con 0x557980b5a000 session 0x557982bc2d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124502016 unmapped: 37052416 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:03.430481+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124502016 unmapped: 37052416 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:04.430638+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 128 heartbeat osd_stat(store_statfs(0x1b8bed000/0x0/0x1bfc00000, data 0x29e5bea/0x2aa1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124502016 unmapped: 37052416 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:05.430801+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124502016 unmapped: 37052416 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1209855 data_alloc: 285212672 data_used: 13606912
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:06.430979+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 128 handle_osd_map epochs [129,129], i have 128, src has [1,129]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.607960701s of 11.800096512s, submitted: 76
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124510208 unmapped: 37044224 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:07.431179+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124510208 unmapped: 37044224 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:08.431370+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124510208 unmapped: 37044224 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:09.431580+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124510208 unmapped: 37044224 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 129 heartbeat osd_stat(store_statfs(0x1b8be8000/0x0/0x1bfc00000, data 0x29e83ee/0x2aa5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:10.431773+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124510208 unmapped: 37044224 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1214057 data_alloc: 285212672 data_used: 13619200
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:11.431961+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124518400 unmapped: 37036032 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:12.432189+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124518400 unmapped: 37036032 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:13.432458+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 129 heartbeat osd_stat(store_statfs(0x1b8be8000/0x0/0x1bfc00000, data 0x29e83ee/0x2aa5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124518400 unmapped: 37036032 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:14.432662+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124526592 unmapped: 37027840 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:15.433425+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124526592 unmapped: 37027840 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1214057 data_alloc: 285212672 data_used: 13619200
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:16.434353+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124526592 unmapped: 37027840 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:17.434815+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 129 heartbeat osd_stat(store_statfs(0x1b8be8000/0x0/0x1bfc00000, data 0x29e83ee/0x2aa5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124526592 unmapped: 37027840 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:18.435714+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124526592 unmapped: 37027840 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:19.436225+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 129 heartbeat osd_stat(store_statfs(0x1b8be8000/0x0/0x1bfc00000, data 0x29e83ee/0x2aa5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124534784 unmapped: 37019648 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:20.436733+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124534784 unmapped: 37019648 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1214057 data_alloc: 285212672 data_used: 13619200
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 129 heartbeat osd_stat(store_statfs(0x1b8be8000/0x0/0x1bfc00000, data 0x29e83ee/0x2aa5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets getting new tickets!
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:21.437893+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _finish_auth 0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:21.438973+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124534784 unmapped: 37019648 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:22.438115+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 129 heartbeat osd_stat(store_statfs(0x1b8be8000/0x0/0x1bfc00000, data 0x29e83ee/0x2aa5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124534784 unmapped: 37019648 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:23.438842+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124534784 unmapped: 37019648 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:24.439470+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124534784 unmapped: 37019648 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 129 heartbeat osd_stat(store_statfs(0x1b8be8000/0x0/0x1bfc00000, data 0x29e83ee/0x2aa5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:25.439915+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 18.385051727s of 18.411876678s, submitted: 18
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 129 ms_handle_reset con 0x557982a10400 session 0x557982bc2960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124534784 unmapped: 37019648 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1214959 data_alloc: 285212672 data_used: 13619200
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:26.440153+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 129 heartbeat osd_stat(store_statfs(0x1b8be8000/0x0/0x1bfc00000, data 0x29e8450/0x2aa6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124534784 unmapped: 37019648 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:27.440454+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124567552 unmapped: 36986880 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:28.440785+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124567552 unmapped: 36986880 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:29.441164+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124567552 unmapped: 36986880 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:30.441360+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d29c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 129 heartbeat osd_stat(store_statfs(0x1b8be8000/0x0/0x1bfc00000, data 0x29e8450/0x2aa6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 129 handle_osd_map epochs [130,130], i have 129, src has [1,130]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124575744 unmapped: 36978688 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1220729 data_alloc: 285212672 data_used: 13631488
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:31.441622+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 130 handle_osd_map epochs [130,131], i have 130, src has [1,131]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 131 handle_osd_map epochs [131,131], i have 131, src has [1,131]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124600320 unmapped: 36954112 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 131 handle_osd_map epochs [130,131], i have 131, src has [1,131]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 131 ms_handle_reset con 0x557982d29c00 session 0x5579853cf2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:32.441783+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124616704 unmapped: 36937728 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 131 handle_osd_map epochs [132,132], i have 131, src has [1,132]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:33.441903+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 132 handle_osd_map epochs [132,132], i have 132, src has [1,132]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844db000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 132 ms_handle_reset con 0x5579844db000 session 0x5579853ce000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d28400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124649472 unmapped: 36904960 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 132 handle_osd_map epochs [133,133], i have 132, src has [1,133]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:34.442030+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 133 ms_handle_reset con 0x557982d28400 session 0x557983f7ba40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557980b5a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 133 ms_handle_reset con 0x557980b5a000 session 0x557983f7ba40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 133 ms_handle_reset con 0x557982e36400 session 0x5579853ceb40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124698624 unmapped: 36855808 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:35.442170+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 133 heartbeat osd_stat(store_statfs(0x1b8bcf000/0x0/0x1bfc00000, data 0x29f3004/0x2abd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124698624 unmapped: 36855808 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1245138 data_alloc: 285212672 data_used: 13656064
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:36.442348+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.873525620s of 11.041686058s, submitted: 48
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d29c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 133 ms_handle_reset con 0x557982d29c00 session 0x5579853cf2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 133 ms_handle_reset con 0x557982a10400 session 0x5579853ce780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124747776 unmapped: 36806656 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:37.442549+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 133 heartbeat osd_stat(store_statfs(0x1b8bd1000/0x0/0x1bfc00000, data 0x29f3004/0x2abd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d29400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 133 ms_handle_reset con 0x557982d29400 session 0x5579816e21e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124747776 unmapped: 36806656 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:38.442682+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557980b5a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 133 ms_handle_reset con 0x557982a10400 session 0x557982bc2960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124764160 unmapped: 36790272 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:39.442852+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 133 handle_osd_map epochs [134,134], i have 133, src has [1,134]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 133 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d29c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 134 ms_handle_reset con 0x557980b5a000 session 0x557983d565a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 134 ms_handle_reset con 0x557982d29c00 session 0x557980b974a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124829696 unmapped: 36724736 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:40.443018+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124829696 unmapped: 36724736 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1249793 data_alloc: 285212672 data_used: 13672448
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:41.443173+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 134 handle_osd_map epochs [135,135], i have 134, src has [1,135]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 135 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 135 ms_handle_reset con 0x557982e36400 session 0x557983df9680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 135 heartbeat osd_stat(store_statfs(0x1b8bcb000/0x0/0x1bfc00000, data 0x29f59a0/0x2ac2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124878848 unmapped: 36675584 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:42.443375+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987778400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 135 heartbeat osd_stat(store_statfs(0x1b8bca000/0x0/0x1bfc00000, data 0x29f8224/0x2ac3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 124878848 unmapped: 36675584 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:43.443568+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 135 handle_osd_map epochs [135,136], i have 135, src has [1,136]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 136 handle_osd_map epochs [136,136], i have 136, src has [1,136]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 136 ms_handle_reset con 0x557987778400 session 0x55798528cd20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 125386752 unmapped: 36167680 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:44.443779+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557980b5a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 125386752 unmapped: 36167680 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:45.444196+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 136 handle_osd_map epochs [137,137], i have 136, src has [1,137]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 137 ms_handle_reset con 0x557980b5a000 session 0x557983f7a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 137 ms_handle_reset con 0x557982a10400 session 0x55798528c1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 125419520 unmapped: 36134912 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1254040 data_alloc: 285212672 data_used: 13688832
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:46.444366+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 137 handle_osd_map epochs [137,138], i have 137, src has [1,138]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.322729111s of 10.004206657s, submitted: 192
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 125427712 unmapped: 36126720 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:47.444587+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844da000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 138 ms_handle_reset con 0x5579844da000 session 0x557981612960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 138 heartbeat osd_stat(store_statfs(0x1b8bc2000/0x0/0x1bfc00000, data 0x29ff741/0x2aca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:48.444761+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126476288 unmapped: 35078144 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:49.444909+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126476288 unmapped: 35078144 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:50.445776+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126476288 unmapped: 35078144 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a13800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 138 ms_handle_reset con 0x557982a13800 session 0x55798528d4a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:51.446031+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126509056 unmapped: 35045376 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1265616 data_alloc: 285212672 data_used: 13701120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982388c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 139 ms_handle_reset con 0x557982388c00 session 0x55798528de00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557980b5a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 139 handle_osd_map epochs [140,140], i have 139, src has [1,140]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 140 ms_handle_reset con 0x557980b5a000 session 0x557982bc21e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 140 handle_osd_map epochs [139,140], i have 140, src has [1,140]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:52.446338+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126566400 unmapped: 34988032 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 140 heartbeat osd_stat(store_statfs(0x1b8bb5000/0x0/0x1bfc00000, data 0x2a05061/0x2ad7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557985125c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:53.446976+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126574592 unmapped: 34979840 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 140 handle_osd_map epochs [141,141], i have 140, src has [1,141]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 141 heartbeat osd_stat(store_statfs(0x1b8bb5000/0x0/0x1bfc00000, data 0x2a04fff/0x2ad6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 141 ms_handle_reset con 0x557985125c00 session 0x55798236a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:54.447687+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126607360 unmapped: 34947072 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:55.447985+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126607360 unmapped: 34947072 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065d800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 141 handle_osd_map epochs [142,142], i have 141, src has [1,142]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 142 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 142 ms_handle_reset con 0x55798065d800 session 0x557983f40960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:56.448562+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126648320 unmapped: 34906112 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1273740 data_alloc: 285212672 data_used: 13701120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:57.448798+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126648320 unmapped: 34906112 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:58.449027+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065c800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126648320 unmapped: 34906112 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.589912415s of 11.884859085s, submitted: 121
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 142 ms_handle_reset con 0x55798065c800 session 0x557985a105a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:59.449221+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126656512 unmapped: 34897920 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 143 heartbeat osd_stat(store_statfs(0x1b8bb2000/0x0/0x1bfc00000, data 0x2a09d58/0x2adc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:00.449535+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126656512 unmapped: 34897920 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 143 handle_osd_map epochs [144,144], i have 143, src has [1,144]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:01.449690+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126664704 unmapped: 34889728 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1285944 data_alloc: 285212672 data_used: 13725696
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e5000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 144 handle_osd_map epochs [145,145], i have 144, src has [1,145]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 145 ms_handle_reset con 0x5579823e5000 session 0x557985a10780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:02.449857+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126689280 unmapped: 34865152 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:03.450106+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126689280 unmapped: 34865152 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:04.450373+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 145 heartbeat osd_stat(store_statfs(0x1b8ba4000/0x0/0x1bfc00000, data 0x2a11840/0x2ae8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126689280 unmapped: 34865152 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:05.450576+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126713856 unmapped: 34840576 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:06.450761+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126713856 unmapped: 34840576 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1296883 data_alloc: 285212672 data_used: 13737984
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 146 handle_osd_map epochs [146,146], i have 146, src has [1,146]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 146 handle_osd_map epochs [147,147], i have 146, src has [1,147]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:07.451087+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c8000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 147 ms_handle_reset con 0x557982e36800 session 0x557985a10b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126738432 unmapped: 34816000 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 147 ms_handle_reset con 0x5579842c8000 session 0x557980b963c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:08.451207+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.589947701s of 10.006390572s, submitted: 117
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126771200 unmapped: 34783232 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 148 ms_handle_reset con 0x5579810c0000 session 0x5579827be3c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 148 heartbeat osd_stat(store_statfs(0x1b8b97000/0x0/0x1bfc00000, data 0x2a17246/0x2af6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c17c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 148 ms_handle_reset con 0x557982c17c00 session 0x5579827a6d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065d400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 148 ms_handle_reset con 0x55798065d400 session 0x557985a10d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:09.451351+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126803968 unmapped: 34750464 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 148 heartbeat osd_stat(store_statfs(0x1b8b93000/0x0/0x1bfc00000, data 0x2a1a1f3/0x2afb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:10.451512+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126803968 unmapped: 34750464 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 148 handle_osd_map epochs [149,149], i have 148, src has [1,149]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:11.451704+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126795776 unmapped: 34758656 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1317761 data_alloc: 285212672 data_used: 13766656
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:12.451859+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126795776 unmapped: 34758656 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 150 heartbeat osd_stat(store_statfs(0x1b8b8e000/0x0/0x1bfc00000, data 0x2a1cb11/0x2aff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 150 ms_handle_reset con 0x5579810c0000 session 0x557985a110e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:13.452058+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cd1000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126795776 unmapped: 34758656 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 151 ms_handle_reset con 0x557982cd1000 session 0x557985a112c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:14.452282+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126844928 unmapped: 34709504 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844da000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:15.452554+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126828544 unmapped: 34725888 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 151 handle_osd_map epochs [152,152], i have 151, src has [1,152]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 152 ms_handle_reset con 0x5579844da000 session 0x557985a11680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:16.452708+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126828544 unmapped: 34725888 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1322379 data_alloc: 285212672 data_used: 13766656
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 152 handle_osd_map epochs [152,153], i have 152, src has [1,153]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 153 heartbeat osd_stat(store_statfs(0x1b8b83000/0x0/0x1bfc00000, data 0x2a262a3/0x2b09000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 153 ms_handle_reset con 0x557982a10400 session 0x557985a11a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:17.452836+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126869504 unmapped: 34684928 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a13400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 154 ms_handle_reset con 0x557982a13400 session 0x557985a11c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 154 heartbeat osd_stat(store_statfs(0x1b8b81000/0x0/0x1bfc00000, data 0x2a28bf2/0x2b0c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:18.452996+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126910464 unmapped: 34643968 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.314000130s of 10.054040909s, submitted: 200
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:19.453138+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 155 ms_handle_reset con 0x5579810c0000 session 0x557985a11e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126926848 unmapped: 34627584 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:20.453330+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126926848 unmapped: 34627584 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:21.453481+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126926848 unmapped: 34627584 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1328777 data_alloc: 285212672 data_used: 13778944
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 155 handle_osd_map epochs [156,156], i have 155, src has [1,156]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:22.453641+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126926848 unmapped: 34627584 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:23.463413+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126926848 unmapped: 34627584 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 156 heartbeat osd_stat(store_statfs(0x1b8b7b000/0x0/0x1bfc00000, data 0x2a2dd51/0x2b12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 156 heartbeat osd_stat(store_statfs(0x1b8b7b000/0x0/0x1bfc00000, data 0x2a2dd51/0x2b12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:24.463559+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126926848 unmapped: 34627584 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:25.463699+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126926848 unmapped: 34627584 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:26.463806+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 156 heartbeat osd_stat(store_statfs(0x1b8b7b000/0x0/0x1bfc00000, data 0x2a2dd51/0x2b12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126926848 unmapped: 34627584 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1330915 data_alloc: 285212672 data_used: 13778944
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 156 handle_osd_map epochs [156,157], i have 156, src has [1,157]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:27.463981+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126935040 unmapped: 34619392 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:28.464134+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126935040 unmapped: 34619392 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:29.464340+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126935040 unmapped: 34619392 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:30.464471+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126935040 unmapped: 34619392 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b8b77000/0x0/0x1bfc00000, data 0x2a30555/0x2b16000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:31.464620+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126935040 unmapped: 34619392 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1333917 data_alloc: 285212672 data_used: 13778944
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:32.464712+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126935040 unmapped: 34619392 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:33.464831+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126935040 unmapped: 34619392 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:34.464986+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126935040 unmapped: 34619392 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777a800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 15.973342896s of 16.204427719s, submitted: 53
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 ms_handle_reset con 0x55798777a800 session 0x557983f414a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b8b76000/0x0/0x1bfc00000, data 0x2a305c7/0x2b18000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:35.465110+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126935040 unmapped: 34619392 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:36.465255+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126935040 unmapped: 34619392 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1338121 data_alloc: 285212672 data_used: 13778944
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982de2c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:37.465429+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126943232 unmapped: 34611200 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 ms_handle_reset con 0x557982de2c00 session 0x557983f40960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:38.465600+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126943232 unmapped: 34611200 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:39.465776+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135176192 unmapped: 26378240 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:40.465890+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b7375000/0x0/0x1bfc00000, data 0x42305d7/0x4319000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126763008 unmapped: 34791424 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:41.466022+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126763008 unmapped: 34791424 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1615544 data_alloc: 285212672 data_used: 13778944
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:42.466166+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126590976 unmapped: 34963456 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:43.466355+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126590976 unmapped: 34963456 heap: 161554432 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:44.466496+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126492672 unmapped: 43458560 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.785685539s of 10.034481049s, submitted: 21
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 ms_handle_reset con 0x557982e36400 session 0x5579852e01e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:45.466587+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777a400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126500864 unmapped: 43450368 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 ms_handle_reset con 0x55798777a400 session 0x5579823db0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:46.466728+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b4373000/0x0/0x1bfc00000, data 0x723069b/0x731b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126558208 unmapped: 43393024 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1894894 data_alloc: 285212672 data_used: 13778944
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b4373000/0x0/0x1bfc00000, data 0x723069b/0x731b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 ms_handle_reset con 0x5579810c0000 session 0x557983faab40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:47.466927+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135135232 unmapped: 34816000 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777b800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:48.467060+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 126885888 unmapped: 43065344 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 ms_handle_reset con 0x55798777b800 session 0x557983faad20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987778800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:49.467200+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 127287296 unmapped: 42663936 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 ms_handle_reset con 0x557987778800 session 0x557983fab2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:50.467337+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 127500288 unmapped: 42450944 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 heartbeat osd_stat(store_statfs(0x1b0065000/0x0/0x1bfc00000, data 0xb53f639/0xb629000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987779800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:51.467498+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 ms_handle_reset con 0x557987779800 session 0x557983fab680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 127574016 unmapped: 42377216 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2371714 data_alloc: 285212672 data_used: 13778944
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 heartbeat osd_stat(store_statfs(0x1af864000/0x0/0x1bfc00000, data 0xbd3f649/0xbe2a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:52.467664+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 157 handle_osd_map epochs [158,158], i have 157, src has [1,158]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 158 heartbeat osd_stat(store_statfs(0x1af864000/0x0/0x1bfc00000, data 0xbd3f649/0xbe2a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d28800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 127664128 unmapped: 42287104 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982388400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 158 ms_handle_reset con 0x557982d28800 session 0x557983fab860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 158 heartbeat osd_stat(store_statfs(0x1af03b000/0x0/0x1bfc00000, data 0xc565f67/0xc652000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:53.467817+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 128180224 unmapped: 41771008 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 158 ms_handle_reset con 0x557982388400 session 0x5579816e3c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:54.467992+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 128221184 unmapped: 41730048 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.362970352s of 10.160252571s, submitted: 118
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 159 ms_handle_reset con 0x5579810c0000 session 0x55798165fe00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:55.468151+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 128237568 unmapped: 41713664 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d28800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 159 ms_handle_reset con 0x557982d28800 session 0x5579823a1680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:56.468314+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 128262144 unmapped: 41689088 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2548747 data_alloc: 285212672 data_used: 13791232
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987778800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987779800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987778c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:57.468508+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 159 ms_handle_reset con 0x557987779800 session 0x55798236a5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 131612672 unmapped: 38338560 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 159 handle_osd_map epochs [160,160], i have 159, src has [1,160]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 159 handle_osd_map epochs [160,160], i have 160, src has [1,160]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:58.468615+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 160 heartbeat osd_stat(store_statfs(0x1ad41d000/0x0/0x1bfc00000, data 0xdd7ff57/0xde71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [0,0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 160 ms_handle_reset con 0x557987778c00 session 0x557983faaf00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 127975424 unmapped: 41975808 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 160 ms_handle_reset con 0x557987778800 session 0x5579816e2780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:59.468766+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 128040960 unmapped: 41910272 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 160 ms_handle_reset con 0x5579810c0000 session 0x557983f7ab40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982388400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:00.468900+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 161 ms_handle_reset con 0x557982388400 session 0x5579803f94a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 161 heartbeat osd_stat(store_statfs(0x1acc16000/0x0/0x1bfc00000, data 0xe582939/0xe677000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130260992 unmapped: 39690240 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d28800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 161 handle_osd_map epochs [162,162], i have 161, src has [1,162]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:01.469333+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987778c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 162 ms_handle_reset con 0x557982d28800 session 0x557982bc2000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130301952 unmapped: 39649280 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2797745 data_alloc: 285212672 data_used: 13815808
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 162 ms_handle_reset con 0x557987778c00 session 0x5579823f83c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:02.469457+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 162 heartbeat osd_stat(store_statfs(0x1abc11000/0x0/0x1bfc00000, data 0xf587ae7/0xf67b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130441216 unmapped: 39510016 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:03.469605+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 162 heartbeat osd_stat(store_statfs(0x1ab55f000/0x0/0x1bfc00000, data 0xfc3aad8/0xfd2d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 162 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 162 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 162 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 162 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130613248 unmapped: 39337984 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccfc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 163 ms_handle_reset con 0x557982ccfc00 session 0x55798048b2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:04.469722+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130670592 unmapped: 39280640 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.456454277s of 10.322231293s, submitted: 127
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:05.469882+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130711552 unmapped: 39239680 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:06.469975+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 163 handle_osd_map epochs [164,164], i have 163, src has [1,164]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 164 handle_osd_map epochs [165,165], i have 164, src has [1,165]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130818048 unmapped: 39133184 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3181035 data_alloc: 285212672 data_used: 13840384
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 165 handle_osd_map epochs [165,165], i have 165, src has [1,165]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:07.470170+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 165 ms_handle_reset con 0x557982389000 session 0x5579823f81e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 165 heartbeat osd_stat(store_statfs(0x1a8443000/0x0/0x1bfc00000, data 0x12d53fcf/0x12e49000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130875392 unmapped: 39075840 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844da000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 165 ms_handle_reset con 0x5579844da000 session 0x55798137a960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:08.470330+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 131006464 unmapped: 38944768 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:09.470485+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 131006464 unmapped: 38944768 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:10.470632+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 131055616 unmapped: 38895616 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:11.470824+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 131186688 unmapped: 38764544 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3479879 data_alloc: 285212672 data_used: 13840384
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:12.471006+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 166 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 139649024 unmapped: 30302208 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:13.471165+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 166 heartbeat osd_stat(store_statfs(0x1a4f4f000/0x0/0x1bfc00000, data 0x162477b9/0x1633e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 131465216 unmapped: 38486016 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:14.471362+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 166 heartbeat osd_stat(store_statfs(0x1a374f000/0x0/0x1bfc00000, data 0x17a477b9/0x17b3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 139870208 unmapped: 30081024 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:15.471503+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 145162240 unmapped: 24788992 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.880484581s of 10.593580246s, submitted: 141
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:16.471634+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e4800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 131604480 unmapped: 38346752 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3869127 data_alloc: 285212672 data_used: 13852672
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 166 ms_handle_reset con 0x5579823e4800 session 0x55798082e1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982de2400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 166 ms_handle_reset con 0x557982de2400 session 0x5579803f9c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:17.471775+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 166 ms_handle_reset con 0x557982389000 session 0x5579823f92c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e4800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 166 ms_handle_reset con 0x5579823e4800 session 0x557983d57860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccfc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 166 ms_handle_reset con 0x557982ccfc00 session 0x5579823a45a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 131088384 unmapped: 38862848 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 166 handle_osd_map epochs [167,167], i have 166, src has [1,167]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:18.471894+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 167 ms_handle_reset con 0x557982421c00 session 0x55798528de00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 167 heartbeat osd_stat(store_statfs(0x1a0d0b000/0x0/0x1bfc00000, data 0x1a48c7b9/0x1a583000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130277376 unmapped: 39673856 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844da000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 167 ms_handle_reset con 0x5579844da000 session 0x557981332000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:19.472017+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 167 heartbeat osd_stat(store_statfs(0x1a0d04000/0x0/0x1bfc00000, data 0x1a48f18a/0x1a589000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130277376 unmapped: 39673856 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e4800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 167 handle_osd_map epochs [168,168], i have 167, src has [1,168]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:20.474720+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130424832 unmapped: 39526400 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 168 heartbeat osd_stat(store_statfs(0x1a8500000/0x0/0x1bfc00000, data 0x11c91aec/0x11d8c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,2,3,4] op hist [0,0,0,2])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 168 ms_handle_reset con 0x557982389000 session 0x55798277e780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:21.474843+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 168 handle_osd_map epochs [169,169], i have 168, src has [1,169]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 169 ms_handle_reset con 0x5579823e4800 session 0x557983d56960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 169 ms_handle_reset con 0x557982421c00 session 0x5579816125a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 129040384 unmapped: 40910848 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1555117 data_alloc: 285212672 data_used: 13877248
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:22.474986+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cce000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987779c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 169 ms_handle_reset con 0x557987779c00 session 0x5579852e1c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065dc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 129196032 unmapped: 40755200 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:23.475151+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 169 handle_osd_map epochs [170,170], i have 169, src has [1,170]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 170 ms_handle_reset con 0x55798065dc00 session 0x557983faa5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130285568 unmapped: 39665664 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:24.475320+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130293760 unmapped: 39657472 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 170 ms_handle_reset con 0x557982389000 session 0x5579816e30e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e4800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:25.475443+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130293760 unmapped: 39657472 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.025976181s of 10.045113564s, submitted: 200
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 170 ms_handle_reset con 0x5579823e4800 session 0x5579827a6d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:26.475583+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 170 heartbeat osd_stat(store_statfs(0x1b6b33000/0x0/0x1bfc00000, data 0x34c0c9f/0x35bb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 170 handle_osd_map epochs [171,171], i have 170, src has [1,171]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 170 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 170 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 170 handle_osd_map epochs [171,171], i have 171, src has [1,171]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130392064 unmapped: 39559168 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1569043 data_alloc: 285212672 data_used: 14454784
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:27.475746+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130392064 unmapped: 39559168 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:28.475889+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130400256 unmapped: 39550976 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:29.476041+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130400256 unmapped: 39550976 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:30.476456+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130400256 unmapped: 39550976 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:31.476605+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130400256 unmapped: 39550976 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1568163 data_alloc: 285212672 data_used: 14454784
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 171 handle_osd_map epochs [172,172], i have 171, src has [1,172]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 172 ms_handle_reset con 0x557982420c00 session 0x55798277f860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 172 heartbeat osd_stat(store_statfs(0x1b6b2f000/0x0/0x1bfc00000, data 0x34c34bf/0x35bf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:32.476781+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130408448 unmapped: 39542784 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:33.476973+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130408448 unmapped: 39542784 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 172 handle_osd_map epochs [173,173], i have 172, src has [1,173]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:34.477087+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 130424832 unmapped: 39526400 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 174 heartbeat osd_stat(store_statfs(0x1b6b21000/0x0/0x1bfc00000, data 0x34caeff/0x35cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:35.477249+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 174 handle_osd_map epochs [175,175], i have 174, src has [1,175]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 174 handle_osd_map epochs [174,175], i have 175, src has [1,175]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 59
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e4c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.671513557s of 10.001025200s, submitted: 85
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 132366336 unmapped: 37584896 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 175 ms_handle_reset con 0x5579823e4c00 session 0x5579823a12c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:36.477385+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135577600 unmapped: 34373632 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1702851 data_alloc: 285212672 data_used: 14655488
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ce2000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 175 ms_handle_reset con 0x557982ce2000 session 0x5579852e05a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ce2000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:37.477510+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 175 handle_osd_map epochs [176,176], i have 175, src has [1,176]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 175 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 176 ms_handle_reset con 0x557982ce2000 session 0x5579823a8b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134799360 unmapped: 35151872 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a11800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 176 ms_handle_reset con 0x557982a11800 session 0x5579811f9e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:38.477639+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134823936 unmapped: 35127296 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 176 heartbeat osd_stat(store_statfs(0x1b5b4a000/0x0/0x1bfc00000, data 0x4496873/0x459b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 176 handle_osd_map epochs [177,177], i have 176, src has [1,177]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 176 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 176 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 176 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 176 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a11400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 177 ms_handle_reset con 0x557982a11400 session 0x5579816e3c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:39.477765+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 60
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134209536 unmapped: 35741696 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 177 handle_osd_map epochs [178,178], i have 177, src has [1,178]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 178 heartbeat osd_stat(store_statfs(0x1b5b4e000/0x0/0x1bfc00000, data 0x44991e5/0x459f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:40.477982+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c17400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134217728 unmapped: 35733504 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 178 ms_handle_reset con 0x557982c17400 session 0x5579823db0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:41.478153+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ce3400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 178 ms_handle_reset con 0x557982ce3400 session 0x557983f40960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134217728 unmapped: 35733504 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1719825 data_alloc: 285212672 data_used: 14667776
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:42.478320+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134217728 unmapped: 35733504 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 178 handle_osd_map epochs [179,179], i have 178, src has [1,179]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 179 ms_handle_reset con 0x55798777a000 session 0x557980b963c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 179 heartbeat osd_stat(store_statfs(0x1b5b49000/0x0/0x1bfc00000, data 0x449bb1e/0x45a3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:43.478465+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134365184 unmapped: 35586048 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:44.478604+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777b800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134438912 unmapped: 35512320 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 179 handle_osd_map epochs [180,180], i have 179, src has [1,180]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 180 ms_handle_reset con 0x55798777b800 session 0x557985634000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:45.478703+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 180 heartbeat osd_stat(store_statfs(0x1b5b21000/0x0/0x1bfc00000, data 0x44c3455/0x45cc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 180 ms_handle_reset con 0x557982420000 session 0x5579856343c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134438912 unmapped: 35512320 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a13800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.465527534s of 10.156942368s, submitted: 186
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:46.478946+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 180 handle_osd_map epochs [181,181], i have 180, src has [1,181]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 181 ms_handle_reset con 0x557982a13800 session 0x5579856345a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1731658 data_alloc: 285212672 data_used: 14671872
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134438912 unmapped: 35512320 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:47.479191+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134438912 unmapped: 35512320 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccf000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 181 handle_osd_map epochs [182,182], i have 181, src has [1,182]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:48.479333+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 182 ms_handle_reset con 0x557982ccf000 session 0x557985634960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 182 heartbeat osd_stat(store_statfs(0x1b5b17000/0x0/0x1bfc00000, data 0x44c8739/0x45d4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134438912 unmapped: 35512320 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:49.479468+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccf000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 182 ms_handle_reset con 0x557982ccf000 session 0x557985634d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134438912 unmapped: 35512320 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 182 handle_osd_map epochs [183,183], i have 182, src has [1,183]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:50.479637+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 183 ms_handle_reset con 0x557982420000 session 0x557985634f00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134504448 unmapped: 35446784 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 183 heartbeat osd_stat(store_statfs(0x1b5b05000/0x0/0x1bfc00000, data 0x44d89f5/0x45e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:51.479774+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a13800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1740430 data_alloc: 285212672 data_used: 14704640
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134389760 unmapped: 35561472 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:52.479919+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 184 handle_osd_map epochs [185,185], i have 184, src has [1,185]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 185 ms_handle_reset con 0x557982a13800 session 0x557983df90e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134414336 unmapped: 35536896 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:53.480371+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134414336 unmapped: 35536896 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 185 heartbeat osd_stat(store_statfs(0x1b5afd000/0x0/0x1bfc00000, data 0x44ddb4f/0x45ef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c16000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 185 ms_handle_reset con 0x557982c16000 session 0x5579856354a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982dd3c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:54.481170+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 185 handle_osd_map epochs [186,186], i have 185, src has [1,186]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 186 ms_handle_reset con 0x557982dd3c00 session 0x557985635680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134422528 unmapped: 35528704 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 186 heartbeat osd_stat(store_statfs(0x1b5afa000/0x0/0x1bfc00000, data 0x44e0b45/0x45f3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:55.481325+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 186 handle_osd_map epochs [186,187], i have 186, src has [1,187]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 187 handle_osd_map epochs [187,187], i have 187, src has [1,187]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 187 ms_handle_reset con 0x557982420000 session 0x55798137ba40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134479872 unmapped: 35471360 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:56.481478+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a13800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 187 ms_handle_reset con 0x557982a13800 session 0x557985635c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c16000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.264670372s of 10.552164078s, submitted: 121
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 188 ms_handle_reset con 0x557982c16000 session 0x557985635e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 188 heartbeat osd_stat(store_statfs(0x1b5aec000/0x0/0x1bfc00000, data 0x44e8e49/0x45fe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1756160 data_alloc: 285212672 data_used: 14716928
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134512640 unmapped: 35438592 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:57.481646+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 188 heartbeat osd_stat(store_statfs(0x1b5aec000/0x0/0x1bfc00000, data 0x44e8e49/0x45fe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134512640 unmapped: 35438592 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579856fe800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 188 ms_handle_reset con 0x5579856fe800 session 0x5579856370e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579856fe000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 189 ms_handle_reset con 0x5579856fe000 session 0x5579856372c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:58.481789+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134561792 unmapped: 35389440 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:59.482034+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 189 heartbeat osd_stat(store_statfs(0x1b5aeb000/0x0/0x1bfc00000, data 0x44eb7db/0x4602000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 189 ms_handle_reset con 0x557982420000 session 0x5579827be3c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134610944 unmapped: 35340288 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:00.482262+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134610944 unmapped: 35340288 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:01.482776+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 189 heartbeat osd_stat(store_statfs(0x1b5ae8000/0x0/0x1bfc00000, data 0x44ef852/0x4605000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 189 handle_osd_map epochs [190,190], i have 189, src has [1,190]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1758734 data_alloc: 285212672 data_used: 14725120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134627328 unmapped: 35323904 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:02.482904+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 134627328 unmapped: 35323904 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:03.483052+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 190 handle_osd_map epochs [190,190], i have 190, src has [1,190]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987779c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135692288 unmapped: 34258944 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:04.483339+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135692288 unmapped: 34258944 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e4800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d28800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 190 ms_handle_reset con 0x557982d28800 session 0x557983fab4a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:05.483509+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557985125800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 191 ms_handle_reset con 0x557985125800 session 0x55798048a3c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798497a400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135749632 unmapped: 34201600 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:06.483712+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 191 handle_osd_map epochs [191,192], i have 191, src has [1,192]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.788307190s of 10.110929489s, submitted: 107
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 192 ms_handle_reset con 0x55798497a400 session 0x557983cc4000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 192 ms_handle_reset con 0x5579823e4800 session 0x55798165fe00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1769244 data_alloc: 285212672 data_used: 14741504
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135774208 unmapped: 34177024 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 192 handle_osd_map epochs [191,192], i have 192, src has [1,192]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 192 heartbeat osd_stat(store_statfs(0x1b5acf000/0x0/0x1bfc00000, data 0x4502907/0x461e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:07.483920+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135790592 unmapped: 34160640 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579836edc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:08.484059+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccdc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 193 ms_handle_reset con 0x557982ccdc00 session 0x5579855b05a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135823360 unmapped: 34127872 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982bb3800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c9c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 193 ms_handle_reset con 0x5579842c9c00 session 0x55798518c1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:09.484177+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 193 heartbeat osd_stat(store_statfs(0x1b5acc000/0x0/0x1bfc00000, data 0x4504d47/0x4621000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 193 handle_osd_map epochs [193,194], i have 193, src has [1,194]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 194 ms_handle_reset con 0x557982bb3800 session 0x55798236a5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 194 handle_osd_map epochs [194,194], i have 194, src has [1,194]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135856128 unmapped: 34095104 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 194 ms_handle_reset con 0x5579836edc00 session 0x5579856341e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982388400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:10.484347+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 194 ms_handle_reset con 0x557982388400 session 0x557982bc2b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 194 heartbeat osd_stat(store_statfs(0x1b5ac5000/0x0/0x1bfc00000, data 0x4509fea/0x4629000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982bb3800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135856128 unmapped: 34095104 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:11.484497+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 194 heartbeat osd_stat(store_statfs(0x1b5abe000/0x0/0x1bfc00000, data 0x450cdf6/0x462d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 194 handle_osd_map epochs [195,195], i have 194, src has [1,195]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 195 ms_handle_reset con 0x557982bb3800 session 0x557982bc3680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1785883 data_alloc: 285212672 data_used: 14753792
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135880704 unmapped: 34070528 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:12.484966+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135880704 unmapped: 34070528 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:13.485096+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135921664 unmapped: 34029568 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:14.485330+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 195 handle_osd_map epochs [196,196], i have 195, src has [1,196]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135921664 unmapped: 34029568 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e5c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 196 ms_handle_reset con 0x5579823e5c00 session 0x5579853cf860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d8c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 196 heartbeat osd_stat(store_statfs(0x1b5ab4000/0x0/0x1bfc00000, data 0x451808c/0x463a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:15.485473+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 196 handle_osd_map epochs [197,197], i have 196, src has [1,197]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135979008 unmapped: 33972224 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 197 ms_handle_reset con 0x5579844d8c00 session 0x55798048b680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 197 ms_handle_reset con 0x55798777a000 session 0x557983f7a780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:16.485619+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.655076981s of 10.075615883s, submitted: 144
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1792606 data_alloc: 285212672 data_used: 14766080
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136003584 unmapped: 33947648 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 198 handle_osd_map epochs [197,198], i have 198, src has [1,198]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccf800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:17.485774+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 198 handle_osd_map epochs [199,199], i have 198, src has [1,199]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136036352 unmapped: 33914880 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 199 ms_handle_reset con 0x557982ccf800 session 0x5579853ceb40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 199 handle_osd_map epochs [199,199], i have 199, src has [1,199]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:18.485906+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e5c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 199 handle_osd_map epochs [200,200], i have 199, src has [1,200]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136101888 unmapped: 33849344 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 200 handle_osd_map epochs [200,200], i have 200, src has [1,200]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 200 ms_handle_reset con 0x5579823e5c00 session 0x5579855b01e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:19.486042+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 200 heartbeat osd_stat(store_statfs(0x1b5a9c000/0x0/0x1bfc00000, data 0x452a1d1/0x4652000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136142848 unmapped: 33808384 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:20.486186+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 200 heartbeat osd_stat(store_statfs(0x1b5a97000/0x0/0x1bfc00000, data 0x452f305/0x4656000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 200 ms_handle_reset con 0x557987779c00 session 0x557985637a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136142848 unmapped: 33808384 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:21.486338+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 200 handle_osd_map epochs [201,201], i have 200, src has [1,201]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 200 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 201 ms_handle_reset con 0x557982cce000 session 0x557981612960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1800037 data_alloc: 285212672 data_used: 14774272
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136175616 unmapped: 33775616 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982bb3800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:22.486462+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 201 handle_osd_map epochs [201,201], i have 201, src has [1,201]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 201 ms_handle_reset con 0x557982bb3800 session 0x557985636d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 201 heartbeat osd_stat(store_statfs(0x1b74ec000/0x0/0x1bfc00000, data 0x2ad9cda/0x2c01000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136011776 unmapped: 33939456 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:23.486588+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136011776 unmapped: 33939456 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:24.486757+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136011776 unmapped: 33939456 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:25.486891+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136011776 unmapped: 33939456 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:26.487057+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 201 handle_osd_map epochs [202,202], i have 201, src has [1,202]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.582021713s of 10.025650978s, submitted: 157
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 202 heartbeat osd_stat(store_statfs(0x1b74eb000/0x0/0x1bfc00000, data 0x2adc5f2/0x2c03000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1599500 data_alloc: 285212672 data_used: 14032896
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136011776 unmapped: 33939456 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:27.487228+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136011776 unmapped: 33939456 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:28.487390+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136011776 unmapped: 33939456 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:29.487523+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cd1400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 202 handle_osd_map epochs [203,203], i have 202, src has [1,203]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136011776 unmapped: 33939456 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:30.487723+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136052736 unmapped: 33898496 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:31.487898+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 204 handle_osd_map epochs [204,205], i have 204, src has [1,205]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 205 handle_osd_map epochs [205,205], i have 205, src has [1,205]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 205 ms_handle_reset con 0x557982cd1400 session 0x557980b97e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e5c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1612199 data_alloc: 285212672 data_used: 14045184
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136069120 unmapped: 33882112 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 205 heartbeat osd_stat(store_statfs(0x1b74dd000/0x0/0x1bfc00000, data 0x2ae57f0/0x2c10000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 205 ms_handle_reset con 0x5579823e5c00 session 0x55798528c960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:32.488135+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135430144 unmapped: 34521088 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:33.488342+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135430144 unmapped: 34521088 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:34.488533+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 205 heartbeat osd_stat(store_statfs(0x1b74d3000/0x0/0x1bfc00000, data 0x2aed19e/0x2c19000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135438336 unmapped: 34512896 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:35.488718+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135372800 unmapped: 34578432 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:36.488897+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.594771385s of 10.000521660s, submitted: 114
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1612114 data_alloc: 285212672 data_used: 14057472
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135372800 unmapped: 34578432 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:37.489090+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 206 heartbeat osd_stat(store_statfs(0x1b74cd000/0x0/0x1bfc00000, data 0x2af294a/0x2c20000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135372800 unmapped: 34578432 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:38.489225+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135372800 unmapped: 34578432 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:39.489438+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135372800 unmapped: 34578432 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:40.489637+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135372800 unmapped: 34578432 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:41.489792+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1612674 data_alloc: 285212672 data_used: 14057472
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135372800 unmapped: 34578432 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:42.489930+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135372800 unmapped: 34578432 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:43.490087+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 206 ms_handle_reset con 0x557982389800 session 0x5579853cf2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 206 heartbeat osd_stat(store_statfs(0x1b74bb000/0x0/0x1bfc00000, data 0x2b04e95/0x2c33000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135372800 unmapped: 34578432 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:44.490245+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 135372800 unmapped: 34578432 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987778800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 206 ms_handle_reset con 0x557987778800 session 0x557985a11c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:45.490437+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 206 handle_osd_map epochs [206,207], i have 206, src has [1,207]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777b000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 207 ms_handle_reset con 0x55798777b000 session 0x55798518c5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136445952 unmapped: 33505280 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:46.490655+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c8c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 207 ms_handle_reset con 0x5579842c8c00 session 0x55798518dc20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 207 heartbeat osd_stat(store_statfs(0x1b74a2000/0x0/0x1bfc00000, data 0x2b17f03/0x2c4b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.920694351s of 10.126612663s, submitted: 54
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 208 ms_handle_reset con 0x557982420800 session 0x557985a11e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1640752 data_alloc: 285212672 data_used: 14069760
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136454144 unmapped: 33497088 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:47.490867+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d8c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cce000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 208 ms_handle_reset con 0x557982cce000 session 0x557981333e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065cc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 136511488 unmapped: 33439744 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 208 ms_handle_reset con 0x55798065cc00 session 0x557981612960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:48.491073+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 209 handle_osd_map epochs [209,209], i have 209, src has [1,209]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 209 heartbeat osd_stat(store_statfs(0x1b749c000/0x0/0x1bfc00000, data 0x2b1d234/0x2c51000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 137625600 unmapped: 32325632 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579856fe800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 209 ms_handle_reset con 0x5579856fe800 session 0x557985635a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 209 ms_handle_reset con 0x5579844d8c00 session 0x55798528cf00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:49.491271+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 209 heartbeat osd_stat(store_statfs(0x1b749c000/0x0/0x1bfc00000, data 0x2b1d234/0x2c51000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 209 handle_osd_map epochs [210,210], i have 209, src has [1,210]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 137682944 unmapped: 32268288 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:50.491530+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065cc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 210 ms_handle_reset con 0x55798065cc00 session 0x557983fabc20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 210 handle_osd_map epochs [211,211], i have 210, src has [1,211]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 137699328 unmapped: 32251904 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 211 heartbeat osd_stat(store_statfs(0x1b7488000/0x0/0x1bfc00000, data 0x2b2fe5c/0x2c64000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:51.491671+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 211 ms_handle_reset con 0x557982420800 session 0x55798277e1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cce000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 212 heartbeat osd_stat(store_statfs(0x1b7488000/0x0/0x1bfc00000, data 0x2b2fe5c/0x2c64000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 212 ms_handle_reset con 0x557982cce000 session 0x5579816e30e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1645268 data_alloc: 285212672 data_used: 14098432
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 137764864 unmapped: 32186368 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:52.491863+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557980b5a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 212 heartbeat osd_stat(store_statfs(0x1b707c000/0x0/0x1bfc00000, data 0x2b3c372/0x2c71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 212 ms_handle_reset con 0x557980b5a000 session 0x557982bc2000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777b800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 137814016 unmapped: 32137216 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 212 ms_handle_reset con 0x55798777b800 session 0x5579803f9680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:53.492022+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065cc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 137838592 unmapped: 32112640 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 212 ms_handle_reset con 0x55798065cc00 session 0x5579853cf680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:54.492174+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 137846784 unmapped: 32104448 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557980b5a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 212 ms_handle_reset con 0x557980b5a000 session 0x557985312780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:55.492327+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 212 ms_handle_reset con 0x557982420800 session 0x55798048a5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 137846784 unmapped: 32104448 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:56.492511+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 212 heartbeat osd_stat(store_statfs(0x1b7069000/0x0/0x1bfc00000, data 0x2b4c9c9/0x2c83000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cce000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 212 ms_handle_reset con 0x557982cce000 session 0x5579855b0f00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 212 handle_osd_map epochs [212,213], i have 212, src has [1,213]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557983578800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.1a] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.4] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.19] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.13] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[5.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 213 ms_handle_reset con 0x557983578800 session 0x557982bc2960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1658205 data_alloc: 285212672 data_used: 14110720
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065cc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 137863168 unmapped: 32088064 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.379488945s of 10.335848808s, submitted: 336
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 213 ms_handle_reset con 0x55798065cc00 session 0x557985a10960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:57.492715+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 137895936 unmapped: 32055296 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:58.492917+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777ac00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579836edc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 214 heartbeat osd_stat(store_statfs(0x1b705b000/0x0/0x1bfc00000, data 0x2b560de/0x2c93000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 214 ms_handle_reset con 0x55798777ac00 session 0x5579856365a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 214 handle_osd_map epochs [214,215], i have 214, src has [1,215]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 215 ms_handle_reset con 0x5579836edc00 session 0x557985637860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 137969664 unmapped: 31981568 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987779800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:59.493065+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e37000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557981272000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 215 ms_handle_reset con 0x557982e36400 session 0x5579856343c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 215 ms_handle_reset con 0x557981272000 session 0x557985a10d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 215 ms_handle_reset con 0x557982e37000 session 0x5579853cf2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065cc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 216 ms_handle_reset con 0x55798065cc00 session 0x5579855b05a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140148736 unmapped: 29802496 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:00.493234+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 216 ms_handle_reset con 0x557987779800 session 0x557983faba40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557981272000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 216 handle_osd_map epochs [217,217], i have 216, src has [1,217]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140189696 unmapped: 29761536 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 217 ms_handle_reset con 0x557981272000 session 0x557982a0de00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579836edc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:01.493430+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 217 ms_handle_reset con 0x5579836edc00 session 0x55798518d4a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065cc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 217 ms_handle_reset con 0x557982e36400 session 0x55798048a3c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557981272000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1677569 data_alloc: 285212672 data_used: 14135296
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 217 heartbeat osd_stat(store_statfs(0x1b5e95000/0x0/0x1bfc00000, data 0x2b74ffc/0x2cb7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x70af9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 217 handle_osd_map epochs [218,218], i have 217, src has [1,218]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 217 handle_osd_map epochs [218,218], i have 218, src has [1,218]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140197888 unmapped: 29753344 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:02.493728+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 218 ms_handle_reset con 0x55798065cc00 session 0x55798528cb40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 218 ms_handle_reset con 0x557981272000 session 0x55798236b2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140247040 unmapped: 29704192 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:03.493928+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777bc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579856fe800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 218 ms_handle_reset con 0x5579856fe800 session 0x5579823f8960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982bb3800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e5c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 218 ms_handle_reset con 0x557982bb3800 session 0x557981332b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 218 ms_handle_reset con 0x55798777bc00 session 0x557983fab4a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140288000 unmapped: 29663232 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 219 ms_handle_reset con 0x5579823e5c00 session 0x557985a11c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:04.494070+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065cc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 219 ms_handle_reset con 0x55798065cc00 session 0x55798518d860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557981272000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140312576 unmapped: 29638656 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:05.494190+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 219 ms_handle_reset con 0x557981272000 session 0x5579852e0f00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 219 ms_handle_reset con 0x557982e36400 session 0x5579852e0d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140419072 unmapped: 29532160 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:06.494375+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 219 handle_osd_map epochs [219,220], i have 219, src has [1,220]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1687775 data_alloc: 285212672 data_used: 14147584
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140419072 unmapped: 29532160 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:07.494584+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987778800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.426266670s of 10.685055733s, submitted: 298
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 220 ms_handle_reset con 0x557987778800 session 0x557981612000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 220 heartbeat osd_stat(store_statfs(0x1b6e41000/0x0/0x1bfc00000, data 0x2ba9bf1/0x2cec000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140443648 unmapped: 29507584 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:08.494750+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140443648 unmapped: 29507584 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 220 ms_handle_reset con 0x55798163f800 session 0x557983df9680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:09.494909+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140451840 unmapped: 29499392 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:10.495077+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 220 heartbeat osd_stat(store_statfs(0x1b6e30000/0x0/0x1bfc00000, data 0x2bba5c1/0x2cfd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140451840 unmapped: 29499392 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:11.495216+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 221 ms_handle_reset con 0x557982389800 session 0x557983d565a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1703099 data_alloc: 285212672 data_used: 14159872
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140468224 unmapped: 29483008 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:12.495418+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987778000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 221 ms_handle_reset con 0x557987778000 session 0x557983f412c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140468224 unmapped: 29483008 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccfc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:13.495606+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 221 ms_handle_reset con 0x557982ccfc00 session 0x5579853ce1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140541952 unmapped: 29409280 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982dd2000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 221 ms_handle_reset con 0x557982dd2000 session 0x557983cc5680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:14.495754+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 140541952 unmapped: 29409280 heap: 169951232 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:15.495924+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 221 ms_handle_reset con 0x55798163f800 session 0x5579852b45a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccfc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 221 ms_handle_reset con 0x557982ccfc00 session 0x5579853ced20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987778000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 221 handle_osd_map epochs [222,222], i have 221, src has [1,222]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 222 ms_handle_reset con 0x557987778000 session 0x5579852e14a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 142827520 unmapped: 31326208 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 222 heartbeat osd_stat(store_statfs(0x1b6bfe000/0x0/0x1bfc00000, data 0x2deae41/0x2f30000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [0,0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 222 ms_handle_reset con 0x557982389800 session 0x5579816e30e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:16.496058+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccfc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1828421 data_alloc: 285212672 data_used: 14176256
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 222 handle_osd_map epochs [222,223], i have 222, src has [1,223]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 142876672 unmapped: 31277056 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:17.496222+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 223 heartbeat osd_stat(store_statfs(0x1b5f2b000/0x0/0x1bfc00000, data 0x3abb798/0x3c02000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 223 ms_handle_reset con 0x55798163f800 session 0x5579816134a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 223 ms_handle_reset con 0x557982ccfc00 session 0x557980b96000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 223 ms_handle_reset con 0x5579844d9400 session 0x557982bc32c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 223 ms_handle_reset con 0x557982e36800 session 0x55798240d0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c16000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.338910103s of 10.141191483s, submitted: 189
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 142934016 unmapped: 31219712 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:18.496373+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 223 ms_handle_reset con 0x557982c16000 session 0x557985637e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 223 ms_handle_reset con 0x55798163f800 session 0x557983df85a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccfc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 142983168 unmapped: 31170560 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 223 ms_handle_reset con 0x557982e36800 session 0x557983df83c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 224 ms_handle_reset con 0x5579844d9400 session 0x557983f7b0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987779400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 224 ms_handle_reset con 0x557982ccfc00 session 0x557981612b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:19.496513+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844dac00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 224 ms_handle_reset con 0x5579844dac00 session 0x5579823f8d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 224 handle_osd_map epochs [224,224], i have 224, src has [1,224]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 224 ms_handle_reset con 0x557987779400 session 0x557983f40780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 224 heartbeat osd_stat(store_statfs(0x1b5f29000/0x0/0x1bfc00000, data 0x3abe055/0x3c05000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 143015936 unmapped: 31137792 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccfc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 224 handle_osd_map epochs [224,225], i have 224, src has [1,225]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 225 ms_handle_reset con 0x557982ccfc00 session 0x557983f40960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:20.496658+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 225 ms_handle_reset con 0x55798163f800 session 0x55798137a5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 225 ms_handle_reset con 0x557982e36800 session 0x5579829d6b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 225 handle_osd_map epochs [225,225], i have 225, src has [1,225]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 143056896 unmapped: 31096832 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:21.496810+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 225 ms_handle_reset con 0x5579844d9400 session 0x557985a11e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccfc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 225 ms_handle_reset con 0x557982ccfc00 session 0x557983f41a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 225 ms_handle_reset con 0x55798163f800 session 0x557983f7b2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 225 ms_handle_reset con 0x557982e36800 session 0x557983d56000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1841966 data_alloc: 285212672 data_used: 14192640
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 143138816 unmapped: 31014912 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:22.496951+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 225 ms_handle_reset con 0x5579844d9400 session 0x557983d57c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798551a400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 225 heartbeat osd_stat(store_statfs(0x1b5f17000/0x0/0x1bfc00000, data 0x3acaad2/0x3c17000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 143147008 unmapped: 31006720 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:23.497161+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 225 handle_osd_map epochs [225,226], i have 225, src has [1,226]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 ms_handle_reset con 0x557982389000 session 0x55798165f2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 143196160 unmapped: 30957568 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 ms_handle_reset con 0x55798551a400 session 0x5579823a7680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:24.497342+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 ms_handle_reset con 0x55798163f400 session 0x5579823a72c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 ms_handle_reset con 0x557982389000 session 0x5579823a6b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 heartbeat osd_stat(store_statfs(0x1b5f11000/0x0/0x1bfc00000, data 0x3acd46e/0x3c1c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 143253504 unmapped: 30900224 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccfc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:25.497494+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 ms_handle_reset con 0x557982ccfc00 session 0x55798048bc20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 ms_handle_reset con 0x557982e36800 session 0x5579823a6f00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 ms_handle_reset con 0x55798163f800 session 0x5579823a6780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 ms_handle_reset con 0x5579844d9400 session 0x55798236ad20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 143286272 unmapped: 30867456 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:26.497659+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 226 handle_osd_map epochs [227,227], i have 226, src has [1,227]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 227 ms_handle_reset con 0x55798163f400 session 0x5579827bfe00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1859129 data_alloc: 285212672 data_used: 14213120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 143286272 unmapped: 30867456 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:27.497879+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777b400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 227 ms_handle_reset con 0x55798777b400 session 0x55798137a3c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccf000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 227 handle_osd_map epochs [227,228], i have 227, src has [1,228]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 228 heartbeat osd_stat(store_statfs(0x1b5efe000/0x0/0x1bfc00000, data 0x3add25e/0x3c2f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 228 ms_handle_reset con 0x557982ccf000 session 0x5579823a6d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.559810638s of 10.150060654s, submitted: 246
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 144416768 unmapped: 29736960 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:28.498066+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 228 ms_handle_reset con 0x55798163f400 session 0x55798240de00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 144441344 unmapped: 29712384 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:29.498191+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 228 ms_handle_reset con 0x55798163f800 session 0x5579823a8780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccf000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 228 handle_osd_map epochs [229,229], i have 228, src has [1,229]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 228 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 229 ms_handle_reset con 0x557982ccf000 session 0x5579827be960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 145096704 unmapped: 29057024 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:30.498390+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 229 handle_osd_map epochs [230,230], i have 229, src has [1,230]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 229 handle_osd_map epochs [230,230], i have 230, src has [1,230]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 230 ms_handle_reset con 0x5579844d9400 session 0x55798240cd20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 230 heartbeat osd_stat(store_statfs(0x1b59a1000/0x0/0x1bfc00000, data 0x4037953/0x418c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777b400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 145154048 unmapped: 28999680 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:31.498544+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 230 ms_handle_reset con 0x55798777b400 session 0x55798048b4a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1874204 data_alloc: 285212672 data_used: 14229504
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 145178624 unmapped: 28975104 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:32.498752+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 230 ms_handle_reset con 0x5579810c0000 session 0x5579803f90e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 145178624 unmapped: 28975104 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:33.498894+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 144457728 unmapped: 29696000 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c17400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:34.499085+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 230 ms_handle_reset con 0x557982c17400 session 0x55798165ed20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 144457728 unmapped: 29696000 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:35.499251+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557985124400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 230 ms_handle_reset con 0x557985124400 session 0x557980b96780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798551a400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 230 ms_handle_reset con 0x55798551a400 session 0x55798048b0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 144457728 unmapped: 29696000 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:36.499417+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 230 handle_osd_map epochs [230,231], i have 230, src has [1,231]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 heartbeat osd_stat(store_statfs(0x1b5eb4000/0x0/0x1bfc00000, data 0x3b1f664/0x3c79000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1888528 data_alloc: 285212672 data_used: 14241792
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 144474112 unmapped: 29679616 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:37.499584+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cd0800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x557982cd0800 session 0x5579823a5e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x5579810c0000 session 0x5579813334a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c17400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.219019890s of 10.098595619s, submitted: 243
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 145670144 unmapped: 28483584 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:38.499722+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cd0800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x557982cd0800 session 0x5579823f94a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557985124400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x557982c17400 session 0x55798136c960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798551a400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x55798551a400 session 0x5579823a43c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x557982a10800 session 0x557980b941e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x557985124400 session 0x5579852e0780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147537920 unmapped: 26615808 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:39.499864+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x5579810c0000 session 0x557983f7b4a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147521536 unmapped: 26632192 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:40.499991+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x557982a10800 session 0x557980b952c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c17400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982cd0800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147521536 unmapped: 26632192 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:41.500139+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x557982c17400 session 0x557983f7a780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x557982cd0800 session 0x557983df9a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 heartbeat osd_stat(store_statfs(0x1b51f0000/0x0/0x1bfc00000, data 0x47e59d4/0x493e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x5579810c0000 session 0x557983df8780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1902834 data_alloc: 285212672 data_used: 14245888
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147546112 unmapped: 26607616 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:42.500281+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557983578400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x557982a10800 session 0x55798277f0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x557983578400 session 0x55798277e5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147546112 unmapped: 26607616 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:43.500462+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 heartbeat osd_stat(store_statfs(0x1b5e8b000/0x0/0x1bfc00000, data 0x3b4ebdb/0x3ca3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147554304 unmapped: 26599424 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:44.500575+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579852fec00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x5579852fec00 session 0x5579816e21e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982de2c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 ms_handle_reset con 0x557982de2c00 session 0x557985313e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147562496 unmapped: 26591232 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:45.500713+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 232 ms_handle_reset con 0x5579810c0000 session 0x5579816e2780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:46.500851+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147587072 unmapped: 26566656 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982de2c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 232 ms_handle_reset con 0x557982a10800 session 0x557983faa1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557983578400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 232 ms_handle_reset con 0x557982de2c00 session 0x5579852e03c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 232 ms_handle_reset con 0x557983578400 session 0x5579852e0780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579852fec00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1914693 data_alloc: 285212672 data_used: 14254080
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 232 ms_handle_reset con 0x5579852fec00 session 0x557980b941e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:47.501002+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147619840 unmapped: 26533888 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:48.501186+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147619840 unmapped: 26533888 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.309727669s of 10.283976555s, submitted: 214
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 232 handle_osd_map epochs [232,233], i have 232, src has [1,233]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 233 ms_handle_reset con 0x5579810c0000 session 0x5579823a43c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982de2c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 233 ms_handle_reset con 0x557982de2c00 session 0x557983df8d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 233 ms_handle_reset con 0x557982a10800 session 0x55798528d860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:49.501333+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147636224 unmapped: 26517504 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557983578400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 233 ms_handle_reset con 0x557983578400 session 0x5579823f94a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 233 heartbeat osd_stat(store_statfs(0x1b5a33000/0x0/0x1bfc00000, data 0x3b9a150/0x3cfa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 233 handle_osd_map epochs [234,234], i have 233, src has [1,234]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 233 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 233 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 233 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:50.501510+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147677184 unmapped: 26476544 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557985125c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 ms_handle_reset con 0x557985125c00 session 0x5579823a5e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 ms_handle_reset con 0x5579810c0000 session 0x55798165ed20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 heartbeat osd_stat(store_statfs(0x1b5a30000/0x0/0x1bfc00000, data 0x3b9ca17/0x3cfc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:51.501685+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 146538496 unmapped: 27615232 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 ms_handle_reset con 0x557982a10800 session 0x5579823a6f00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982de2c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557983578400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 ms_handle_reset con 0x557982de2c00 session 0x5579823a6b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 ms_handle_reset con 0x557983578400 session 0x5579823a10e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1933573 data_alloc: 285212672 data_used: 14258176
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a11400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:52.501823+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 146554880 unmapped: 27598848 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 ms_handle_reset con 0x5579844d9400 session 0x5579827a72c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 ms_handle_reset con 0x557982a11400 session 0x5579823a72c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:53.501988+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 146571264 unmapped: 27582464 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 ms_handle_reset con 0x5579844d9400 session 0x55798165f2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:54.502120+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147701760 unmapped: 26451968 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579810c0000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 ms_handle_reset con 0x5579810c0000 session 0x55798136c5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 ms_handle_reset con 0x557982a10800 session 0x55798277f2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:55.502414+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147718144 unmapped: 26435584 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982de2c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 ms_handle_reset con 0x557982de2c00 session 0x5579803f8780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:56.502591+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147718144 unmapped: 26435584 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 heartbeat osd_stat(store_statfs(0x1b59f6000/0x0/0x1bfc00000, data 0x3bdac2a/0x3d38000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1935316 data_alloc: 285212672 data_used: 14254080
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:57.502789+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147718144 unmapped: 26435584 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 234 handle_osd_map epochs [235,235], i have 234, src has [1,235]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d28800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 236 ms_handle_reset con 0x557982d28800 session 0x55798137b2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 236 handle_osd_map epochs [236,236], i have 236, src has [1,236]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:58.502950+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147734528 unmapped: 26419200 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982bb3400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.464834213s of 10.229992867s, submitted: 216
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 236 ms_handle_reset con 0x557982bb3400 session 0x5579829d7c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798497b000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 236 ms_handle_reset con 0x55798497b000 session 0x55798137a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:59.503082+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 236 heartbeat osd_stat(store_statfs(0x1b59d9000/0x0/0x1bfc00000, data 0x3bf17b7/0x3d53000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147800064 unmapped: 26353664 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798493bc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 236 ms_handle_reset con 0x55798493bc00 session 0x557980b94b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c9c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 236 ms_handle_reset con 0x5579842c9c00 session 0x557980b943c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:00.503272+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147800064 unmapped: 26353664 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982bb3400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 236 ms_handle_reset con 0x557982bb3400 session 0x557983df8960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:01.503503+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147800064 unmapped: 26353664 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1951890 data_alloc: 285212672 data_used: 14266368
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:02.503663+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147816448 unmapped: 26337280 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579836ed400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 236 handle_osd_map epochs [236,237], i have 236, src has [1,237]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:03.503819+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147816448 unmapped: 26337280 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 237 heartbeat osd_stat(store_statfs(0x1b59b0000/0x0/0x1bfc00000, data 0x3c1d8c6/0x3d7e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 237 ms_handle_reset con 0x5579836ed400 session 0x55798136d0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:04.503995+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147816448 unmapped: 26337280 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a13800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 237 ms_handle_reset con 0x557982a13800 session 0x557983f7a780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:05.504196+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147816448 unmapped: 26337280 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 237 ms_handle_reset con 0x557982389000 session 0x557983f7a1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccdc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:06.504344+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 147841024 unmapped: 26312704 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 237 ms_handle_reset con 0x557982ccdc00 session 0x55798528d680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 238 ms_handle_reset con 0x557982389000 session 0x5579823da000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a13800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 238 ms_handle_reset con 0x557982a13800 session 0x5579823a85a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982bb3400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:07.504501+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1973913 data_alloc: 285212672 data_used: 14299136
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 148987904 unmapped: 25165824 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 238 ms_handle_reset con 0x557982bb3400 session 0x5579852b50e0
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata"} v 0)
Feb 01 10:08:55 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/35268448' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 239 ms_handle_reset con 0x55798163f800 session 0x557985a105a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:08.504646+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 149020672 unmapped: 25133056 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557983da6400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d28800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 239 ms_handle_reset con 0x557982d28800 session 0x557983f7ad20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 239 ms_handle_reset con 0x55798163f800 session 0x557980b95a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 239 heartbeat osd_stat(store_statfs(0x1b5935000/0x0/0x1bfc00000, data 0x3c8a11a/0x3df4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.542766571s of 10.474296570s, submitted: 232
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:09.504833+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 239 handle_osd_map epochs [240,240], i have 239, src has [1,240]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 240 ms_handle_reset con 0x557983da6400 session 0x557982a0c1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 149086208 unmapped: 25067520 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:10.505085+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 149053440 unmapped: 25100288 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 240 heartbeat osd_stat(store_statfs(0x1b5923000/0x0/0x1bfc00000, data 0x3ca014a/0x3e0b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 240 ms_handle_reset con 0x557982389000 session 0x5579853ce000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:11.505374+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 149094400 unmapped: 25059328 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 240 heartbeat osd_stat(store_statfs(0x1b5911000/0x0/0x1bfc00000, data 0x3cb1df0/0x3e1d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a13800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 240 ms_handle_reset con 0x557982a13800 session 0x5579823db860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:12.505560+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1985871 data_alloc: 285212672 data_used: 14290944
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 149094400 unmapped: 25059328 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:13.505719+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 149135360 unmapped: 25018368 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982c17c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:14.505850+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 150274048 unmapped: 23879680 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 240 ms_handle_reset con 0x557982c17c00 session 0x5579803f83c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 240 heartbeat osd_stat(store_statfs(0x1b58de000/0x0/0x1bfc00000, data 0x3ce6dc8/0x3e50000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 240 ms_handle_reset con 0x55798163f800 session 0x557983f40f00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:15.506002+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 153149440 unmapped: 21004288 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:16.506182+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 152920064 unmapped: 21233664 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 240 handle_osd_map epochs [240,241], i have 240, src has [1,241]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 241 handle_osd_map epochs [241,241], i have 241, src has [1,241]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 241 ms_handle_reset con 0x557982389000 session 0x55798136d0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:17.506388+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2096707 data_alloc: 285212672 data_used: 14303232
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 152920064 unmapped: 21233664 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 241 handle_osd_map epochs [242,242], i have 241, src has [1,242]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:18.506537+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987779000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 152928256 unmapped: 21225472 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 242 ms_handle_reset con 0x557987779000 session 0x557983df8960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 242 heartbeat osd_stat(store_statfs(0x1b4bff000/0x0/0x1bfc00000, data 0x49c1083/0x4b2f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.037853241s of 10.004800797s, submitted: 269
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:19.506737+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 152887296 unmapped: 21266432 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982e36400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 242 ms_handle_reset con 0x557982e36400 session 0x557983cc5a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 242 ms_handle_reset con 0x557982420800 session 0x557980b94b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 16K writes, 62K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.01 MB/s
                                                          Cumulative WAL: 16K writes, 5389 syncs, 3.08 writes per sync, written: 0.05 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 38K keys, 11K commit groups, 1.0 writes per commit group, ingest: 30.50 MB, 0.05 MB/s
                                                          Interval WAL: 11K writes, 4649 syncs, 2.40 writes per sync, written: 0.03 GB, 0.05 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:20.506905+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 152911872 unmapped: 21241856 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 242 handle_osd_map epochs [243,243], i have 242, src has [1,243]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 243 ms_handle_reset con 0x55798163f800 session 0x5579829d7c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:21.507096+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 152911872 unmapped: 21241856 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:22.507270+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2027538 data_alloc: 285212672 data_used: 14319616
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 153157632 unmapped: 20996096 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 243 handle_osd_map epochs [244,244], i have 243, src has [1,244]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:23.507446+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 244 ms_handle_reset con 0x557982389000 session 0x55798136c960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 152469504 unmapped: 21684224 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 244 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:24.507611+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 244 ms_handle_reset con 0x557982421000 session 0x557980b943c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798497b400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 244 heartbeat osd_stat(store_statfs(0x1b5835000/0x0/0x1bfc00000, data 0x3d8359a/0x3ef7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 153534464 unmapped: 20619264 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:25.507774+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 245 ms_handle_reset con 0x55798497b400 session 0x55798277f2c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 153804800 unmapped: 20348928 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 245 handle_osd_map epochs [245,245], i have 245, src has [1,245]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777b000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 245 ms_handle_reset con 0x55798777b000 session 0x55798136c5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:26.507882+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 245 handle_osd_map epochs [246,246], i have 245, src has [1,246]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 246 ms_handle_reset con 0x55798163f800 session 0x5579823a72c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 153812992 unmapped: 20340736 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 246 handle_osd_map epochs [245,246], i have 246, src has [1,246]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 246 ms_handle_reset con 0x557982389000 session 0x557983d561e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 246 handle_osd_map epochs [246,247], i have 246, src has [1,247]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 247 ms_handle_reset con 0x557982421000 session 0x5579823a6b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:27.508057+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2044030 data_alloc: 285212672 data_used: 14344192
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 153870336 unmapped: 20283392 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e4c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 247 ms_handle_reset con 0x5579823e4c00 session 0x5579823a10e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:28.508220+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 153976832 unmapped: 20176896 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 247 heartbeat osd_stat(store_statfs(0x1b57c5000/0x0/0x1bfc00000, data 0x3df320a/0x3f65000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:29.508385+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 247 handle_osd_map epochs [247,248], i have 247, src has [1,248]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.247528076s of 10.151292801s, submitted: 192
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 153976832 unmapped: 20176896 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987778800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 248 ms_handle_reset con 0x557987778800 session 0x55798165ed20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:30.508625+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 154001408 unmapped: 20152320 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 248 ms_handle_reset con 0x557982389000 session 0x55798165e960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 248 ms_handle_reset con 0x55798163f800 session 0x5579823a5e00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:31.508783+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 154017792 unmapped: 20135936 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 249 handle_osd_map epochs [249,249], i have 249, src has [1,249]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579823e4c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 249 ms_handle_reset con 0x5579823e4c00 session 0x55798528d860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 249 handle_osd_map epochs [250,250], i have 249, src has [1,250]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 250 ms_handle_reset con 0x557982421000 session 0x557983df8d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557983da5c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 250 heartbeat osd_stat(store_statfs(0x1b57bc000/0x0/0x1bfc00000, data 0x3df8559/0x3f71000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 250 ms_handle_reset con 0x557983da5c00 session 0x5579852e0780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:32.508976+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2055924 data_alloc: 285212672 data_used: 14368768
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 154058752 unmapped: 20094976 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557983da5c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 250 ms_handle_reset con 0x557983da5c00 session 0x557982a0c960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 250 ms_handle_reset con 0x557982421000 session 0x5579816e21e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 250 heartbeat osd_stat(store_statfs(0x1b57b5000/0x0/0x1bfc00000, data 0x3dfadd0/0x3f77000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:33.509138+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 154058752 unmapped: 20094976 heap: 174153728 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c9000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a11800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:34.509309+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 171196416 unmapped: 19759104 heap: 190955520 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:35.509446+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 154058752 unmapped: 45301760 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:36.509601+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 169844736 unmapped: 29515776 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:37.509793+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 250 ms_handle_reset con 0x55798777a000 session 0x557983df8d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3197290 data_alloc: 285212672 data_used: 14368768
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 162529280 unmapped: 36831232 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:38.509941+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 250 heartbeat osd_stat(store_statfs(0x1aafb5000/0x0/0x1bfc00000, data 0xe5faea9/0xe779000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 154148864 unmapped: 45211648 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 250 handle_osd_map epochs [251,251], i have 250, src has [1,251]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 251 handle_osd_map epochs [251,251], i have 251, src has [1,251]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:39.510086+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 8.063263893s of 10.029263496s, submitted: 147
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 158416896 unmapped: 40943616 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 251 heartbeat osd_stat(store_statfs(0x1a4bb2000/0x0/0x1bfc00000, data 0x149fd7b9/0x14b7c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:40.510247+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 154329088 unmapped: 45031424 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:41.510412+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 158883840 unmapped: 40476672 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 251 ms_handle_reset con 0x5579842c9400 session 0x55798165e960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:42.510554+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 251 heartbeat osd_stat(store_statfs(0x19fbb1000/0x0/0x1bfc00000, data 0x199fd88d/0x19b7d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4734946 data_alloc: 285212672 data_used: 14381056
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a10400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 164569088 unmapped: 34791424 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 251 ms_handle_reset con 0x557982a10400 session 0x5579823a6b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 251 ms_handle_reset con 0x557982421000 session 0x55798136c5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:43.510690+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 164937728 unmapped: 34422784 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:44.510819+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 251 handle_osd_map epochs [252,252], i have 251, src has [1,252]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 161120256 unmapped: 38240256 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:45.510988+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 161415168 unmapped: 37945344 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 252 heartbeat osd_stat(store_statfs(0x194bb0000/0x0/0x1bfc00000, data 0x24a0005f/0x24b7e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [0,0,0,0,0,0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:46.511152+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 165904384 unmapped: 33456128 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 252 handle_osd_map epochs [253,253], i have 252, src has [1,253]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 253 ms_handle_reset con 0x557982a11800 session 0x55798240da40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 253 ms_handle_reset con 0x5579842c9000 session 0x557980b952c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557983da5c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 253 ms_handle_reset con 0x557983da5c00 session 0x557980b95a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579842c9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:47.511350+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5999719 data_alloc: 285212672 data_used: 14405632
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157720576 unmapped: 41639936 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 253 heartbeat osd_stat(store_statfs(0x1927ab000/0x0/0x1bfc00000, data 0x26e0287f/0x26f82000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,2,3,4] op hist [2,1,0,2])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 253 ms_handle_reset con 0x55798777a000 session 0x5579823f9680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 253 ms_handle_reset con 0x5579842c9400 session 0x557983f7ad20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:48.511485+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 253 ms_handle_reset con 0x55798777a000 session 0x5579823a6d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156327936 unmapped: 43032576 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:49.511665+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156327936 unmapped: 43032576 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 253 handle_osd_map epochs [254,254], i have 253, src has [1,254]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 7.384384155s of 10.196742058s, submitted: 364
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 254 ms_handle_reset con 0x557982421000 session 0x5579823a8780
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:50.511823+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 254 heartbeat osd_stat(store_statfs(0x1b53aa000/0x0/0x1bfc00000, data 0x3e051c1/0x3f83000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156336128 unmapped: 43024384 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:51.512031+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982a11800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 255 ms_handle_reset con 0x557982a11800 session 0x5579803f90e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156344320 unmapped: 43016192 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557983da5c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 255 handle_osd_map epochs [256,256], i have 255, src has [1,256]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 256 ms_handle_reset con 0x557983da5c00 session 0x557983df9a40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557986ab9000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:52.512201+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2240108 data_alloc: 285212672 data_used: 14405632
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155648000 unmapped: 43712512 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 256 ms_handle_reset con 0x557986ab9000 session 0x5579823a74a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 256 ms_handle_reset con 0x557982421000 session 0x557983f7a5a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:53.512389+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798551a400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 257 ms_handle_reset con 0x55798551a400 session 0x557980b97860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155672576 unmapped: 43687936 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d65c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 257 ms_handle_reset con 0x557982d65c00 session 0x55798528c1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:54.512551+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155672576 unmapped: 43687936 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798065c800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 257 ms_handle_reset con 0x55798065c800 session 0x55798165fc20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:55.512695+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 257 handle_osd_map epochs [258,258], i have 257, src has [1,258]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155721728 unmapped: 43638784 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 258 heartbeat osd_stat(store_statfs(0x1b539c000/0x0/0x1bfc00000, data 0x3e0cd9b/0x3f92000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:56.512918+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155721728 unmapped: 43638784 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d65c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 259 ms_handle_reset con 0x557982d65c00 session 0x55798240d680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 259 ms_handle_reset con 0x557982421000 session 0x55798240cb40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798551a400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 259 ms_handle_reset con 0x55798551a400 session 0x557983cc5860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:57.513131+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2255539 data_alloc: 285212672 data_used: 14417920
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156778496 unmapped: 42582016 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777a800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 259 ms_handle_reset con 0x55798777a800 session 0x55798165f0e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d29c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:58.513330+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 259 ms_handle_reset con 0x557982d29c00 session 0x5579852b52c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156803072 unmapped: 42557440 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 259 ms_handle_reset con 0x557982421000 session 0x557983f7b860
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:59.526528+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156803072 unmapped: 42557440 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 259 handle_osd_map epochs [259,260], i have 259, src has [1,260]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.721676826s of 10.206754684s, submitted: 156
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982d65c00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:00.526683+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 260 heartbeat osd_stat(store_statfs(0x1b5397000/0x0/0x1bfc00000, data 0x3e11e62/0x3f97000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 260 ms_handle_reset con 0x557982d65c00 session 0x5579852b4b40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156819456 unmapped: 42541056 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798551a400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 260 ms_handle_reset con 0x55798551a400 session 0x5579823a85a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:01.526831+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156835840 unmapped: 42524672 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777a800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 260 handle_osd_map epochs [261,261], i have 260, src has [1,261]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 261 ms_handle_reset con 0x55798777a800 session 0x5579856345a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982389000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 261 ms_handle_reset con 0x557982389000 session 0x55798528da40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982421000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:02.526957+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 261 ms_handle_reset con 0x557982421000 session 0x557982a1a3c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2260249 data_alloc: 285212672 data_used: 14442496
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155803648 unmapped: 43556864 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:03.527090+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155811840 unmapped: 43548672 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 262 handle_osd_map epochs [263,263], i have 262, src has [1,263]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:04.527236+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155828224 unmapped: 43532288 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557981272000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798163f400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 263 ms_handle_reset con 0x55798163f400 session 0x557982a0d680
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 263 ms_handle_reset con 0x557981272000 session 0x55798137a3c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 263 heartbeat osd_stat(store_statfs(0x1b538a000/0x0/0x1bfc00000, data 0x3e1c0b1/0x3fa3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:05.527410+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155844608 unmapped: 43515904 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 263 heartbeat osd_stat(store_statfs(0x1b538a000/0x0/0x1bfc00000, data 0x3e1c0b1/0x3fa3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798493bc00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 263 ms_handle_reset con 0x55798493bc00 session 0x55798082e960
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982ccf400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:06.527532+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155844608 unmapped: 43515904 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 263 ms_handle_reset con 0x557982ccf400 session 0x557983d56d20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 263 handle_osd_map epochs [263,264], i have 263, src has [1,264]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:07.527683+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x5579844d9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 264 ms_handle_reset con 0x5579844d9400 session 0x557983f410e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2271670 data_alloc: 285212672 data_used: 14454784
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155852800 unmapped: 43507712 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 264 handle_osd_map epochs [265,265], i have 264, src has [1,265]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 265 heartbeat osd_stat(store_statfs(0x1b5386000/0x0/0x1bfc00000, data 0x3e1e8e1/0x3fa7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:08.527866+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155639808 unmapped: 43720704 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557982420800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 266 ms_handle_reset con 0x557982420800 session 0x5579823e9c20
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:09.528003+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155680768 unmapped: 43679744 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 266 handle_osd_map epochs [267,267], i have 266, src has [1,267]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.205329895s of 10.003293037s, submitted: 262
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798777ac00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:10.528159+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 267 ms_handle_reset con 0x55798777ac00 session 0x557982a0c1e0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155705344 unmapped: 43655168 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:11.528323+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155705344 unmapped: 43655168 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:12.528510+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2285748 data_alloc: 285212672 data_used: 14454784
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155713536 unmapped: 43646976 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 268 heartbeat osd_stat(store_statfs(0x1b5376000/0x0/0x1bfc00000, data 0x3e290cb/0x3fb6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 268 handle_osd_map epochs [268,269], i have 268, src has [1,269]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:13.528665+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155754496 unmapped: 43606016 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:14.528797+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155754496 unmapped: 43606016 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:15.528916+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155754496 unmapped: 43606016 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:16.529083+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155754496 unmapped: 43606016 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 269 handle_osd_map epochs [269,270], i have 269, src has [1,270]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.17] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.12] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.8] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.18] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.b] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[3.e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:17.529320+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2293020 data_alloc: 285212672 data_used: 14467072
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155770880 unmapped: 43589632 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 270 handle_osd_map epochs [271,271], i have 270, src has [1,271]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 271 handle_osd_map epochs [270,271], i have 271, src has [1,271]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:18.529453+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 271 heartbeat osd_stat(store_statfs(0x1b536f000/0x0/0x1bfc00000, data 0x3e2e336/0x3fbe000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155803648 unmapped: 43556864 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:19.529635+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155803648 unmapped: 43556864 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 271 handle_osd_map epochs [271,272], i have 271, src has [1,272]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.455296516s of 10.043914795s, submitted: 191
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:20.529774+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155820032 unmapped: 43540480 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:21.529927+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155820032 unmapped: 43540480 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 272 handle_osd_map epochs [272,273], i have 272, src has [1,273]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:22.530119+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2299996 data_alloc: 285212672 data_used: 14467072
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155828224 unmapped: 43532288 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:23.530324+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155828224 unmapped: 43532288 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:24.530461+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 273 heartbeat osd_stat(store_statfs(0x1b5365000/0x0/0x1bfc00000, data 0x3e35dbe/0x3fc8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 155828224 unmapped: 43532288 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 61
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:25.530929+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156065792 unmapped: 43294720 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:26.531137+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156065792 unmapped: 43294720 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 273 handle_osd_map epochs [274,274], i have 273, src has [1,274]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:27.531323+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2308502 data_alloc: 285212672 data_used: 14479360
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156073984 unmapped: 43286528 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:28.531453+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156082176 unmapped: 43278336 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:29.531619+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156082176 unmapped: 43278336 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:30.531777+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b535f000/0x0/0x1bfc00000, data 0x3e38804/0x3fce000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156082176 unmapped: 43278336 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b535f000/0x0/0x1bfc00000, data 0x3e38804/0x3fce000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:31.531948+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.483486176s of 11.748614311s, submitted: 85
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156106752 unmapped: 43253760 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:32.532126+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b535f000/0x0/0x1bfc00000, data 0x3e386f3/0x3fcd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2307764 data_alloc: 285212672 data_used: 14479360
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156106752 unmapped: 43253760 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:33.532268+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156106752 unmapped: 43253760 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:34.532464+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156106752 unmapped: 43253760 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:35.532633+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b535f000/0x0/0x1bfc00000, data 0x3e386f3/0x3fcd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156106752 unmapped: 43253760 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:36.532788+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156114944 unmapped: 43245568 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b535f000/0x0/0x1bfc00000, data 0x3e386f3/0x3fcd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:37.532980+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2307764 data_alloc: 285212672 data_used: 14479360
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156114944 unmapped: 43245568 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:38.552140+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156114944 unmapped: 43245568 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:39.552365+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156123136 unmapped: 43237376 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:40.552650+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156123136 unmapped: 43237376 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:41.552917+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.994858742s of 10.004966736s, submitted: 2
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156123136 unmapped: 43237376 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b5361000/0x0/0x1bfc00000, data 0x3e3867d/0x3fcd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:42.553274+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2306580 data_alloc: 285212672 data_used: 14479360
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156114944 unmapped: 43245568 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:43.553583+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156114944 unmapped: 43245568 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:44.553719+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156147712 unmapped: 43212800 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:45.553919+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156155904 unmapped: 43204608 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:46.554076+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156155904 unmapped: 43204608 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:47.554279+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2312772 data_alloc: 285212672 data_used: 14479360
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156155904 unmapped: 43204608 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b535d000/0x0/0x1bfc00000, data 0x3e388e1/0x3fd1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:48.554603+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156155904 unmapped: 43204608 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b535d000/0x0/0x1bfc00000, data 0x3e388e1/0x3fd1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:49.554897+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156155904 unmapped: 43204608 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:50.555040+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156155904 unmapped: 43204608 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:51.555231+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.926241875s of 10.014874458s, submitted: 15
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156155904 unmapped: 43204608 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:52.555410+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2316598 data_alloc: 285212672 data_used: 14495744
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156164096 unmapped: 43196416 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:53.555554+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156164096 unmapped: 43196416 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:54.555677+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 156164096 unmapped: 43196416 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 heartbeat osd_stat(store_statfs(0x1b535c000/0x0/0x1bfc00000, data 0x3e388e1/0x3fd1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 handle_osd_map epochs [275,275], i have 274, src has [1,275]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 274 handle_osd_map epochs [275,275], i have 275, src has [1,275]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:55.555921+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157220864 unmapped: 42139648 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:56.556170+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157220864 unmapped: 42139648 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:57.556352+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2318038 data_alloc: 285212672 data_used: 14508032
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157220864 unmapped: 42139648 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:58.556552+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157229056 unmapped: 42131456 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:59.556681+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157229056 unmapped: 42131456 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:00.556886+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 275 heartbeat osd_stat(store_statfs(0x1b535a000/0x0/0x1bfc00000, data 0x3e3b1cd/0x3fd4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157237248 unmapped: 42123264 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:01.557073+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157237248 unmapped: 42123264 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 275 handle_osd_map epochs [276,276], i have 275, src has [1,276]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.189381599s of 10.408231735s, submitted: 45
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:02.557502+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2324848 data_alloc: 285212672 data_used: 14520320
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157253632 unmapped: 42106880 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 276 heartbeat osd_stat(store_statfs(0x1b5353000/0x0/0x1bfc00000, data 0x3e3da9a/0x3fd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:03.557644+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 276 heartbeat osd_stat(store_statfs(0x1b5353000/0x0/0x1bfc00000, data 0x3e3da9a/0x3fd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157253632 unmapped: 42106880 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:04.557789+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157253632 unmapped: 42106880 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:05.557994+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157253632 unmapped: 42106880 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:06.558145+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157253632 unmapped: 42106880 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:07.558342+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2324848 data_alloc: 285212672 data_used: 14520320
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157253632 unmapped: 42106880 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:08.558501+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157261824 unmapped: 42098688 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:09.558684+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 276 heartbeat osd_stat(store_statfs(0x1b5353000/0x0/0x1bfc00000, data 0x3e3da9a/0x3fd9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157261824 unmapped: 42098688 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:10.558837+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157261824 unmapped: 42098688 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:11.559046+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157261824 unmapped: 42098688 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:12.559238+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.541307449s of 10.609442711s, submitted: 29
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2324392 data_alloc: 285212672 data_used: 14520320
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157270016 unmapped: 42090496 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 276 heartbeat osd_stat(store_statfs(0x1b5356000/0x0/0x1bfc00000, data 0x3e3d93a/0x3fd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:13.559442+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157270016 unmapped: 42090496 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:14.559598+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 276 handle_osd_map epochs [277,277], i have 276, src has [1,277]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157302784 unmapped: 42057728 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 277 handle_osd_map epochs [277,278], i have 277, src has [1,278]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 278 handle_osd_map epochs [277,278], i have 278, src has [1,278]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:15.559736+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157343744 unmapped: 42016768 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:16.559846+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157351936 unmapped: 42008576 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:17.560042+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 278 heartbeat osd_stat(store_statfs(0x1b534f000/0x0/0x1bfc00000, data 0x3e42b76/0x3fdd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2330470 data_alloc: 285212672 data_used: 14532608
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157351936 unmapped: 42008576 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:18.560202+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157351936 unmapped: 42008576 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:19.560354+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157351936 unmapped: 42008576 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:20.560490+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157351936 unmapped: 42008576 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:21.560700+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157351936 unmapped: 42008576 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 278 handle_osd_map epochs [279,279], i have 278, src has [1,279]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 278 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 278 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:22.560822+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2338932 data_alloc: 285212672 data_used: 14544896
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157368320 unmapped: 41992192 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:23.561092+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5336000/0x0/0x1bfc00000, data 0x3e59529/0x3ff6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157368320 unmapped: 41992192 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.196981430s of 11.580804825s, submitted: 120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:24.561377+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 157376512 unmapped: 41984000 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:25.561591+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 158613504 unmapped: 40747008 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:26.561824+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 158613504 unmapped: 40747008 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:27.562110+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2346696 data_alloc: 285212672 data_used: 14544896
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 158613504 unmapped: 40747008 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:28.562263+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 158941184 unmapped: 40419328 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:29.562496+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b52d7000/0x0/0x1bfc00000, data 0x3eba358/0x4057000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 159064064 unmapped: 40296448 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:30.562799+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 159064064 unmapped: 40296448 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:31.563057+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 159522816 unmapped: 39837696 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:32.563215+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2357866 data_alloc: 285212672 data_used: 14544896
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 159522816 unmapped: 39837696 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:33.563418+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 159522816 unmapped: 39837696 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5243000/0x0/0x1bfc00000, data 0x3f4bd96/0x40ea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.604010582s of 10.009015083s, submitted: 65
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:34.563615+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 160907264 unmapped: 38453248 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b5229000/0x0/0x1bfc00000, data 0x3f66fc2/0x4105000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:35.563759+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 161316864 unmapped: 38043648 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:36.563978+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b51ca000/0x0/0x1bfc00000, data 0x3fc5586/0x4164000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 161316864 unmapped: 38043648 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:37.564185+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2366558 data_alloc: 285212672 data_used: 14544896
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 161800192 unmapped: 37560320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:38.564361+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 161800192 unmapped: 37560320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557986ab9400
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:39.564501+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 161087488 unmapped: 38273024 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:40.564709+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 279 heartbeat osd_stat(store_statfs(0x1b51a4000/0x0/0x1bfc00000, data 0x3febebc/0x418a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 279 handle_osd_map epochs [280,280], i have 279, src has [1,280]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 279 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 279 handle_osd_map epochs [280,280], i have 280, src has [1,280]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 160849920 unmapped: 38510592 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:41.564884+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 160849920 unmapped: 38510592 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:42.565094+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2365354 data_alloc: 285212672 data_used: 14557184
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 160849920 unmapped: 38510592 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:43.565283+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 160849920 unmapped: 38510592 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.708598137s of 10.004078865s, submitted: 78
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:44.565445+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 162430976 unmapped: 36929536 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 280 heartbeat osd_stat(store_statfs(0x1b5128000/0x0/0x1bfc00000, data 0x4063e18/0x4206000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:45.565669+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 162430976 unmapped: 36929536 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:46.565887+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 280 handle_osd_map epochs [281,281], i have 280, src has [1,281]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 163127296 unmapped: 36233216 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:47.566075+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2392296 data_alloc: 285212672 data_used: 14573568
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 163135488 unmapped: 36225024 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:48.566278+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b50a0000/0x0/0x1bfc00000, data 0x40e9932/0x428d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 163143680 unmapped: 36216832 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:49.566492+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 163348480 unmapped: 36012032 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:50.566675+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b505b000/0x0/0x1bfc00000, data 0x412f78f/0x42d2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 163348480 unmapped: 36012032 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:51.566876+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 163348480 unmapped: 36012032 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:52.567044+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2397196 data_alloc: 285212672 data_used: 14573568
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 163364864 unmapped: 35995648 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b505b000/0x0/0x1bfc00000, data 0x412f78f/0x42d2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:53.567242+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 163364864 unmapped: 35995648 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.492445946s of 10.003144264s, submitted: 117
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:54.567402+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 163594240 unmapped: 35766272 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:55.567611+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 164020224 unmapped: 35340288 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:56.567830+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 165076992 unmapped: 34283520 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:57.568133+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b4f83000/0x0/0x1bfc00000, data 0x4205e95/0x43aa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2411678 data_alloc: 285212672 data_used: 14573568
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 165076992 unmapped: 34283520 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:58.568363+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 165306368 unmapped: 34054144 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:59.568553+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 164528128 unmapped: 34832384 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 281 heartbeat osd_stat(store_statfs(0x1b4f84000/0x0/0x1bfc00000, data 0x4205ee9/0x43aa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 281 handle_osd_map epochs [282,282], i have 281, src has [1,282]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 281 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 281 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:00.568811+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 164544512 unmapped: 34816000 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 282 handle_osd_map epochs [282,282], i have 282, src has [1,282]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:01.568965+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 282 heartbeat osd_stat(store_statfs(0x1b3d72000/0x0/0x1bfc00000, data 0x4274874/0x441c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 164880384 unmapped: 34480128 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:02.569152+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2433278 data_alloc: 285212672 data_used: 14585856
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 165019648 unmapped: 34340864 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:03.569352+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 165019648 unmapped: 34340864 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.507069588s of 10.000179291s, submitted: 105
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 282 heartbeat osd_stat(store_statfs(0x1b3d2a000/0x0/0x1bfc00000, data 0x42bb071/0x4463000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:04.569543+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 164642816 unmapped: 34717696 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:05.569718+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 282 handle_osd_map epochs [283,283], i have 282, src has [1,283]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 283 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 164716544 unmapped: 34643968 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:06.569902+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 283 heartbeat osd_stat(store_statfs(0x1b3cc9000/0x0/0x1bfc00000, data 0x431d01a/0x44c4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 283 handle_osd_map epochs [284,284], i have 283, src has [1,284]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 164872192 unmapped: 34488320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:07.570078+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 284 handle_osd_map epochs [284,284], i have 284, src has [1,284]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2440080 data_alloc: 285212672 data_used: 14602240
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 165289984 unmapped: 34070528 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:08.570257+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 284 handle_osd_map epochs [284,284], i have 284, src has [1,284]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 165289984 unmapped: 34070528 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:09.570415+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 166715392 unmapped: 32645120 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:10.570603+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 167215104 unmapped: 32145408 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 284 heartbeat osd_stat(store_statfs(0x1b3c1f000/0x0/0x1bfc00000, data 0x43c5015/0x456e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7a6f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:11.570788+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 167067648 unmapped: 32292864 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 284 handle_osd_map epochs [285,285], i have 284, src has [1,285]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:12.570925+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 285 handle_osd_map epochs [285,285], i have 285, src has [1,285]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2457102 data_alloc: 285212672 data_used: 14614528
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 285 handle_osd_map epochs [286,286], i have 285, src has [1,286]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 167264256 unmapped: 32096256 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:13.571074+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 166469632 unmapped: 32890880 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 286 handle_osd_map epochs [286,286], i have 286, src has [1,286]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.367613792s of 10.000089645s, submitted: 212
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:14.571264+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 166649856 unmapped: 32710656 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:15.571441+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 167755776 unmapped: 31604736 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:16.571638+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 286 heartbeat osd_stat(store_statfs(0x1b36d9000/0x0/0x1bfc00000, data 0x4508e1c/0x46b4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 168312832 unmapped: 31047680 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:17.571896+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2469788 data_alloc: 285212672 data_used: 14614528
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 167985152 unmapped: 31375360 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:18.572048+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 170237952 unmapped: 29122560 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:19.572216+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 286 heartbeat osd_stat(store_statfs(0x1b24b7000/0x0/0x1bfc00000, data 0x458c2cd/0x4737000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 286 heartbeat osd_stat(store_statfs(0x1b24b7000/0x0/0x1bfc00000, data 0x458c2cd/0x4737000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 286 handle_osd_map epochs [287,287], i have 286, src has [1,287]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 169476096 unmapped: 29884416 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:20.572444+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 287 heartbeat osd_stat(store_statfs(0x1b2457000/0x0/0x1bfc00000, data 0x45e9de5/0x4796000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 169558016 unmapped: 29802496 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:21.572605+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 287 handle_osd_map epochs [288,288], i have 287, src has [1,288]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 169566208 unmapped: 29794304 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:22.572775+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 288 handle_osd_map epochs [288,288], i have 288, src has [1,288]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 288 heartbeat osd_stat(store_statfs(0x1b242f000/0x0/0x1bfc00000, data 0x460eb0b/0x47bc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2495770 data_alloc: 285212672 data_used: 14626816
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 171163648 unmapped: 28196864 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:23.572957+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 171163648 unmapped: 28196864 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.494943619s of 10.105772018s, submitted: 161
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:24.573137+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 288 heartbeat osd_stat(store_statfs(0x1b23b0000/0x0/0x1bfc00000, data 0x468df75/0x483d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 171540480 unmapped: 27820032 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:25.573334+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 170336256 unmapped: 29024256 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:26.573478+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 288 heartbeat osd_stat(store_statfs(0x1b2340000/0x0/0x1bfc00000, data 0x46fad8c/0x48ac000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 288 handle_osd_map epochs [289,289], i have 288, src has [1,289]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 170369024 unmapped: 28991488 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:27.573686+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 289 handle_osd_map epochs [289,289], i have 289, src has [1,289]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 289 heartbeat osd_stat(store_statfs(0x1b22fa000/0x0/0x1bfc00000, data 0x4740178/0x48f3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 289 handle_osd_map epochs [290,290], i have 289, src has [1,290]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 290 handle_osd_map epochs [290,290], i have 290, src has [1,290]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2529024 data_alloc: 285212672 data_used: 14639104
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 171786240 unmapped: 27574272 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 290 handle_osd_map epochs [290,290], i have 290, src has [1,290]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:28.573795+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 173121536 unmapped: 26238976 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:29.573921+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 290 handle_osd_map epochs [291,291], i have 290, src has [1,291]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 291 heartbeat osd_stat(store_statfs(0x1b21fd000/0x0/0x1bfc00000, data 0x483bee8/0x49f0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 173129728 unmapped: 26230784 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:30.574079+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 173146112 unmapped: 26214400 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:31.601704+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 172351488 unmapped: 27009024 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:32.601850+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2541486 data_alloc: 285212672 data_used: 14651392
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 172351488 unmapped: 27009024 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:33.602029+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 291 heartbeat osd_stat(store_statfs(0x1b2167000/0x0/0x1bfc00000, data 0x48d5a64/0x4a87000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 172662784 unmapped: 26697728 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:34.602151+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.642681122s of 10.362320900s, submitted: 221
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 291 heartbeat osd_stat(store_statfs(0x1b20f9000/0x0/0x1bfc00000, data 0x4942517/0x4af5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:35.602281+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 172777472 unmapped: 26583040 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:36.602439+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 173842432 unmapped: 25518080 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 291 handle_osd_map epochs [291,292], i have 291, src has [1,292]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:37.602622+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 173858816 unmapped: 25501696 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2560028 data_alloc: 285212672 data_used: 14659584
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:38.602810+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 176439296 unmapped: 22921216 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:39.602947+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 176447488 unmapped: 22913024 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 292 handle_osd_map epochs [293,293], i have 292, src has [1,293]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 292 handle_osd_map epochs [293,293], i have 293, src has [1,293]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 293 handle_osd_map epochs [293,293], i have 293, src has [1,293]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:40.603129+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 176455680 unmapped: 22904832 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 293 handle_osd_map epochs [293,293], i have 293, src has [1,293]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 293 heartbeat osd_stat(store_statfs(0x1b1fd3000/0x0/0x1bfc00000, data 0x4a69195/0x4c1a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:41.603351+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 175693824 unmapped: 23666688 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:42.603495+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 175710208 unmapped: 23650304 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2560966 data_alloc: 285212672 data_used: 14671872
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:43.603640+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 175710208 unmapped: 23650304 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:44.603792+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 175734784 unmapped: 23625728 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.508491516s of 10.010738373s, submitted: 127
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:45.603963+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 175734784 unmapped: 23625728 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:46.604150+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 175734784 unmapped: 23625728 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 293 handle_osd_map epochs [294,294], i have 293, src has [1,294]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fd3000/0x0/0x1bfc00000, data 0x4a6914e/0x4c1a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:47.604365+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 175734784 unmapped: 23625728 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fd3000/0x0/0x1bfc00000, data 0x4a6914e/0x4c1a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [0,0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2566198 data_alloc: 285212672 data_used: 14684160
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:48.604563+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 176799744 unmapped: 22560768 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fcd000/0x0/0x1bfc00000, data 0x4a6b9b9/0x4c1e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:49.604707+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 176807936 unmapped: 22552576 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:50.604902+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 176807936 unmapped: 22552576 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 ms_handle_reset con 0x557986ab9400 session 0x557982bc34a0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:51.605049+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178159616 unmapped: 21200896 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 62
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:52.605174+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178364416 unmapped: 20996096 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2568246 data_alloc: 285212672 data_used: 14684160
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:53.605354+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178364416 unmapped: 20996096 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 63
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:54.605495+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178569216 unmapped: 20791296 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fcf000/0x0/0x1bfc00000, data 0x4a6ba0c/0x4c1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.743238449s of 10.003871918s, submitted: 661
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fcc000/0x0/0x1bfc00000, data 0x4a6ba54/0x4c1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:55.605641+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178585600 unmapped: 20774912 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fcc000/0x0/0x1bfc00000, data 0x4a6ba54/0x4c1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:56.605785+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178585600 unmapped: 20774912 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fcc000/0x0/0x1bfc00000, data 0x4a6ba54/0x4c1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:57.605936+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178577408 unmapped: 20783104 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2568586 data_alloc: 285212672 data_used: 14684160
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:58.606108+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178577408 unmapped: 20783104 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:59.606319+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178577408 unmapped: 20783104 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:00.606555+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178593792 unmapped: 20766720 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fce000/0x0/0x1bfc00000, data 0x4a6ba0d/0x4c1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:01.606691+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178593792 unmapped: 20766720 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:02.606898+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178593792 unmapped: 20766720 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fcd000/0x0/0x1bfc00000, data 0x4a6ba54/0x4c1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2568714 data_alloc: 285212672 data_used: 14684160
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:03.607136+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178593792 unmapped: 20766720 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:04.607333+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178593792 unmapped: 20766720 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.057021141s of 10.148558617s, submitted: 15
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:05.607503+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fcf000/0x0/0x1bfc00000, data 0x4a6ba51/0x4c1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178593792 unmapped: 20766720 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:06.607670+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fce000/0x0/0x1bfc00000, data 0x4a6ba0d/0x4c1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178593792 unmapped: 20766720 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:07.607879+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178593792 unmapped: 20766720 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fcd000/0x0/0x1bfc00000, data 0x4a6b9e1/0x4c1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2569616 data_alloc: 285212672 data_used: 14684160
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:08.608069+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178610176 unmapped: 20750336 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:09.608240+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178610176 unmapped: 20750336 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:10.608436+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178610176 unmapped: 20750336 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fd0000/0x0/0x1bfc00000, data 0x4a6b9b1/0x4c1e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:11.608626+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178610176 unmapped: 20750336 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fd0000/0x0/0x1bfc00000, data 0x4a6b9b1/0x4c1e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:12.608807+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178610176 unmapped: 20750336 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2566726 data_alloc: 285212672 data_used: 14684160
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:13.608954+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178610176 unmapped: 20750336 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:14.609138+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178610176 unmapped: 20750336 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.271762848s of 10.327589989s, submitted: 10
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:15.609333+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178610176 unmapped: 20750336 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1fd1000/0x0/0x1bfc00000, data 0x4a6b916/0x4c1d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:16.609521+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178626560 unmapped: 20733952 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:17.609730+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178626560 unmapped: 20733952 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2568798 data_alloc: 285212672 data_used: 14684160
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:18.609892+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178634752 unmapped: 20725760 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:19.610109+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178823168 unmapped: 20537344 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:20.610337+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178823168 unmapped: 20537344 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:21.610524+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1f56000/0x0/0x1bfc00000, data 0x4ae53bd/0x4c98000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178823168 unmapped: 20537344 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:22.610702+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 178905088 unmapped: 20455424 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2589480 data_alloc: 285212672 data_used: 14684160
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:23.610864+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 179200000 unmapped: 20160512 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:24.611052+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 179216384 unmapped: 20144128 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1ec4000/0x0/0x1bfc00000, data 0x4b766b1/0x4d29000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [0,1,0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:25.611248+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 180338688 unmapped: 19021824 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.233828545s of 10.547332764s, submitted: 55
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:26.611468+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 180338688 unmapped: 19021824 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:27.611694+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 180625408 unmapped: 18735104 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2600568 data_alloc: 285212672 data_used: 14684160
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:28.611840+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 180109312 unmapped: 19251200 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:29.612028+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 180436992 unmapped: 18923520 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1e00000/0x0/0x1bfc00000, data 0x4c3a968/0x4ded000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [0,0,0,0,1])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:30.612164+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 180633600 unmapped: 18726912 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:31.612341+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 182009856 unmapped: 17350656 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:32.612484+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 182018048 unmapped: 17342464 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:33.612644+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2605406 data_alloc: 285212672 data_used: 14684160
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 182304768 unmapped: 17055744 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:34.612830+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 182411264 unmapped: 16949248 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1ce0000/0x0/0x1bfc00000, data 0x4d59e97/0x4f0e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:35.612973+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1ce0000/0x0/0x1bfc00000, data 0x4d59e97/0x4f0e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 182427648 unmapped: 16932864 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.697450638s of 10.034008026s, submitted: 63
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:36.613146+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 182427648 unmapped: 16932864 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:37.613339+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 182878208 unmapped: 16482304 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:38.613480+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2617690 data_alloc: 285212672 data_used: 14684160
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 183934976 unmapped: 15425536 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 heartbeat osd_stat(store_statfs(0x1b1c58000/0x0/0x1bfc00000, data 0x4de4dce/0x4f96000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:39.613637+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184213504 unmapped: 15147008 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 294 handle_osd_map epochs [295,295], i have 294, src has [1,295]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:40.613801+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184311808 unmapped: 15048704 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 295 heartbeat osd_stat(store_statfs(0x1b1c13000/0x0/0x1bfc00000, data 0x4e27ad1/0x4fda000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:41.613948+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184320000 unmapped: 15040512 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:42.614101+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:43.614256+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2629216 data_alloc: 285212672 data_used: 14696448
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:44.614422+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 295 heartbeat osd_stat(store_statfs(0x1b1bcb000/0x0/0x1bfc00000, data 0x4e6e2ca/0x5021000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:45.614567+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 295 heartbeat osd_stat(store_statfs(0x1b1bcd000/0x0/0x1bfc00000, data 0x4e6e2c8/0x5021000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:46.614736+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:47.614934+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 295 handle_osd_map epochs [296,296], i have 295, src has [1,296]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.644596100s of 11.976144791s, submitted: 81
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.16] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.6] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[6.9] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 296 handle_osd_map epochs [296,296], i have 296, src has [1,296]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:48.615119+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2626450 data_alloc: 285212672 data_used: 14708736
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:49.615265+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:50.615458+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:51.615657+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 296 heartbeat osd_stat(store_statfs(0x1b1bc8000/0x0/0x1bfc00000, data 0x4e70aa1/0x5025000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:52.615866+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:53.616032+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2628378 data_alloc: 285212672 data_used: 14712832
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:54.616216+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:55.616341+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184328192 unmapped: 15032320 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 296 heartbeat osd_stat(store_statfs(0x1b1bca000/0x0/0x1bfc00000, data 0x4e70a06/0x5024000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:56.616528+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184147968 unmapped: 15212544 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:57.616712+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184147968 unmapped: 15212544 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:58.616897+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2627322 data_alloc: 285212672 data_used: 14712832
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184147968 unmapped: 15212544 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:59.617015+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.771873474s of 11.822182655s, submitted: 26
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184147968 unmapped: 15212544 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:00.617197+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184147968 unmapped: 15212544 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 296 heartbeat osd_stat(store_statfs(0x1b1bc8000/0x0/0x1bfc00000, data 0x4e70a36/0x5025000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:01.617345+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184147968 unmapped: 15212544 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:02.617522+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184147968 unmapped: 15212544 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:03.617664+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2627626 data_alloc: 285212672 data_used: 14712832
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 296 heartbeat osd_stat(store_statfs(0x1b1bc8000/0x0/0x1bfc00000, data 0x4e70a36/0x5025000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184147968 unmapped: 15212544 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:04.617846+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184156160 unmapped: 15204352 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:05.618020+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184156160 unmapped: 15204352 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:06.618174+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184156160 unmapped: 15204352 heap: 199360512 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:07.618350+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184754176 unmapped: 16703488 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 296 heartbeat osd_stat(store_statfs(0x1b13c8000/0x0/0x1bfc00000, data 0x5670a46/0x5826000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:08.618510+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2714784 data_alloc: 285212672 data_used: 14712832
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184623104 unmapped: 16834560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557980b5a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 296 handle_osd_map epochs [297,297], i have 296, src has [1,297]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:09.618680+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184623104 unmapped: 16834560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 297 handle_osd_map epochs [298,298], i have 297, src has [1,298]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 10.616436958s of 10.714769363s, submitted: 12
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 297 handle_osd_map epochs [297,298], i have 298, src has [1,298]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 298 handle_osd_map epochs [297,298], i have 298, src has [1,298]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:10.618807+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 298 ms_handle_reset con 0x557980b5a000 session 0x55798518c3c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184623104 unmapped: 16834560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:11.618920+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184623104 unmapped: 16834560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 298 heartbeat osd_stat(store_statfs(0x1b1bc1000/0x0/0x1bfc00000, data 0x4e75c96/0x502c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 298 heartbeat osd_stat(store_statfs(0x1b1bc1000/0x0/0x1bfc00000, data 0x4e75c96/0x502c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:12.619095+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184623104 unmapped: 16834560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:13.619280+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2635354 data_alloc: 285212672 data_used: 14725120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184623104 unmapped: 16834560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:14.619533+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184623104 unmapped: 16834560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:15.619733+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 298 heartbeat osd_stat(store_statfs(0x1b1bc2000/0x0/0x1bfc00000, data 0x4e75bfb/0x502b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184623104 unmapped: 16834560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:16.619928+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184623104 unmapped: 16834560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:17.620139+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _renew_subs
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 298 handle_osd_map epochs [299,299], i have 298, src has [1,299]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.14] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.0] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.3] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1d] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1e] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.2] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.11] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 scrub-queue::remove_from_osd_queue removing pg[4.f] failed. State was: not registered w/ OSD
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184623104 unmapped: 16834560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:18.620356+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2639948 data_alloc: 285212672 data_used: 14725120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184623104 unmapped: 16834560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:19.620523+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184786944 unmapped: 16670720 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1b61000/0x0/0x1bfc00000, data 0x4ed53be/0x508d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:20.620764+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184795136 unmapped: 16662528 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:21.620977+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184795136 unmapped: 16662528 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.960550308s of 12.114134789s, submitted: 44
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:22.621148+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184901632 unmapped: 16556032 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:23.621369+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2650744 data_alloc: 285212672 data_used: 14725120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184901632 unmapped: 16556032 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1b1c000/0x0/0x1bfc00000, data 0x4f19687/0x50d2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:24.621587+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 184901632 unmapped: 16556032 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:25.621780+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 185057280 unmapped: 16400384 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:26.621933+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186105856 unmapped: 15351808 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1ad2000/0x0/0x1bfc00000, data 0x4f634ff/0x511c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:27.622115+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186105856 unmapped: 15351808 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:28.622328+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2650330 data_alloc: 285212672 data_used: 14725120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186105856 unmapped: 15351808 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:29.622579+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186105856 unmapped: 15351808 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:30.622744+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186105856 unmapped: 15351808 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:31.623263+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186122240 unmapped: 15335424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:32.623457+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1a99000/0x0/0x1bfc00000, data 0x4f9dcab/0x5155000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186122240 unmapped: 15335424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:33.623648+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2654474 data_alloc: 285212672 data_used: 14725120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1a99000/0x0/0x1bfc00000, data 0x4f9dcab/0x5155000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186122240 unmapped: 15335424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 11.720134735s of 11.819780350s, submitted: 16
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:34.623817+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186122240 unmapped: 15335424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:35.624019+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186122240 unmapped: 15335424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:36.624245+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186122240 unmapped: 15335424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:37.624479+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186122240 unmapped: 15335424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1a6b000/0x0/0x1bfc00000, data 0x4fcb86c/0x5183000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:38.624678+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:08:55.774 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2655770 data_alloc: 285212672 data_used: 14725120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186122240 unmapped: 15335424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:39.624896+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186122240 unmapped: 15335424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:40.625087+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1a57000/0x0/0x1bfc00000, data 0x4fdfb11/0x5197000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186122240 unmapped: 15335424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:41.625241+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186130432 unmapped: 15327232 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:42.625401+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186130432 unmapped: 15327232 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:43.625570+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2659898 data_alloc: 285212672 data_used: 14725120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186130432 unmapped: 15327232 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.933634758s of 10.001874924s, submitted: 12
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:44.625733+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186138624 unmapped: 15319040 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:45.625893+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186294272 unmapped: 15163392 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:46.626130+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b19fb000/0x0/0x1bfc00000, data 0x503ad78/0x51f3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186294272 unmapped: 15163392 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b19fb000/0x0/0x1bfc00000, data 0x503ad78/0x51f3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:47.626343+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186294272 unmapped: 15163392 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:48.626502+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2663252 data_alloc: 285212672 data_used: 14725120
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186294272 unmapped: 15163392 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:49.626714+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186400768 unmapped: 15056896 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:50.626889+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186400768 unmapped: 15056896 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:51.627024+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b19b2000/0x0/0x1bfc00000, data 0x50852f4/0x523c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186400768 unmapped: 15056896 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:52.627191+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1987000/0x0/0x1bfc00000, data 0x50afb42/0x5267000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1987000/0x0/0x1bfc00000, data 0x50afb42/0x5267000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186400768 unmapped: 15056896 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:53.627392+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2668644 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186400768 unmapped: 15056896 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 9.882411957s of 10.003607750s, submitted: 21
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:54.627560+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186449920 unmapped: 15007744 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:55.627752+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186449920 unmapped: 15007744 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:56.627938+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186449920 unmapped: 15007744 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:57.628108+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186449920 unmapped: 15007744 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:58.628251+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2669496 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b195e000/0x0/0x1bfc00000, data 0x50d9030/0x5290000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186449920 unmapped: 15007744 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:59.628378+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186564608 unmapped: 14893056 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:00.628535+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186572800 unmapped: 14884864 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:01.628686+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186572800 unmapped: 14884864 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:02.628825+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186572800 unmapped: 14884864 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:03.628965+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186572800 unmapped: 14884864 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:04.629115+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186572800 unmapped: 14884864 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:05.629355+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186572800 unmapped: 14884864 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:06.629544+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186572800 unmapped: 14884864 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:07.629759+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186572800 unmapped: 14884864 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:08.629943+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:09.630107+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:10.630328+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:11.630477+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:12.630665+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:13.630818+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:14.631020+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:15.631204+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:16.631351+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186597376 unmapped: 14860288 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:17.631558+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186580992 unmapped: 14876672 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:18.631702+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186580992 unmapped: 14876672 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:19.631850+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186580992 unmapped: 14876672 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 ms_handle_reset con 0x557982ccec00 session 0x5579811f92c0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x557987aa3800
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:20.632000+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186580992 unmapped: 14876672 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 ms_handle_reset con 0x557982bb2000 session 0x5579823dba40
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: handle_auth_request added challenge on 0x55798493a000
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:21.632150+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186580992 unmapped: 14876672 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:22.632283+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186580992 unmapped: 14876672 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:23.632465+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186580992 unmapped: 14876672 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:24.632652+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:25.632824+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:26.632970+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:27.633369+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:28.633643+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:29.633902+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:30.634161+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:31.634342+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186589184 unmapped: 14868480 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:32.634463+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186605568 unmapped: 14852096 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:33.634726+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186605568 unmapped: 14852096 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:34.634894+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:35.635118+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:36.635310+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:37.635533+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:38.635711+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:39.635939+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:40.636103+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:41.636333+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:42.636542+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:43.636723+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:44.636885+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:45.637014+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:46.637153+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:47.637325+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:48.637459+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:49.637639+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:50.637835+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:51.638010+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:52.638194+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:53.638356+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:54.638510+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186613760 unmapped: 14843904 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:55.638672+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186621952 unmapped: 14835712 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:56.638823+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:57.639007+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:58.639216+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:59.639346+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:00.639495+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:01.639642+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:02.639825+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:03.639973+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:04.640093+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:05.640209+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:06.640348+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:07.640490+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:08.640617+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:09.640737+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:10.640904+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:11.641024+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186646528 unmapped: 14811136 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:12.641168+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186646528 unmapped: 14811136 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:13.641323+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186646528 unmapped: 14811136 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:14.641493+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186646528 unmapped: 14811136 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:15.641655+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186646528 unmapped: 14811136 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:16.641793+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e59fd/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186646528 unmapped: 14811136 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:17.641952+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186646528 unmapped: 14811136 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:18.642107+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2670020 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186646528 unmapped: 14811136 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:19.642251+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186654720 unmapped: 14802944 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:20.642398+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore(/var/lib/ceph/osd/ceph-5) _kv_sync_thread utilization: idle 87.037597656s of 87.056594849s, submitted: 5
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 ms_handle_reset con 0x557982389400 session 0x5579823f8f00
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:21.643398+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Got map version 64
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:22.643526+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:23.643645+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:24.643821+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:25.643962+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:26.644146+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:27.644374+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:28.644559+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:29.644729+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:30.644914+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:31.645046+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:32.645901+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187138048 unmapped: 14319616 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:33.646105+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187146240 unmapped: 14311424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:34.649604+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187146240 unmapped: 14311424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:35.649720+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 187146240 unmapped: 14311424 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:36.653820+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:37.653978+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:38.655174+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:39.655351+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:40.655822+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:41.655967+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:42.656120+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:43.656270+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:44.656647+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:45.657018+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:46.657435+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:47.658475+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:48.658653+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:49.658896+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:50.659116+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:51.659348+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186630144 unmapped: 14827520 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:52.659540+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:53.659658+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:54.659865+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:55.660054+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:56.660258+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:57.660498+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:58.660665+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:59.660862+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186646528 unmapped: 14811136 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:00.661052+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186654720 unmapped: 14802944 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:01.661240+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186654720 unmapped: 14802944 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:02.661430+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186654720 unmapped: 14802944 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:03.661611+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186654720 unmapped: 14802944 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:04.661768+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186654720 unmapped: 14802944 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:05.661891+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186654720 unmapped: 14802944 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:06.662070+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186654720 unmapped: 14802944 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:07.662281+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186654720 unmapped: 14802944 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:08.662490+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186662912 unmapped: 14794752 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:09.662667+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186662912 unmapped: 14794752 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:10.662824+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186662912 unmapped: 14794752 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:11.662966+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186662912 unmapped: 14794752 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:12.663347+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186662912 unmapped: 14794752 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:13.663522+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186662912 unmapped: 14794752 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:14.663733+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186662912 unmapped: 14794752 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:15.663938+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186662912 unmapped: 14794752 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:16.664106+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186671104 unmapped: 14786560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:17.664317+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186671104 unmapped: 14786560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:18.664475+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186671104 unmapped: 14786560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:19.664578+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186671104 unmapped: 14786560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:20.664681+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186671104 unmapped: 14786560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:21.664817+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186671104 unmapped: 14786560 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:22.664940+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 186638336 unmapped: 14819328 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: do_command 'config diff' '{prefix=config diff}'
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: do_command 'config show' '{prefix=config show}'
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: do_command 'counter dump' '{prefix=counter dump}'
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: do_command 'counter schema' '{prefix=counter schema}'
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:23.665082+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: osd.5 299 heartbeat osd_stat(store_statfs(0x1b1951000/0x0/0x1bfc00000, data 0x50e5c10/0x529d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x900f9b7), peers [0,1,2,3,4] op hist [])
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 185753600 unmapped: 15704064 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: bluestore.MempoolThread(0x55797ef89b60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2667684 data_alloc: 285212672 data_used: 14737408
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: tick
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_tickets
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:24.665214+0000)
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: prioritycache tune_memory target: 5709082009 mapped: 185688064 unmapped: 15769600 heap: 201457664 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:55 np0005604215.localdomain ceph-osd[32318]: do_command 'log dump' '{prefix=log dump}'
Feb 01 10:08:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60010 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69692 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:55 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49413 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr module ls"} v 0)
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/779323452' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49428 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69704 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60025 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.69671 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.49401 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3402448316' entity='client.admin' cmd={"prefix": "mgr dump"} : dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/35268448' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.60010 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.69692 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.49413 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/731191173' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3202185444' entity='client.admin' cmd={"prefix": "mgr metadata"} : dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/779323452' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.49428 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.69704 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.60025 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3583931005' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/4222843250' entity='client.admin' cmd={"prefix": "mgr module ls"} : dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr services"} v 0)
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4080868927' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69719 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v799: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49440 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60046 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:56 np0005604215.localdomain crontab[321471]: (root) LIST (root)
Feb 01 10:08:56 np0005604215.localdomain ceph-mgr[278126]: [progress INFO root] Writing back 50 completed events
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command([{prefix=config-key set, key=mgr/progress/completed}] v 0)
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr versions"} v 0)
Feb 01 10:08:56 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4019706384' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 01 10:08:56 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69740 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49452 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60061 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49464 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60076 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/4080868927' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.69719 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: pgmap v799: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.49440 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.60046 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3080013294' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3882748401' entity='client.admin' cmd={"prefix": "mgr services"} : dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/4019706384' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.69740 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.49452 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.60061 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='mgr.34541 ' entity='mgr.np0005604215.uhhqtv' 
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2183146669' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/4249651255' entity='client.admin' cmd={"prefix": "mgr versions"} : dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mon stat"} v 0)
Feb 01 10:08:57 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2527544172' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69767 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:08:57.788+0000 7f941d3ee640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:57 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:57 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49476 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:57 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60091 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} v 0)
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1428303598' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49494 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:08:58.569+0000 7f941d3ee640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:58 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v800: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "node ls"} v 0)
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1957382609' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr dump", "format": "json-pretty"} v 0)
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1971514350' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: from='client.49464 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: from='client.60076 -' entity='client.admin' cmd=[{"prefix": "balancer eval", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1654686825' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2527544172' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: from='client.69767 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: from='client.49476 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/6990767' entity='client.admin' cmd={"prefix": "mon stat"} : dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: from='client.60091 -' entity='client.admin' cmd=[{"prefix": "balancer status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1428303598' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60115 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:58 np0005604215.localdomain ceph-mgr[278126]: mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:58 np0005604215.localdomain ceph-33fac0b9-80c7-560f-918a-c92d3021ca1e-mgr-np0005604215-uhhqtv[278122]: 2026-02-01T10:08:58.766+0000 7f941d3ee640 -1 mgr.server reply reply (95) Operation not supported Module 'prometheus' is not enabled/loaded (required by command 'healthcheck history ls'): use `ceph mgr module enable prometheus` to enable it
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd crush class ls"} v 0)
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3789203360' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr metadata", "format": "json-pretty"} v 0)
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/395360189' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd crush dump"} v 0)
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/644898619' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr module ls", "format": "json-pretty"} v 0)
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3276737372' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.49494 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: pgmap v800: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1957382609' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1971514350' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/34763222' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.60115 -' entity='client.admin' cmd=[{"prefix": "healthcheck history ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2804368661' entity='client.admin' cmd={"prefix": "node ls"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/4286154104' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1980956082' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3789203360' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/395360189' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3471190857' entity='client.admin' cmd={"prefix": "log last", "channel": "cephadm", "format": "json-pretty"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/353988189' entity='client.admin' cmd={"prefix": "osd crush class ls"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2140666632' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1639595185' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/644898619' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/779386158' entity='client.admin' cmd={"prefix": "mgr dump", "format": "json-pretty"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3276737372' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/66361793' entity='client.admin' cmd={"prefix": "osd crush dump"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:32.533977+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b8dbf000/0x0/0x1bfc00000, data 0x2c4e572/0x2ccf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:33.534197+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:34.534377+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:35.534538+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 958471 data_alloc: 301989888 data_used: 12951552
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:36.534706+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 88 ms_handle_refused con 0x55aabe97f000 session 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:37.534867+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b8dbf000/0x0/0x1bfc00000, data 0x2c4e572/0x2ccf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:38.535053+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:39.535212+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:40.535371+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 958471 data_alloc: 301989888 data_used: 12951552
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b8dbf000/0x0/0x1bfc00000, data 0x2c4e572/0x2ccf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:41.535507+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:42.535656+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:43.535844+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:44.536001+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:45.536151+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 958471 data_alloc: 301989888 data_used: 12951552
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b8dbf000/0x0/0x1bfc00000, data 0x2c4e572/0x2ccf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:46.536502+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:47.536685+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:48.536882+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b8dbf000/0x0/0x1bfc00000, data 0x2c4e572/0x2ccf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:49.537038+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 88 ms_handle_refused con 0x55aabe97f000 session 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:50.537280+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 88 heartbeat osd_stat(store_statfs(0x1b8dbf000/0x0/0x1bfc00000, data 0x2c4e572/0x2ccf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 958471 data_alloc: 301989888 data_used: 12951552
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:51.537475+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:52.537560+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98697216 unmapped: 2531328 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:53.537755+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98705408 unmapped: 2523136 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 45
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now 
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc reconnect Terminating session with v2:172.18.0.105:6800/155238379
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc reconnect No active mgr available yet
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:54.537911+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe491800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604215 at v2:172.18.0.108:3300/0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 88 handle_osd_map epochs [89,89], i have 88, src has [1,89]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 72.237083435s of 72.296730042s, submitted: 12
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98631680 unmapped: 2596864 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 46
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.200:6800/2795591711,v1:172.18.0.200:6801/2795591711]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc reconnect Starting new session with [v2:172.18.0.200:6800/2795591711,v1:172.18.0.200:6801/2795591711]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: get_auth_request con 0x55aabfce2000 auth_method 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_configure stats_period=5
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:55.538074+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 962671 data_alloc: 301989888 data_used: 12963840
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b8dba000/0x0/0x1bfc00000, data 0x2c50be6/0x2cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98754560 unmapped: 2473984 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeb01c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:56.538211+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7d000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98861056 unmapped: 2367488 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b8dbb000/0x0/0x1bfc00000, data 0x2c50be6/0x2cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:57.538400+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 47
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.200:6800/2795591711,v1:172.18.0.200:6801/2795591711]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98861056 unmapped: 2367488 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:58.538632+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98861056 unmapped: 2367488 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b8dbb000/0x0/0x1bfc00000, data 0x2c50be6/0x2cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:46:59.538792+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98861056 unmapped: 2367488 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:00.539091+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 961791 data_alloc: 301989888 data_used: 12963840
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98861056 unmapped: 2367488 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_monmap mon_map magic: 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient:  got monmap 14 from mon.np0005604215 (according to old e14)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: dump:
                                                          epoch 14
                                                          fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
                                                          last_changed 2026-02-01T09:47:31.128772+0000
                                                          created 2026-02-01T07:37:52.883666+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: mon.np0005604215 at [v2:172.18.0.108:3300/0,v1:172.18.0.108:6789/0] went away
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _reopen_session rank -1
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _add_conns ranks=[1,0]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): picked mon.np0005604213 con 0x55aabfd16c00 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): picked mon.np0005604212 con 0x55aabe97f400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): start opening mon connection
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): start opening mon connection
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): _renew_subs
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): _finish_auth 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): get_auth_request con 0x55aabfd16c00 auth_method 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): _init_auth method 2
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): _init_auth already have auth, reseting
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): handle_auth_reply_more payload 9
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): get_auth_request con 0x55aabe97f400 auth_method 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): _init_auth method 2
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): _init_auth already have auth, reseting
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): handle_auth_done global_id 14277 payload 293
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _finish_hunting 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: found mon.np0005604213
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604213 at v2:172.18.0.104:3300/0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _finish_auth 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:01.141188+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604213 at v2:172.18.0.104:3300/0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: ms_handle_reset current mon [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _reopen_session rank -1
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _add_conns ranks=[0,1]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): picked mon.np0005604212 con 0x55aabe97f400 addr [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): picked mon.np0005604213 con 0x55aabcff2000 addr [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): start opening mon connection
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): start opening mon connection
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): _renew_subs
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 ms_handle_reset con 0x55aabfd16c00 session 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): get_auth_request con 0x55aabcff2000 auth_method 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): _init_auth method 2
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): _init_auth already have auth, reseting
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): handle_auth_reply_more payload 9
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): get_auth_request con 0x55aabe97f400 auth_method 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): get_auth_request method 2 preferred_modes [2,1]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): _init_auth method 2
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): _init_auth already have auth, reseting
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): handle_auth_reply_more payload 9
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): handle_auth_reply_more payload_len 9
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): handle_auth_reply_more responding with 132 bytes
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient(hunting): handle_auth_done global_id 14277 payload 293
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _finish_hunting 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: found mon.np0005604212
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _finish_auth 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:01.151063+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_monmap mon_map magic: 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient:  got monmap 14 from mon.np0005604212 (according to old e14)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: dump:
                                                          epoch 14
                                                          fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
                                                          last_changed 2026-02-01T09:47:31.128772+0000
                                                          created 2026-02-01T07:37:52.883666+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_config config(7 keys)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: set_mon_vals no callback set
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 47
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.200:6800/2795591711,v1:172.18.0.200:6801/2795591711]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:01.539266+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98861056 unmapped: 2367488 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:02.539496+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98861056 unmapped: 2367488 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:03.539739+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b8dbb000/0x0/0x1bfc00000, data 0x2c50be6/0x2cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98861056 unmapped: 2367488 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:04.539905+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98861056 unmapped: 2367488 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:05.540090+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 961791 data_alloc: 301989888 data_used: 12963840
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98861056 unmapped: 2367488 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b8dbb000/0x0/0x1bfc00000, data 0x2c50be6/0x2cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:06.540255+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:07.540432+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:08.540662+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:09.540894+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:10.541121+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 961791 data_alloc: 301989888 data_used: 12963840
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:11.541274+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b8dbb000/0x0/0x1bfc00000, data 0x2c50be6/0x2cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:12.541470+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:13.541684+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:14.541849+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:15.542006+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 961791 data_alloc: 301989888 data_used: 12963840
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:16.542157+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b8dbb000/0x0/0x1bfc00000, data 0x2c50be6/0x2cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:17.542380+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:18.542497+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:19.563442+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98869248 unmapped: 2359296 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_monmap mon_map magic: 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient:  got monmap 15 from mon.np0005604212 (according to old e15)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: dump:
                                                          epoch 15
                                                          fsid 33fac0b9-80c7-560f-918a-c92d3021ca1e
                                                          last_changed 2026-02-01T09:47:50.388496+0000
                                                          created 2026-02-01T07:37:52.883666+0000
                                                          min_mon_release 18 (reef)
                                                          election_strategy: 1
                                                          0: [v2:172.18.0.103:3300/0,v1:172.18.0.103:6789/0] mon.np0005604212
                                                          1: [v2:172.18.0.104:3300/0,v1:172.18.0.104:6789/0] mon.np0005604213
                                                          2: [v2:172.18.0.105:3300/0,v1:172.18.0.105:6789/0] mon.np0005604215
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:20.563614+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 961791 data_alloc: 301989888 data_used: 12963840
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:21.563763+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b8dbb000/0x0/0x1bfc00000, data 0x2c50be6/0x2cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:22.563943+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:23.564119+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:24.564328+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:25.564489+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 961791 data_alloc: 301989888 data_used: 12963840
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:26.564696+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:27.564880+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b8dbb000/0x0/0x1bfc00000, data 0x2c50be6/0x2cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:28.565079+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:29.565330+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:30.565493+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 961791 data_alloc: 301989888 data_used: 12963840
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b8dbb000/0x0/0x1bfc00000, data 0x2c50be6/0x2cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:31.565669+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:32.565836+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 heartbeat osd_stat(store_statfs(0x1b8dbb000/0x0/0x1bfc00000, data 0x2c50be6/0x2cd3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:33.566044+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98877440 unmapped: 2351104 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 48
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now 
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc reconnect Terminating session with v2:172.18.0.200:6800/2795591711
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc reconnect No active mgr available yet
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:34.566519+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 ms_handle_reset con 0x55aabdd7d000 session 0x55aabfaa4960
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 ms_handle_reset con 0x55aabeb01c00 session 0x55aabdbb1e00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 ms_handle_reset con 0x55aabe491800 session 0x55aabfc6b0e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 89 handle_osd_map epochs [90,90], i have 89, src has [1,90]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 40.047794342s of 40.113094330s, submitted: 12
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7d000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99024896 unmapped: 2203648 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:35.566679+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 49
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/1840881422,v1:172.18.0.107:6811/1840881422]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc reconnect Starting new session with [v2:172.18.0.107:6810/1840881422,v1:172.18.0.107:6811/1840881422]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: get_auth_request con 0x55aabcff3000 auth_method 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_configure stats_period=5
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965991 data_alloc: 301989888 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98975744 unmapped: 2252800 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db6000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:36.566817+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcb800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0591400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99024896 unmapped: 2203648 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 50
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/1840881422,v1:172.18.0.107:6811/1840881422]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:37.566954+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99024896 unmapped: 2203648 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:38.567100+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99024896 unmapped: 2203648 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 51
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/1840881422,v1:172.18.0.107:6811/1840881422]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:39.567262+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99041280 unmapped: 2187264 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 52
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.107:6810/1840881422,v1:172.18.0.107:6811/1840881422]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:40.567365+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 301989888 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:41.567503+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:42.567867+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:43.568159+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:44.568511+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:45.568811+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 301989888 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:46.569055+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:47.569257+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:48.569428+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:49.569583+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:50.569760+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 301989888 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:51.569919+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:52.570065+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:53.570255+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:54.570415+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:55.570724+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 301989888 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:56.570938+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:57.571140+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:58.571329+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:47:59.571579+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:00.571778+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 301989888 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:01.571862+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:02.572057+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:03.572253+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:04.572429+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:05.572632+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 301989888 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:06.572834+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:07.573048+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:08.573198+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:09.573386+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:10.573556+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 301989888 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:11.574107+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:12.574772+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:13.576699+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:14.577431+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98746368 unmapped: 2482176 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:15.578113+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 301989888 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98754560 unmapped: 2473984 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:16.578872+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98754560 unmapped: 2473984 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:17.579102+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98754560 unmapped: 2473984 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:18.579600+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98754560 unmapped: 2473984 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:19.579817+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98754560 unmapped: 2473984 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:20.580316+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 301989888 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98754560 unmapped: 2473984 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:21.580859+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98754560 unmapped: 2473984 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:22.581188+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98754560 unmapped: 2473984 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:23.581567+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98762752 unmapped: 2465792 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:24.581717+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98762752 unmapped: 2465792 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:25.581939+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 285212672 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98762752 unmapped: 2465792 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:26.582103+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98762752 unmapped: 2465792 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:27.582426+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98770944 unmapped: 2457600 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:28.582711+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98770944 unmapped: 2457600 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:29.583076+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98770944 unmapped: 2457600 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:30.583276+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 285212672 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98770944 unmapped: 2457600 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:31.583477+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98770944 unmapped: 2457600 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:32.583666+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98770944 unmapped: 2457600 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:33.583908+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98770944 unmapped: 2457600 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:34.584123+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98779136 unmapped: 2449408 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:35.584374+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 285212672 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98779136 unmapped: 2449408 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:36.584535+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98779136 unmapped: 2449408 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:37.584719+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98779136 unmapped: 2449408 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:38.584869+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98779136 unmapped: 2449408 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:39.584957+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98779136 unmapped: 2449408 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:40.585148+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 285212672 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98779136 unmapped: 2449408 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:41.585345+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98779136 unmapped: 2449408 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:42.585571+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98787328 unmapped: 2441216 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:43.585770+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98787328 unmapped: 2441216 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:44.586086+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98787328 unmapped: 2441216 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:45.586344+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 285212672 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98787328 unmapped: 2441216 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:46.586518+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98787328 unmapped: 2441216 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:47.586717+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98787328 unmapped: 2441216 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:48.586971+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98787328 unmapped: 2441216 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:49.587143+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 98787328 unmapped: 2441216 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:50.587341+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 965111 data_alloc: 285212672 data_used: 12976128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 90 handle_osd_map epochs [90,91], i have 90, src has [1,91]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 76.510437012s of 76.573638916s, submitted: 12
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 53
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now 
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc reconnect Terminating session with v2:172.18.0.107:6810/1840881422
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc reconnect No active mgr available yet
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 ms_handle_reset con 0x55aac0591400 session 0x55aabfc6d680
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 ms_handle_reset con 0x55aabfbcb800 session 0x55aabeaf4f00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 ms_handle_reset con 0x55aabdd7d000 session 0x55aabdbb0f00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db7000/0x0/0x1bfc00000, data 0x2c533b2/0x2cd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabcff3000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99041280 unmapped: 2187264 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:51.587453+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 54
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc reconnect Starting new session with [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: get_auth_request con 0x55aabfbcb800 auth_method 0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_configure stats_period=5
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99041280 unmapped: 2187264 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:52.587648+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97e800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 55
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeb01800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99123200 unmapped: 2105344 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:53.587818+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99123200 unmapped: 2105344 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:54.588016+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99123200 unmapped: 2105344 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:55.588216+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 56
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99123200 unmapped: 2105344 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:56.588369+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 57
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:57.588528+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:58.588702+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:48:59.588901+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:00.589125+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:01.589345+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:02.589500+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:03.589685+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:04.589815+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:05.589983+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:06.590131+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:07.590428+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:08.590667+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:09.590845+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:10.591100+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:11.591377+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:12.591622+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:13.591894+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:14.592096+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:15.592369+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:16.592603+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:17.592809+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:18.592961+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:19.593145+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:20.593359+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:21.593539+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:22.593710+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:23.593942+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:24.594137+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:25.594271+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:26.594458+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:27.594630+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:28.594812+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:29.594985+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:30.595144+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:31.595318+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:32.595532+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:33.595816+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:34.595997+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99328000 unmapped: 1900544 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:35.596195+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99336192 unmapped: 1892352 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:36.596372+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99336192 unmapped: 1892352 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:37.596577+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99336192 unmapped: 1892352 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:38.596736+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99336192 unmapped: 1892352 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:39.596931+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99336192 unmapped: 1892352 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:40.597116+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99336192 unmapped: 1892352 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:41.597396+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99336192 unmapped: 1892352 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:42.597577+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99336192 unmapped: 1892352 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:43.597792+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99344384 unmapped: 1884160 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:44.597944+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99344384 unmapped: 1884160 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:45.598139+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99344384 unmapped: 1884160 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:46.598384+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99344384 unmapped: 1884160 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:47.598613+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99344384 unmapped: 1884160 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:48.598810+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 1875968 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:49.598964+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 1875968 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:50.599167+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 1875968 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:51.599346+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 1875968 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:52.599515+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 1875968 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:53.599774+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 1875968 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:54.599970+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 1875968 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:55.600174+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 1875968 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:56.600355+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 1875968 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:57.600562+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 1875968 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:58.600791+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99352576 unmapped: 1875968 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:49:59.600963+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:00.601123+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:01.601277+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:02.601503+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:03.601671+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:04.601870+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:05.602164+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:06.602352+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:07.602561+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:08.602771+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99368960 unmapped: 1859584 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:09.603013+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99368960 unmapped: 1859584 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:10.603212+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99368960 unmapped: 1859584 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:11.603466+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99368960 unmapped: 1859584 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:12.603680+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:13.603969+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:14.604151+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:15.604344+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 7800.1 total, 600.0 interval
                                                          Cumulative writes: 5755 writes, 24K keys, 5755 commit groups, 1.0 writes per commit group, ingest: 0.02 GB, 0.00 MB/s
                                                          Cumulative WAL: 5755 writes, 912 syncs, 6.31 writes per sync, written: 0.02 GB, 0.00 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 250 writes, 446 keys, 250 commit groups, 1.0 writes per commit group, ingest: 0.42 MB, 0.00 MB/s
                                                          Interval WAL: 250 writes, 125 syncs, 2.00 writes per sync, written: 0.00 GB, 0.00 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99360768 unmapped: 1867776 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:16.604526+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99368960 unmapped: 1859584 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:17.604744+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99368960 unmapped: 1859584 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:18.604931+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99368960 unmapped: 1859584 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:19.605090+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99368960 unmapped: 1859584 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:20.605278+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99368960 unmapped: 1859584 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 969311 data_alloc: 285212672 data_used: 12988416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:21.605458+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99377152 unmapped: 1851392 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:22.605617+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99377152 unmapped: 1851392 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 58
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:23.605821+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99418112 unmapped: 1810432 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 heartbeat osd_stat(store_statfs(0x1b8db2000/0x0/0x1bfc00000, data 0x2c55cd6/0x2cdb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125a000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 93.035514832s of 93.098289490s, submitted: 12
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:24.605972+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99434496 unmapped: 1794048 heap: 101228544 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:25.606108+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 91 handle_osd_map epochs [92,92], i have 91, src has [1,92]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 92 ms_handle_reset con 0x55aac125a000 session 0x55aabf9094a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100057088 unmapped: 15859712 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1061926 data_alloc: 285212672 data_used: 13004800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:26.606262+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100057088 unmapped: 15859712 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:27.606387+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100057088 unmapped: 15859712 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97cc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:28.606489+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100040704 unmapped: 15876096 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 92 heartbeat osd_stat(store_statfs(0x1b793c000/0x0/0x1bfc00000, data 0x40c8637/0x4152000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:29.606608+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100147200 unmapped: 15769600 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 92 handle_osd_map epochs [92,93], i have 92, src has [1,93]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 handle_osd_map epochs [93,93], i have 93, src has [1,93]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b793c000/0x0/0x1bfc00000, data 0x40c8637/0x4152000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [1,0,0,1])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabe97cc00 session 0x55aabf909860
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:30.606748+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:31.606886+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:32.607037+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:33.607255+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:34.607426+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:35.607583+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:36.607726+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd crush rule ls"} v 0)
Feb 01 10:08:59 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3887349745' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:37.607884+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:38.608003+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:39.608192+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:40.608349+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:41.608480+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:42.608642+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:43.608814+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:44.608976+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:45.609170+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:46.609358+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:47.609539+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:48.609705+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:49.609863+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:50.610035+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:51.610188+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:52.610377+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:53.610676+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:54.610849+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:55.610992+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:56.611151+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:57.611368+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:58.611548+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:50:59.611744+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:00.611935+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:01.612112+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:02.612277+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:03.612529+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:04.612708+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:05.612899+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:06.613095+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:07.616087+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:08.616267+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:09.616445+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:10.616586+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:11.616749+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:12.616937+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:13.617130+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:14.617348+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:15.617503+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:16.617692+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:17.617843+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:18.617979+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:19.618136+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:20.618268+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100204544 unmapped: 15712256 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:21.618477+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100212736 unmapped: 15704064 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:22.618616+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:23.618823+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:24.618997+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:25.619177+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:26.619359+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:27.619550+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:28.619727+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:29.619891+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:30.620051+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:31.620259+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:32.620445+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:33.620618+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:34.620764+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:35.620922+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:36.621059+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:37.621238+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:38.621425+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c6000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:39.621608+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:40.621784+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:41.622022+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1157642 data_alloc: 285212672 data_used: 13017088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:42.622220+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:43.622490+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99975168 unmapped: 15941632 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:44.622653+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac148b400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 80.519432068s of 80.790412903s, submitted: 44
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aac148b400 session 0x55aabfc6c5a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcb400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c7000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99983360 unmapped: 15933440 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:45.622846+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabfbcb400 session 0x55aabfc845a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7d000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97cc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 99983360 unmapped: 15933440 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:46.622989+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabdd7d000 session 0x55aabfc6c000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabe97cc00 session 0x55aabdb66780
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125a000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1243818 data_alloc: 285212672 data_used: 13025280
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b74c7000/0x0/0x1bfc00000, data 0x453af78/0x45c7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac148b400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aac125a000 session 0x55aabfc6cb40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aac148b400 session 0x55aac1de7e00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7c400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabe97f000 session 0x55aabfc6d680
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7d000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabdd7c400 session 0x55aac1de6780
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97cc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabdd7d000 session 0x55aabfc6c5a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125a000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aac125a000 session 0x55aabfb98b40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabe97cc00 session 0x55aac1de70e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0585000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aac0585000 session 0x55aabfc24d20
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100605952 unmapped: 15310848 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:47.623130+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7c400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabdd7c400 session 0x55aabfc24000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100622336 unmapped: 15294464 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7d000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabdd7d000 session 0x55aabfc8d680
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:48.623310+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97cc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabe97cc00 session 0x55aabfbf70e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0585000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aac0585000 session 0x55aabfab72c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100483072 unmapped: 15433728 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125a000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeafe000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:49.623480+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97c800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabe97c800 session 0x55aabfc6a780
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100491264 unmapped: 15425536 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97c800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7c400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:50.623635+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b6b6f000/0x0/0x1bfc00000, data 0x4e8ffbb/0x4f1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100515840 unmapped: 15400960 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:51.623845+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1248824 data_alloc: 285212672 data_used: 13029376
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 100655104 unmapped: 15261696 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:52.624046+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 101949440 unmapped: 13967360 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:53.624263+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 101949440 unmapped: 13967360 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b6b6f000/0x0/0x1bfc00000, data 0x4e8ffbb/0x4f1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:54.624480+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 102031360 unmapped: 13885440 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:55.624632+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b6b6f000/0x0/0x1bfc00000, data 0x4e8ffbb/0x4f1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 102031360 unmapped: 13885440 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:56.624774+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b6b6f000/0x0/0x1bfc00000, data 0x4e8ffbb/0x4f1f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1260184 data_alloc: 285212672 data_used: 14643200
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 102039552 unmapped: 13877248 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:57.624941+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 102039552 unmapped: 13877248 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:58.625223+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.974133492s of 14.258081436s, submitted: 51
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 102080512 unmapped: 13836288 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:51:59.625512+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 101515264 unmapped: 14401536 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:00.625774+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 106422272 unmapped: 9494528 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:01.625917+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1432976 data_alloc: 285212672 data_used: 15060992
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 110223360 unmapped: 5693440 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:02.626011+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0577000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfce3800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 heartbeat osd_stat(store_statfs(0x1b570c000/0x0/0x1bfc00000, data 0x62f0fbb/0x6380000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabe97c800 session 0x55aabfc6cf00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 ms_handle_reset con 0x55aabdd7c400 session 0x55aabf6834a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97cc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 109584384 unmapped: 6332416 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:03.626192+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 handle_osd_map epochs [94,94], i have 93, src has [1,94]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 93 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 94 handle_osd_map epochs [94,94], i have 94, src has [1,94]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 94 ms_handle_reset con 0x55aabe97cc00 session 0x55aabfc2b860
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 94 ms_handle_reset con 0x55aac125a000 session 0x55aabf6832c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 94 ms_handle_reset con 0x55aabeafe000 session 0x55aabeaf4f00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 109731840 unmapped: 6184960 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:04.626468+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7c400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97c800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 94 ms_handle_reset con 0x55aabe97c800 session 0x55aabf909860
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 109740032 unmapped: 6176768 heap: 115916800 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97cc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 94 heartbeat osd_stat(store_statfs(0x1b56cf000/0x0/0x1bfc00000, data 0x632c6e3/0x63bd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:05.626593+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 94 handle_osd_map epochs [94,95], i have 94, src has [1,95]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 95 handle_osd_map epochs [95,95], i have 95, src has [1,95]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 95 ms_handle_reset con 0x55aabe97cc00 session 0x55aabfc33860
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125a000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcac00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 95 ms_handle_reset con 0x55aac125a000 session 0x55aabf9094a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 95 ms_handle_reset con 0x55aabfbcac00 session 0x55aabdbb12c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb48800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 95 ms_handle_reset con 0x55aabdd7c400 session 0x55aabfc2b0e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 126115840 unmapped: 1638400 heap: 127754240 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:06.626769+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 95 ms_handle_reset con 0x55aabfb48800 session 0x55aabe65a000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97c800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1727614 data_alloc: 285212672 data_used: 15454208
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113664000 unmapped: 22126592 heap: 135790592 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:07.626939+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 95 ms_handle_reset con 0x55aabe97c800 session 0x55aabf9e4b40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 95 handle_osd_map epochs [96,96], i have 95, src has [1,96]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113696768 unmapped: 22093824 heap: 135790592 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:08.627272+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 96 handle_osd_map epochs [96,97], i have 96, src has [1,97]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 97 ms_handle_reset con 0x55aabf993400 session 0x55aabfc254a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 97 heartbeat osd_stat(store_statfs(0x1b35f2000/0x0/0x1bfc00000, data 0x8405d9d/0x849b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb49c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 97 ms_handle_reset con 0x55aabfb49c00 session 0x55aabfc25e00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1ccd800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 97 ms_handle_reset con 0x55aac1ccd800 session 0x55aabdbb1e00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97c800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113737728 unmapped: 22052864 heap: 135790592 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.873857498s of 10.199217796s, submitted: 313
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 97 ms_handle_reset con 0x55aabe97c800 session 0x55aabdbb10e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:09.627459+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 97 ms_handle_reset con 0x55aac0577000 session 0x55aabeaf4780
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 97 ms_handle_reset con 0x55aabfce3800 session 0x55aabfb9f860
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb48800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113491968 unmapped: 22298624 heap: 135790592 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:10.627591+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 97 ms_handle_reset con 0x55aabf993400 session 0x55aabfab7a40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 97 ms_handle_reset con 0x55aabfb48800 session 0x55aabfc6b0e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 97 handle_osd_map epochs [97,98], i have 97, src has [1,98]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 109682688 unmapped: 26107904 heap: 135790592 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:11.627764+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1470581 data_alloc: 285212672 data_used: 13066240
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 109682688 unmapped: 26107904 heap: 135790592 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:12.627896+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 109682688 unmapped: 26107904 heap: 135790592 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:13.628093+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 98 heartbeat osd_stat(store_statfs(0x1b53d9000/0x0/0x1bfc00000, data 0x661cf32/0x66b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125a000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 98 ms_handle_reset con 0x55aac125a000 session 0x55aabfc6cb40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcac00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabffd1800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 98 ms_handle_reset con 0x55aabfbcac00 session 0x55aabfc6d860
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 98 ms_handle_reset con 0x55aabffd1800 session 0x55aabfc2ab40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1260c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 98 ms_handle_reset con 0x55aac1260c00 session 0x55aabfc6d2c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7d000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 98 ms_handle_reset con 0x55aabdd7d000 session 0x55aabf909860
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 110223360 unmapped: 25567232 heap: 135790592 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:14.628277+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcac00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 98 ms_handle_reset con 0x55aabfbcac00 session 0x55aabf9094a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabffd1800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 98 ms_handle_reset con 0x55aabffd1800 session 0x55aabe65c780
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125a000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 98 ms_handle_reset con 0x55aac125a000 session 0x55aabe65d0e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 124985344 unmapped: 15007744 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:15.628507+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1260c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 98 ms_handle_reset con 0x55aac1260c00 session 0x55aabe65a780
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 98 handle_osd_map epochs [98,99], i have 98, src has [1,99]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 125026304 unmapped: 14966784 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:16.628649+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0585800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 99 ms_handle_reset con 0x55aac0585800 session 0x55aac1de63c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcac00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 99 ms_handle_reset con 0x55aabfbcac00 session 0x55aac1de6960
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabffd1800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1668488 data_alloc: 301989888 data_used: 19161088
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 99 ms_handle_reset con 0x55aabffd1800 session 0x55aac1de7e00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97c800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116727808 unmapped: 23265280 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:17.628860+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 99 heartbeat osd_stat(store_statfs(0x1b40b1000/0x0/0x1bfc00000, data 0x79427cb/0x79dd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 99 handle_osd_map epochs [99,100], i have 99, src has [1,100]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 handle_osd_map epochs [100,100], i have 100, src has [1,100]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 ms_handle_reset con 0x55aabf993000 session 0x55aabfc332c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 111476736 unmapped: 28516352 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:18.629038+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113713152 unmapped: 26279936 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:19.629229+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113713152 unmapped: 26279936 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:20.629413+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf992c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.719244957s of 11.567488670s, submitted: 206
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 ms_handle_reset con 0x55aabf992c00 session 0x55aabe5a65a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabffd0800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 heartbeat osd_stat(store_statfs(0x1b5e0d000/0x0/0x1bfc00000, data 0x5870069/0x5909000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113762304 unmapped: 26230784 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:21.629553+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 ms_handle_reset con 0x55aabffd0800 session 0x55aabdbcf860
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1425527 data_alloc: 285212672 data_used: 17313792
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf992c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 118194176 unmapped: 21798912 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:22.629697+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 ms_handle_reset con 0x55aabf992c00 session 0x55aabdbcfc20
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcb000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 ms_handle_reset con 0x55aabfbcb000 session 0x55aabfc252c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeb01000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 ms_handle_reset con 0x55aabeb01000 session 0x55aabfb9c780
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0577800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 ms_handle_reset con 0x55aac0577800 session 0x55aabfb9c1e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7dc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 ms_handle_reset con 0x55aabdd7dc00 session 0x55aabfabf4a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:23.629883+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114040832 unmapped: 25952256 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:24.630037+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114073600 unmapped: 25919488 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:25.630234+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114073600 unmapped: 25919488 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7dc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 ms_handle_reset con 0x55aabdd7dc00 session 0x55aabf6823c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeb01000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf992c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 100 handle_osd_map epochs [100,101], i have 100, src has [1,101]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:26.630394+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114180096 unmapped: 25812992 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 101 heartbeat osd_stat(store_statfs(0x1b5c75000/0x0/0x1bfc00000, data 0x5d7b8f2/0x5e18000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1477258 data_alloc: 285212672 data_used: 17326080
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:27.630530+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114860032 unmapped: 25133056 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:28.630687+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114917376 unmapped: 25075712 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:29.630839+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114917376 unmapped: 25075712 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:30.630958+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115130368 unmapped: 24862720 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:31.631096+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115130368 unmapped: 24862720 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 101 heartbeat osd_stat(store_statfs(0x1b5c6f000/0x0/0x1bfc00000, data 0x5d7f8f2/0x5e1c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.574740410s of 10.970377922s, submitted: 97
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 101 ms_handle_reset con 0x55aabf993400 session 0x55aabe65b4a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 101 ms_handle_reset con 0x55aabe97c800 session 0x55aabfc33860
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1482986 data_alloc: 301989888 data_used: 17891328
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:32.631239+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115163136 unmapped: 24829952 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb49000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 101 ms_handle_reset con 0x55aabfb49000 session 0x55aabf6834a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 101 handle_osd_map epochs [102,102], i have 101, src has [1,102]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 102 ms_handle_reset con 0x55aabddd4000 session 0x55aabfc6cf00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:33.631509+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7dc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115179520 unmapped: 24813568 heap: 139993088 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 102 handle_osd_map epochs [102,102], i have 102, src has [1,102]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 102 ms_handle_reset con 0x55aabdd7dc00 session 0x55aabfb98d20
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 102 ms_handle_reset con 0x55aabddd4000 session 0x55aabfc2b860
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97c800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb49000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 102 ms_handle_reset con 0x55aabeb01000 session 0x55aabfab7a40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 102 ms_handle_reset con 0x55aabf992c00 session 0x55aabf909a40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 102 ms_handle_reset con 0x55aabe97c800 session 0x55aabfa4e780
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7dc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:34.631660+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 120897536 unmapped: 25788416 heap: 146685952 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 102 handle_osd_map epochs [103,103], i have 102, src has [1,103]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 102 handle_osd_map epochs [103,103], i have 103, src has [1,103]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 103 ms_handle_reset con 0x55aabdd7dc00 session 0x55aac1de61e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:35.631785+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 103 handle_osd_map epochs [103,103], i have 103, src has [1,103]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 120922112 unmapped: 25763840 heap: 146685952 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 103 handle_osd_map epochs [103,104], i have 103, src has [1,104]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 104 handle_osd_map epochs [104,104], i have 104, src has [1,104]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 104 ms_handle_reset con 0x55aabddd4000 session 0x55aabfc845a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:36.631966+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 104 heartbeat osd_stat(store_statfs(0x1b3e56000/0x0/0x1bfc00000, data 0x7b9352e/0x7c36000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 120561664 unmapped: 26124288 heap: 146685952 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeb01000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 104 ms_handle_reset con 0x55aabeb01000 session 0x55aabfa4e5a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1784704 data_alloc: 301989888 data_used: 19013632
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf992c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 104 ms_handle_reset con 0x55aabf992c00 session 0x55aabe65a780
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe491400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 104 ms_handle_reset con 0x55aabe491400 session 0x55aabfab05a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:37.632157+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123535360 unmapped: 23150592 heap: 146685952 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7dc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 104 ms_handle_reset con 0x55aabdd7dc00 session 0x55aabe65d0e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:38.632380+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116916224 unmapped: 29769728 heap: 146685952 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:39.632540+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116678656 unmapped: 30007296 heap: 146685952 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:40.632665+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 104 ms_handle_reset con 0x55aabf993400 session 0x55aabfab6b40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 104 ms_handle_reset con 0x55aabfb49000 session 0x55aabfc301e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116703232 unmapped: 29982720 heap: 146685952 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 104 heartbeat osd_stat(store_statfs(0x1b4866000/0x0/0x1bfc00000, data 0x6e75499/0x6f15000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [1])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 104 handle_osd_map epochs [104,105], i have 104, src has [1,105]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1f] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.f] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.18] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabddd4000 session 0x55aabfc6ba40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.e] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:41.632809+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115744768 unmapped: 30941184 heap: 146685952 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.1b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.5] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.a] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[2.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeb01000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabeb01000 session 0x55aabfb965a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7dc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.807853699s of 10.008989334s, submitted: 276
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1534365 data_alloc: 285212672 data_used: 13135872
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabdd7dc00 session 0x55aabfb974a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabddd4000 session 0x55aabfa4f4a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabf993400 session 0x55aabfb9e3c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb49000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabfb49000 session 0x55aabfb9e000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf992c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabf992c00 session 0x55aabfb98000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabf993000 session 0x55aabf9e52c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdec7800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:42.632942+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 120733696 unmapped: 25952256 heap: 146685952 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 handle_osd_map epochs [105,105], i have 105, src has [1,105]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabdec7800 session 0x55aabfc2bc20
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:43.633132+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 129097728 unmapped: 25763840 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcb000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabfbcb000 session 0x55aabf6830e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aac0590400 session 0x55aabf6823c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:44.633339+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb48000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabfb48000 session 0x55aabf6834a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 129097728 unmapped: 25763840 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdec7800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 ms_handle_reset con 0x55aabdec7800 session 0x55aabfabe3c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcb000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 heartbeat osd_stat(store_statfs(0x1b3b01000/0x0/0x1bfc00000, data 0x7eedc18/0x7f8d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x416f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:45.633524+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 129130496 unmapped: 25731072 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 105 handle_osd_map epochs [106,106], i have 105, src has [1,106]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:46.633635+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 106 handle_osd_map epochs [106,106], i have 106, src has [1,106]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 129179648 unmapped: 25681920 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 106 ms_handle_reset con 0x55aac0590400 session 0x55aabfc6cf00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1521133 data_alloc: 285212672 data_used: 11378688
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 106 heartbeat osd_stat(store_statfs(0x1b36fb000/0x0/0x1bfc00000, data 0x7ef059a/0x7f92000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:47.633737+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 117039104 unmapped: 37822464 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:48.633854+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 106 ms_handle_reset con 0x55aabf993000 session 0x55aabfb9c780
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 117039104 unmapped: 37822464 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 106 ms_handle_reset con 0x55aabfbcb000 session 0x55aac1de6780
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:49.634002+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 106 ms_handle_reset con 0x55aac0590c00 session 0x55aabfc254a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:50.634204+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 106 heartbeat osd_stat(store_statfs(0x1b7092000/0x0/0x1bfc00000, data 0x455c518/0x45fb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 106 handle_osd_map epochs [107,107], i have 106, src has [1,107]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:51.634396+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1325322 data_alloc: 285212672 data_used: 11390976
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:52.634588+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:53.634801+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:54.635003+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b708e000/0x0/0x1bfc00000, data 0x455ed1c/0x45ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:55.635359+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b708e000/0x0/0x1bfc00000, data 0x455ed1c/0x45ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:56.635584+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1325322 data_alloc: 285212672 data_used: 11390976
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:57.635811+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:58.635967+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:52:59.636141+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 107 heartbeat osd_stat(store_statfs(0x1b708e000/0x0/0x1bfc00000, data 0x455ed1c/0x45ff000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:00.636359+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116350976 unmapped: 38510592 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7cc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.254005432s of 19.014335632s, submitted: 186
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 107 handle_osd_map epochs [108,108], i have 107, src has [1,108]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 107 handle_osd_map epochs [108,108], i have 108, src has [1,108]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 108 ms_handle_reset con 0x55aabdd7cc00 session 0x55aabf9092c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:01.636512+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116359168 unmapped: 38502400 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1332337 data_alloc: 285212672 data_used: 11403264
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:02.636662+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116367360 unmapped: 38494208 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 108 handle_osd_map epochs [108,109], i have 108, src has [1,109]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:03.636868+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116375552 unmapped: 38486016 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 109 handle_osd_map epochs [109,109], i have 109, src has [1,109]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 109 ms_handle_reset con 0x55aac0590400 session 0x55aabf908b40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:04.637058+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 109 heartbeat osd_stat(store_statfs(0x1b7085000/0x0/0x1bfc00000, data 0x4563fcf/0x4608000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116383744 unmapped: 38477824 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:05.637238+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116383744 unmapped: 38477824 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 109 heartbeat osd_stat(store_statfs(0x1b7085000/0x0/0x1bfc00000, data 0x4563fac/0x4607000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:06.637398+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116449280 unmapped: 38412288 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1333993 data_alloc: 285212672 data_used: 11415552
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:07.637572+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116449280 unmapped: 38412288 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:08.637766+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116449280 unmapped: 38412288 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:09.637924+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116449280 unmapped: 38412288 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:10.638119+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116449280 unmapped: 38412288 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 109 handle_osd_map epochs [110,110], i have 109, src has [1,110]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.264369011s of 10.443493843s, submitted: 47
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:11.638319+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 110 heartbeat osd_stat(store_statfs(0x1b7085000/0x0/0x1bfc00000, data 0x4563fac/0x4607000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116457472 unmapped: 38404096 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 110 heartbeat osd_stat(store_statfs(0x1b7082000/0x0/0x1bfc00000, data 0x45667b0/0x460b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1335894 data_alloc: 285212672 data_used: 11415552
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:12.638512+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116465664 unmapped: 38395904 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 110 heartbeat osd_stat(store_statfs(0x1b7082000/0x0/0x1bfc00000, data 0x45667b0/0x460b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:13.638719+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 110 heartbeat osd_stat(store_statfs(0x1b7082000/0x0/0x1bfc00000, data 0x45667b0/0x460b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116465664 unmapped: 38395904 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:14.638898+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116465664 unmapped: 38395904 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:15.639096+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116465664 unmapped: 38395904 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:16.639245+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116465664 unmapped: 38395904 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1335894 data_alloc: 285212672 data_used: 11415552
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:17.639424+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116465664 unmapped: 38395904 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 110 heartbeat osd_stat(store_statfs(0x1b7082000/0x0/0x1bfc00000, data 0x45667b0/0x460b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:18.639614+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116465664 unmapped: 38395904 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 110 heartbeat osd_stat(store_statfs(0x1b7082000/0x0/0x1bfc00000, data 0x45667b0/0x460b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:19.639783+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116465664 unmapped: 38395904 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:20.639926+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116465664 unmapped: 38395904 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:21.640107+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116465664 unmapped: 38395904 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1335894 data_alloc: 285212672 data_used: 11415552
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfce3c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.817627907s of 10.838802338s, submitted: 22
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:22.640256+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116531200 unmapped: 38330368 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 110 heartbeat osd_stat(store_statfs(0x1b7081000/0x0/0x1bfc00000, data 0x45667e3/0x460d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 110 handle_osd_map epochs [111,111], i have 110, src has [1,111]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 111 handle_osd_map epochs [111,111], i have 111, src has [1,111]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 111 ms_handle_reset con 0x55aabfce3c00 session 0x55aabfab6000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:23.640463+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116547584 unmapped: 38313984 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:24.640638+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116547584 unmapped: 38313984 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:25.640835+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116547584 unmapped: 38313984 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:26.641043+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116547584 unmapped: 38313984 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1345964 data_alloc: 285212672 data_used: 11427840
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:27.641245+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0584000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116563968 unmapped: 38297600 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 111 handle_osd_map epochs [111,112], i have 111, src has [1,112]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 112 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 112 handle_osd_map epochs [112,112], i have 112, src has [1,112]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 112 ms_handle_reset con 0x55aac0584000 session 0x55aabfc6b0e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 112 heartbeat osd_stat(store_statfs(0x1b707d000/0x0/0x1bfc00000, data 0x4569101/0x4611000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:28.641476+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116645888 unmapped: 38215680 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:29.641659+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116645888 unmapped: 38215680 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:30.641832+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116645888 unmapped: 38215680 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:31.642002+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116645888 unmapped: 38215680 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1346395 data_alloc: 285212672 data_used: 11440128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 112 heartbeat osd_stat(store_statfs(0x1b7079000/0x0/0x1bfc00000, data 0x456ba40/0x4613000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:32.642197+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116645888 unmapped: 38215680 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:33.642444+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116645888 unmapped: 38215680 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:34.642634+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116645888 unmapped: 38215680 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:35.642820+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 116645888 unmapped: 38215680 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 112 handle_osd_map epochs [112,113], i have 112, src has [1,113]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.740426064s of 14.074661255s, submitted: 117
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:36.643024+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115875840 unmapped: 38985728 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1349205 data_alloc: 285212672 data_used: 11440128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:37.643347+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 113 heartbeat osd_stat(store_statfs(0x1b7076000/0x0/0x1bfc00000, data 0x456e244/0x4617000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115875840 unmapped: 38985728 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:38.643560+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115875840 unmapped: 38985728 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:39.643831+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115875840 unmapped: 38985728 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:40.644072+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 113 heartbeat osd_stat(store_statfs(0x1b7076000/0x0/0x1bfc00000, data 0x456e244/0x4617000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115875840 unmapped: 38985728 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:41.644346+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115875840 unmapped: 38985728 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1349205 data_alloc: 285212672 data_used: 11440128
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:42.644546+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115875840 unmapped: 38985728 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 113 heartbeat osd_stat(store_statfs(0x1b7076000/0x0/0x1bfc00000, data 0x456e244/0x4617000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:43.644733+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115875840 unmapped: 38985728 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 113 heartbeat osd_stat(store_statfs(0x1b7076000/0x0/0x1bfc00000, data 0x456e244/0x4617000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1261800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:44.644886+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115875840 unmapped: 38985728 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 113 handle_osd_map epochs [113,114], i have 113, src has [1,114]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 114 handle_osd_map epochs [114,114], i have 114, src has [1,114]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 114 ms_handle_reset con 0x55aac1261800 session 0x55aabfb9c1e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:45.645053+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7cc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 114 ms_handle_reset con 0x55aabdd7cc00 session 0x55aabfc33680
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfce3c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 114 ms_handle_reset con 0x55aabfce3c00 session 0x55aabfc332c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115916800 unmapped: 38944768 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0584000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.917328835s of 10.077222824s, submitted: 46
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:46.645181+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115916800 unmapped: 38944768 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1359519 data_alloc: 285212672 data_used: 11452416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 114 handle_osd_map epochs [114,115], i have 114, src has [1,115]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 115 handle_osd_map epochs [115,115], i have 115, src has [1,115]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 115 ms_handle_reset con 0x55aac0584000 session 0x55aabf6825a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:47.645336+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0243000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 115 ms_handle_reset con 0x55aac0243000 session 0x55aabfab1e00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125a000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115924992 unmapped: 38936576 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 115 ms_handle_reset con 0x55aac125a000 session 0x55aabfab0b40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 115 heartbeat osd_stat(store_statfs(0x1b706a000/0x0/0x1bfc00000, data 0x45738d6/0x4623000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabdd7cc00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 115 handle_osd_map epochs [115,116], i have 115, src has [1,116]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:48.645539+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 116 handle_osd_map epochs [116,116], i have 116, src has [1,116]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 116 ms_handle_reset con 0x55aabdd7cc00 session 0x55aabdbcfa40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115941376 unmapped: 38920192 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfce3c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 116 handle_osd_map epochs [116,117], i have 116, src has [1,117]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:49.645747+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 ms_handle_reset con 0x55aabfce3c00 session 0x55aabfe85e00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115965952 unmapped: 38895616 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:50.645997+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115965952 unmapped: 38895616 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:51.646174+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0243000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 handle_osd_map epochs [117,117], i have 117, src has [1,117]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 115884032 unmapped: 38977536 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 ms_handle_reset con 0x55aac0243000 session 0x55aabfe850e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0584000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 ms_handle_reset con 0x55aac0584000 session 0x55aabfc6cd20
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1260400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 ms_handle_reset con 0x55aac1260400 session 0x55aabe5a6d20
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1ccd400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1462488 data_alloc: 285212672 data_used: 11452416
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 ms_handle_reset con 0x55aac1ccd400 session 0x55aabe5a61e0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabffd1800
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 ms_handle_reset con 0x55aabffd1800 session 0x55aabe5a7e00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:52.646361+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114573312 unmapped: 40288256 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:53.646768+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114573312 unmapped: 40288256 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 heartbeat osd_stat(store_statfs(0x1b6518000/0x0/0x1bfc00000, data 0x50c67c6/0x5176000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0584400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 heartbeat osd_stat(store_statfs(0x1b6518000/0x0/0x1bfc00000, data 0x50c67c6/0x5176000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:54.646942+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113549312 unmapped: 41312256 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 117 handle_osd_map epochs [118,118], i have 117, src has [1,118]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 118 ms_handle_reset con 0x55aac0584400 session 0x55aabe5a74a0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:55.647124+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf992c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 118 handle_osd_map epochs [118,118], i have 118, src has [1,118]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 118 ms_handle_reset con 0x55aabf992c00 session 0x55aabfc22d20
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113688576 unmapped: 41172992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0585400
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0585c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 118 handle_osd_map epochs [118,119], i have 118, src has [1,119]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.255023956s of 10.042826653s, submitted: 172
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:56.647333+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113778688 unmapped: 41082880 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1478274 data_alloc: 285212672 data_used: 11476992
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125b000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 119 handle_osd_map epochs [119,119], i have 119, src has [1,119]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:57.647545+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113803264 unmapped: 41058304 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:58.647699+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113065984 unmapped: 41795584 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:53:59.647859+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 119 heartbeat osd_stat(store_statfs(0x1b4ce3000/0x0/0x1bfc00000, data 0x68f5937/0x69aa000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,1])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 112484352 unmapped: 42377216 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:00.647993+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 111828992 unmapped: 43032576 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:01.648150+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 112992256 unmapped: 41869312 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2101163 data_alloc: 285212672 data_used: 14163968
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 119 handle_osd_map epochs [119,120], i have 119, src has [1,120]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 120 handle_osd_map epochs [120,120], i have 120, src has [1,120]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:02.648343+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 120 ms_handle_reset con 0x55aac125b000 session 0x55aabf909680
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113008640 unmapped: 41852928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:03.648490+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125b000
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113008640 unmapped: 41852928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 120 handle_osd_map epochs [120,121], i have 120, src has [1,121]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:04.648616+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 121 handle_osd_map epochs [121,121], i have 121, src has [1,121]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 121 handle_osd_map epochs [120,121], i have 121, src has [1,121]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 121 ms_handle_reset con 0x55aac125b000 session 0x55aabf908b40
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 113188864 unmapped: 41672704 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf992c00
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 121 handle_osd_map epochs [122,122], i have 121, src has [1,122]
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:05.648724+0000)
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 122 heartbeat osd_stat(store_statfs(0x1b0cd9000/0x0/0x1bfc00000, data 0x98fabc7/0x99b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,1])
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: osd.2 122 ms_handle_reset con 0x55aabf992c00 session 0x55aabfb9e3c0
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114319360 unmapped: 40542208 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:08:59 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:06.648877+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114319360 unmapped: 40542208 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1520231 data_alloc: 285212672 data_used: 14188544
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:07.650970+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114319360 unmapped: 40542208 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:08.651414+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114335744 unmapped: 40525824 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:09.652606+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 114335744 unmapped: 40525824 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:10.652758+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 13.507831573s of 14.247180939s, submitted: 142
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 117530624 unmapped: 37330944 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 122 heartbeat osd_stat(store_statfs(0x1b63ac000/0x0/0x1bfc00000, data 0x522a516/0x52e2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [0,0,1,1,2])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:11.652935+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 122 handle_osd_map epochs [123,123], i have 122, src has [1,123]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121511936 unmapped: 33349632 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 handle_osd_map epochs [123,123], i have 123, src has [1,123]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1635941 data_alloc: 285212672 data_used: 14213120
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:12.653255+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123723776 unmapped: 31137792 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:13.653864+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123723776 unmapped: 31137792 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:14.654080+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123723776 unmapped: 31137792 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:15.654323+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123756544 unmapped: 31105024 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b55ea000/0x0/0x1bfc00000, data 0x5fe1d1a/0x609b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:16.654506+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b55ea000/0x0/0x1bfc00000, data 0x5fe1d1a/0x609b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123920384 unmapped: 30941184 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1647773 data_alloc: 285212672 data_used: 14213120
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:17.655047+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123920384 unmapped: 30941184 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:18.655545+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 122929152 unmapped: 31932416 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:19.656346+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 122929152 unmapped: 31932416 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:20.657066+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b55ce000/0x0/0x1bfc00000, data 0x6006d1a/0x60c0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 122929152 unmapped: 31932416 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:21.657725+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.411849976s of 11.019076347s, submitted: 170
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 ms_handle_reset con 0x55aac0585400 session 0x55aabfc23680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 ms_handle_reset con 0x55aac0585c00 session 0x55aabfc31c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 122929152 unmapped: 31932416 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125b400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1423187 data_alloc: 285212672 data_used: 11513856
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:22.657875+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 ms_handle_reset con 0x55aac125b400 session 0x55aabfab1680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121413632 unmapped: 33447936 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:23.658080+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121413632 unmapped: 33447936 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b704e000/0x0/0x1bfc00000, data 0x4587ca8/0x463f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:24.658222+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:25.658463+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:26.658707+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b704e000/0x0/0x1bfc00000, data 0x4587ca8/0x463f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419436 data_alloc: 285212672 data_used: 11513856
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:27.658928+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:28.659124+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b704e000/0x0/0x1bfc00000, data 0x4587ca8/0x463f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:29.659328+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:30.659490+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:31.659687+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:32.659856+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419436 data_alloc: 285212672 data_used: 11513856
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:33.660020+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b704e000/0x0/0x1bfc00000, data 0x4587ca8/0x463f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:34.660206+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:35.660406+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:36.660559+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:37.660673+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419436 data_alloc: 285212672 data_used: 11513856
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:38.660819+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b704e000/0x0/0x1bfc00000, data 0x4587ca8/0x463f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:39.660987+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:40.661145+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b704e000/0x0/0x1bfc00000, data 0x4587ca8/0x463f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:41.661320+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:42.661460+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419436 data_alloc: 285212672 data_used: 11513856
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:43.661682+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:44.661823+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b704e000/0x0/0x1bfc00000, data 0x4587ca8/0x463f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:45.661983+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:46.662127+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:47.662454+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419436 data_alloc: 285212672 data_used: 11513856
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:48.662640+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:49.662835+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b704e000/0x0/0x1bfc00000, data 0x4587ca8/0x463f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:50.663019+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 heartbeat osd_stat(store_statfs(0x1b704e000/0x0/0x1bfc00000, data 0x4587ca8/0x463f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:51.663171+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:52.663371+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1419436 data_alloc: 285212672 data_used: 11513856
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:53.663614+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121094144 unmapped: 33767424 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:54.663845+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 123 handle_osd_map epochs [123,124], i have 123, src has [1,124]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 33.232196808s of 33.450531006s, submitted: 59
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 124 handle_osd_map epochs [124,124], i have 124, src has [1,124]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121102336 unmapped: 33759232 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:55.663990+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121102336 unmapped: 33759232 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 124 handle_osd_map epochs [125,125], i have 124, src has [1,125]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 125 handle_osd_map epochs [125,125], i have 125, src has [1,125]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 125 heartbeat osd_stat(store_statfs(0x1b7049000/0x0/0x1bfc00000, data 0x458a5ce/0x4644000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:56.664176+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121118720 unmapped: 33742848 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:57.664335+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1429682 data_alloc: 285212672 data_used: 11526144
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121118720 unmapped: 33742848 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 125 handle_osd_map epochs [125,126], i have 125, src has [1,126]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 126 handle_osd_map epochs [126,126], i have 126, src has [1,126]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1ccc400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 126 ms_handle_reset con 0x55aac1ccc400 session 0x55aabd706b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:58.664474+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121143296 unmapped: 33718272 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 126 heartbeat osd_stat(store_statfs(0x1b7040000/0x0/0x1bfc00000, data 0x458f8e7/0x464d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 126 ms_handle_reset con 0x55aabddd4000 session 0x55aac1de65a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:54:59.664616+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121143296 unmapped: 33718272 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 126 handle_osd_map epochs [127,127], i have 126, src has [1,127]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 127 ms_handle_reset con 0x55aac0590400 session 0x55aabfc8c3c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:00.664787+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121167872 unmapped: 33693696 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:01.664953+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 127 handle_osd_map epochs [128,128], i have 127, src has [1,128]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121176064 unmapped: 33685504 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb48400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 128 ms_handle_reset con 0x55aabfb48400 session 0x55aabe65a3c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:02.665118+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1439549 data_alloc: 285212672 data_used: 11530240
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121184256 unmapped: 33677312 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:03.665395+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 128 heartbeat osd_stat(store_statfs(0x1b7039000/0x0/0x1bfc00000, data 0x4594a27/0x4654000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121192448 unmapped: 33669120 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:04.665540+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121192448 unmapped: 33669120 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:05.665694+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121192448 unmapped: 33669120 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:06.665845+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121192448 unmapped: 33669120 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 128 handle_osd_map epochs [129,129], i have 128, src has [1,129]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.861256599s of 12.209271431s, submitted: 96
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 129 heartbeat osd_stat(store_statfs(0x1b7039000/0x0/0x1bfc00000, data 0x4594a27/0x4654000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:07.665995+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1442551 data_alloc: 285212672 data_used: 11530240
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:08.666154+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:09.666339+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:10.666537+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 129 heartbeat osd_stat(store_statfs(0x1b7035000/0x0/0x1bfc00000, data 0x459722b/0x4658000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:11.666690+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:12.667121+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1442551 data_alloc: 285212672 data_used: 11530240
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:13.667462+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:14.667688+0000)
Feb 01 10:09:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:00.009 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 129 heartbeat osd_stat(store_statfs(0x1b7035000/0x0/0x1bfc00000, data 0x459722b/0x4658000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:15.667843+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets getting new tickets!
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:16.668254+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _finish_auth 0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:16.669624+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 129 heartbeat osd_stat(store_statfs(0x1b7035000/0x0/0x1bfc00000, data 0x459722b/0x4658000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:17.668365+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1442551 data_alloc: 285212672 data_used: 11530240
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:18.668882+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:19.669044+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121208832 unmapped: 33652736 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:20.669425+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121208832 unmapped: 33652736 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 129 heartbeat osd_stat(store_statfs(0x1b7035000/0x0/0x1bfc00000, data 0x459722b/0x4658000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:21.669741+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:22.669947+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1442551 data_alloc: 285212672 data_used: 11530240
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:23.670191+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:24.670353+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125b000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 18.020088196s of 18.046535492s, submitted: 16
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 129 ms_handle_reset con 0x55aac125b000 session 0x55aabfc2a000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:25.670559+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 129 heartbeat osd_stat(store_statfs(0x1b7035000/0x0/0x1bfc00000, data 0x459722b/0x4658000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:26.670727+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:27.670855+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1444973 data_alloc: 285212672 data_used: 11530240
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:28.671024+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121200640 unmapped: 33660928 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:29.671250+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 129 heartbeat osd_stat(store_statfs(0x1b7035000/0x0/0x1bfc00000, data 0x459723b/0x4659000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121208832 unmapped: 33652736 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:30.671358+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 129 heartbeat osd_stat(store_statfs(0x1b7035000/0x0/0x1bfc00000, data 0x459723b/0x4659000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121208832 unmapped: 33652736 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:31.671609+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 129 handle_osd_map epochs [130,131], i have 129, src has [1,131]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfd15400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 131 ms_handle_reset con 0x55aabfd15400 session 0x55aabfbaef00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 131 handle_osd_map epochs [130,131], i have 131, src has [1,131]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121217024 unmapped: 33644544 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:32.671755+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1454229 data_alloc: 285212672 data_used: 11542528
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 131 handle_osd_map epochs [131,132], i have 131, src has [1,132]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 132 handle_osd_map epochs [132,132], i have 132, src has [1,132]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121225216 unmapped: 33636352 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:33.671985+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 132 ms_handle_reset con 0x55aabe97f800 session 0x55aabfc8d4a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 132 handle_osd_map epochs [132,133], i have 132, src has [1,133]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1260c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 133 ms_handle_reset con 0x55aac1260c00 session 0x55aac2ae85a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121241600 unmapped: 33619968 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 133 ms_handle_reset con 0x55aabddd4800 session 0x55aabfc334a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:34.672123+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121249792 unmapped: 33611776 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:35.672281+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97cc00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121249792 unmapped: 33611776 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 133 ms_handle_reset con 0x55aabe97cc00 session 0x55aabfc2bc20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:36.672457+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 133 heartbeat osd_stat(store_statfs(0x1b7024000/0x0/0x1bfc00000, data 0x45a174f/0x466a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.179263115s of 11.277359962s, submitted: 27
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 133 ms_handle_reset con 0x55aabddd4800 session 0x55aabfb9cd20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121266176 unmapped: 33595392 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:37.672596+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1462996 data_alloc: 285212672 data_used: 11546624
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbccc00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcdc00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121274368 unmapped: 33587200 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 133 ms_handle_reset con 0x55aabfbccc00 session 0x55aabfc301e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:38.672767+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcb000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 133 ms_handle_reset con 0x55aabfbcb000 session 0x55aabfa6e780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121282560 unmapped: 33579008 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 133 handle_osd_map epochs [133,134], i have 133, src has [1,134]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 134 handle_osd_map epochs [134,134], i have 134, src has [1,134]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:39.672918+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 134 ms_handle_reset con 0x55aabfbcdc00 session 0x55aac2ae8780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac14f7800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 134 ms_handle_reset con 0x55aac14f7800 session 0x55aabfc23860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121298944 unmapped: 33562624 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:40.673096+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 134 handle_osd_map epochs [134,135], i have 134, src has [1,135]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 135 handle_osd_map epochs [135,135], i have 135, src has [1,135]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121307136 unmapped: 33554432 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 135 ms_handle_reset con 0x55aabddd4800 session 0x55aabfb9e1e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:41.673254+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcb000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 135 ms_handle_reset con 0x55aabfbcb000 session 0x55aabfb9e5a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbccc00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 135 handle_osd_map epochs [134,135], i have 135, src has [1,135]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 135 ms_handle_reset con 0x55aabfbccc00 session 0x55aabdbb10e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121307136 unmapped: 33554432 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:42.673435+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 135 heartbeat osd_stat(store_statfs(0x1b701c000/0x0/0x1bfc00000, data 0x45a6a23/0x4671000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1260800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1469055 data_alloc: 285212672 data_used: 11571200
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121307136 unmapped: 33554432 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:43.673625+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 135 handle_osd_map epochs [136,136], i have 135, src has [1,136]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 136 ms_handle_reset con 0x55aac1260800 session 0x55aabf9083c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121315328 unmapped: 33546240 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:44.673818+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1ccc400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 136 heartbeat osd_stat(store_statfs(0x1b7018000/0x0/0x1bfc00000, data 0x45a93b1/0x4675000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121315328 unmapped: 33546240 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:45.674057+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 136 handle_osd_map epochs [136,137], i have 136, src has [1,137]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 137 handle_osd_map epochs [137,137], i have 137, src has [1,137]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 137 ms_handle_reset con 0x55aac1ccc400 session 0x55aabfab72c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 137 ms_handle_reset con 0x55aabddd4800 session 0x55aabfb9d0e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121331712 unmapped: 33529856 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:46.674258+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 137 handle_osd_map epochs [138,138], i have 137, src has [1,138]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.844301224s of 10.416486740s, submitted: 148
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:47.674508+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121364480 unmapped: 33497088 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 138 handle_osd_map epochs [138,138], i have 138, src has [1,138]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1479845 data_alloc: 285212672 data_used: 11567104
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:48.675632+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121389056 unmapped: 33472512 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:49.676638+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121389056 unmapped: 33472512 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:50.681473+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121389056 unmapped: 33472512 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 138 heartbeat osd_stat(store_statfs(0x1b7010000/0x0/0x1bfc00000, data 0x45ae5cd/0x467d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3028400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 138 ms_handle_reset con 0x55aac3028400 session 0x55aabfb99a40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 138 handle_osd_map epochs [139,139], i have 138, src has [1,139]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3028000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3026400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 139 handle_osd_map epochs [139,139], i have 139, src has [1,139]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 139 ms_handle_reset con 0x55aac3026400 session 0x55aac2ae92c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 139 handle_osd_map epochs [139,140], i have 139, src has [1,140]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:51.681885+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 140 ms_handle_reset con 0x55aac3028000 session 0x55aabfab6d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121430016 unmapped: 33431552 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 140 handle_osd_map epochs [140,140], i have 140, src has [1,140]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 140 heartbeat osd_stat(store_statfs(0x1b7006000/0x0/0x1bfc00000, data 0x45b38c2/0x4686000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:52.682122+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121438208 unmapped: 33423360 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1488613 data_alloc: 285212672 data_used: 11579392
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 140 handle_osd_map epochs [140,141], i have 140, src has [1,141]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 141 handle_osd_map epochs [141,141], i have 141, src has [1,141]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:53.682279+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121438208 unmapped: 33423360 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 141 ms_handle_reset con 0x55aabddd4000 session 0x55aabfc84d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:54.682484+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121438208 unmapped: 33423360 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:55.682808+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 141 handle_osd_map epochs [141,142], i have 141, src has [1,142]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 142 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 142 handle_osd_map epochs [142,142], i have 142, src has [1,142]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121454592 unmapped: 33406976 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b7004000/0x0/0x1bfc00000, data 0x45b6223/0x4689000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 142 ms_handle_reset con 0x55aabddd4800 session 0x55aabe65a780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:56.683130+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121479168 unmapped: 33382400 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:57.683569+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121479168 unmapped: 33382400 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1492213 data_alloc: 285212672 data_used: 11595776
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 142 heartbeat osd_stat(store_statfs(0x1b7002000/0x0/0x1bfc00000, data 0x45b8b4f/0x468c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.717591286s of 11.239648819s, submitted: 153
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3029400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:58.683711+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121503744 unmapped: 33357824 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 142 handle_osd_map epochs [143,143], i have 142, src has [1,143]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 143 ms_handle_reset con 0x55aac3029400 session 0x55aabfaa5860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:55:59.683904+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121536512 unmapped: 33325056 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3027800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:00.684102+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121552896 unmapped: 33308672 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 143 handle_osd_map epochs [143,144], i have 143, src has [1,144]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 144 handle_osd_map epochs [144,144], i have 144, src has [1,144]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 144 ms_handle_reset con 0x55aac3027800 session 0x55aabfb992c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 144 handle_osd_map epochs [144,145], i have 144, src has [1,145]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:01.684249+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 145 handle_osd_map epochs [144,145], i have 145, src has [1,145]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121569280 unmapped: 33292288 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb48c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 145 handle_osd_map epochs [144,145], i have 145, src has [1,145]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 145 heartbeat osd_stat(store_statfs(0x1b6ff6000/0x0/0x1bfc00000, data 0x45bc88b/0x4697000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 145 ms_handle_reset con 0x55aabfb48c00 session 0x55aabfc24b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:02.684485+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121569280 unmapped: 33292288 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1514838 data_alloc: 285212672 data_used: 11612160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:03.684792+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121569280 unmapped: 33292288 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:04.685017+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121569280 unmapped: 33292288 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 145 heartbeat osd_stat(store_statfs(0x1b6ff0000/0x0/0x1bfc00000, data 0x45c19ce/0x469e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0576c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 145 handle_osd_map epochs [146,146], i have 145, src has [1,146]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 146 ms_handle_reset con 0x55aac0576c00 session 0x55aac1de7a40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:05.685201+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121610240 unmapped: 33251328 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 146 heartbeat osd_stat(store_statfs(0x1b6fee000/0x0/0x1bfc00000, data 0x45c21a9/0x46a0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x456f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:06.685421+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121626624 unmapped: 33234944 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 146 handle_osd_map epochs [146,147], i have 146, src has [1,147]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 147 ms_handle_reset con 0x55aabddd4800 session 0x55aac1de7680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 147 handle_osd_map epochs [147,147], i have 147, src has [1,147]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:07.685596+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121634816 unmapped: 33226752 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1533296 data_alloc: 285212672 data_used: 11624448
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3028000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 147 handle_osd_map epochs [148,148], i have 147, src has [1,148]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.564898491s of 10.009436607s, submitted: 110
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 148 ms_handle_reset con 0x55aac3028000 session 0x55aabfc6b2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:08.685733+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121675776 unmapped: 33185792 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1260c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125a800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 148 ms_handle_reset con 0x55aac1260c00 session 0x55aac1de63c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:09.685885+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121765888 unmapped: 33095680 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 148 heartbeat osd_stat(store_statfs(0x1b6bdd000/0x0/0x1bfc00000, data 0x45c9dce/0x46b0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 148 handle_osd_map epochs [149,149], i have 148, src has [1,149]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 149 handle_osd_map epochs [149,149], i have 149, src has [1,149]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 149 ms_handle_reset con 0x55aac125a800 session 0x55aabfc24b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:10.686090+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 122830848 unmapped: 32030720 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:11.686359+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 122830848 unmapped: 32030720 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabffd1800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 149 handle_osd_map epochs [149,150], i have 149, src has [1,150]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:12.686507+0000)
Feb 01 10:09:00 np0005604215.localdomain podman[236852]: time="2026-02-01T10:09:00Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 122830848 unmapped: 32030720 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 150 handle_osd_map epochs [150,150], i have 150, src has [1,150]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1549167 data_alloc: 285212672 data_used: 11653120
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 150 handle_osd_map epochs [149,150], i have 150, src has [1,150]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 150 ms_handle_reset con 0x55aabffd1800 session 0x55aabfbafa40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:13.686726+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 150 handle_osd_map epochs [151,151], i have 150, src has [1,151]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 122839040 unmapped: 32022528 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 151 handle_osd_map epochs [151,151], i have 151, src has [1,151]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 151 ms_handle_reset con 0x55aabddd4800 session 0x55aabfc223c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:14.686936+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabffd1800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 151 heartbeat osd_stat(store_statfs(0x1b6bd3000/0x0/0x1bfc00000, data 0x45d194b/0x46ba000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 122847232 unmapped: 32014336 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:15.687156+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 151 handle_osd_map epochs [151,152], i have 151, src has [1,152]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 122871808 unmapped: 31989760 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 152 handle_osd_map epochs [152,152], i have 152, src has [1,152]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 152 ms_handle_reset con 0x55aabffd1800 session 0x55aabfc30000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125a800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:16.687399+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121733120 unmapped: 33128448 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 152 handle_osd_map epochs [153,153], i have 152, src has [1,153]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 153 handle_osd_map epochs [153,153], i have 153, src has [1,153]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 153 ms_handle_reset con 0x55aac125a800 session 0x55aabfa4ed20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac14ce800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 153 handle_osd_map epochs [152,153], i have 153, src has [1,153]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 153 heartbeat osd_stat(store_statfs(0x1b6bcf000/0x0/0x1bfc00000, data 0x45d640e/0x46be000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:17.687547+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 153 handle_osd_map epochs [154,154], i have 153, src has [1,154]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121790464 unmapped: 33071104 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 154 handle_osd_map epochs [154,154], i have 154, src has [1,154]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1554652 data_alloc: 285212672 data_used: 11665408
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 154 ms_handle_reset con 0x55aac14ce800 session 0x55aabfb9e000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302b400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.990240097s of 10.003219604s, submitted: 277
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:18.687686+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 154 handle_osd_map epochs [155,155], i have 154, src has [1,155]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121839616 unmapped: 33021952 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 155 ms_handle_reset con 0x55aac302b400 session 0x55aabe5a63c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:19.687862+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121872384 unmapped: 32989184 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 155 heartbeat osd_stat(store_statfs(0x1b6bc9000/0x0/0x1bfc00000, data 0x45db0ee/0x46c1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:20.688019+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121872384 unmapped: 32989184 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 155 handle_osd_map epochs [155,156], i have 155, src has [1,156]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:21.688132+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:22.688250+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1556300 data_alloc: 285212672 data_used: 11665408
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:23.688627+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 156 heartbeat osd_stat(store_statfs(0x1b6bc9000/0x0/0x1bfc00000, data 0x45dcb7b/0x46c4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:24.688771+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 156 heartbeat osd_stat(store_statfs(0x1b6bc9000/0x0/0x1bfc00000, data 0x45dcb7b/0x46c4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:25.688963+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:26.689098+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 156 handle_osd_map epochs [157,157], i have 156, src has [1,157]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:27.689255+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1559302 data_alloc: 285212672 data_used: 11665408
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:28.689412+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:29.689567+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:30.689715+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 heartbeat osd_stat(store_statfs(0x1b6bc5000/0x0/0x1bfc00000, data 0x45df37f/0x46c8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:31.689819+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:32.689974+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1559302 data_alloc: 285212672 data_used: 11665408
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:33.690466+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:34.690665+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:35.690840+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:36.690987+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 heartbeat osd_stat(store_statfs(0x1b6bc5000/0x0/0x1bfc00000, data 0x45df37f/0x46c8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121880576 unmapped: 32980992 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3028800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:37.691180+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 ms_handle_reset con 0x55aac3028800 session 0x55aabfab1e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 121888768 unmapped: 32972800 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1559302 data_alloc: 285212672 data_used: 11665408
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1260800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:38.691327+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 19.871568680s of 20.153335571s, submitted: 90
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 130285568 unmapped: 24576000 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:39.691468+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123371520 unmapped: 31490048 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:40.691634+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131719168 unmapped: 23142400 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:41.691790+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123338752 unmapped: 31522816 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:42.691935+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 heartbeat osd_stat(store_statfs(0x1b43c5000/0x0/0x1bfc00000, data 0x6ddf3a5/0x6ec9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,1])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123355136 unmapped: 31506432 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3026000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 ms_handle_reset con 0x55aac3026000 session 0x55aabfb992c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302bc00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1944451 data_alloc: 285212672 data_used: 11665408
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:43.692096+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 ms_handle_reset con 0x55aac302bc00 session 0x55aabfaa5860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 heartbeat osd_stat(store_statfs(0x1b2bc5000/0x0/0x1bfc00000, data 0x85df3a5/0x86c9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131899392 unmapped: 22962176 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:44.692243+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131686400 unmapped: 23175168 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:45.692409+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131776512 unmapped: 23085056 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:46.692549+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123387904 unmapped: 31473664 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 ms_handle_reset con 0x55aac0590400 session 0x55aabe65a780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:47.692691+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123437056 unmapped: 31424512 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2386251 data_alloc: 285212672 data_used: 11665408
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3029800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:48.692830+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.264549255s of 10.020630836s, submitted: 81
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131907584 unmapped: 22953984 heap: 154861568 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 ms_handle_reset con 0x55aac3029800 session 0x55aabfab72c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 heartbeat osd_stat(store_statfs(0x1af3c4000/0x0/0x1bfc00000, data 0xbddf3ba/0xbeca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:49.692981+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 ms_handle_reset con 0x55aac0590400 session 0x55aac2ae8780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 132571136 unmapped: 34365440 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:50.693132+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 123371520 unmapped: 43565056 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3026000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 ms_handle_reset con 0x55aac3026000 session 0x55aabfb9d0e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:51.693322+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131899392 unmapped: 35037184 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 heartbeat osd_stat(store_statfs(0x1aa995000/0x0/0x1bfc00000, data 0x1000e3c1/0x100f9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 157 handle_osd_map epochs [157,158], i have 157, src has [1,158]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:52.693451+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2398c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1cd1000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 128516096 unmapped: 38420480 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 158 handle_osd_map epochs [158,158], i have 158, src has [1,158]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 158 ms_handle_reset con 0x55aac2398c00 session 0x55aabfb9cd20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3196972 data_alloc: 285212672 data_used: 11677696
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:53.693588+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 158 handle_osd_map epochs [159,159], i have 158, src has [1,159]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 159 handle_osd_map epochs [159,159], i have 159, src has [1,159]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 159 ms_handle_reset con 0x55aac1cd1000 session 0x55aabfbafc20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 125583360 unmapped: 41353216 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:54.693755+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 125755392 unmapped: 41181184 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:55.693877+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 125943808 unmapped: 40992768 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3029000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 159 ms_handle_reset con 0x55aac3029000 session 0x55aabdbb10e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:56.694013+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1cd1000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 159 ms_handle_reset con 0x55aac0590400 session 0x55aabdbb0f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 126205952 unmapped: 40730624 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2398c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3026000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf992400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 159 heartbeat osd_stat(store_statfs(0x1a1a25000/0x0/0x1bfc00000, data 0x19777785/0x19869000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,1,1])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 159 ms_handle_reset con 0x55aac2398c00 session 0x55aac2ae9e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:57.694086+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 134201344 unmapped: 32735232 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4183676 data_alloc: 285212672 data_used: 11694080
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 159 handle_osd_map epochs [159,160], i have 159, src has [1,160]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 160 handle_osd_map epochs [160,160], i have 160, src has [1,160]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 160 ms_handle_reset con 0x55aac3026000 session 0x55aabfc84780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 160 ms_handle_reset con 0x55aabf992400 session 0x55aabfc301e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:58.694235+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.306919098s of 10.050127983s, submitted: 192
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 160 ms_handle_reset con 0x55aac1cd1000 session 0x55aabf682d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 135577600 unmapped: 31358976 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:56:59.694395+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1cd1000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 160 ms_handle_reset con 0x55aac1cd1000 session 0x55aabfc230e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 127197184 unmapped: 39739392 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf992400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 160 handle_osd_map epochs [161,161], i have 160, src has [1,161]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 161 handle_osd_map epochs [161,161], i have 161, src has [1,161]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 161 ms_handle_reset con 0x55aabf992400 session 0x55aabfa6e780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:00.694546+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 135733248 unmapped: 31203328 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 161 ms_handle_reset con 0x55aac0590400 session 0x55aac3980780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbccc00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 161 handle_osd_map epochs [161,162], i have 161, src has [1,162]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 162 handle_osd_map epochs [162,162], i have 162, src has [1,162]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:01.694655+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 162 ms_handle_reset con 0x55aabfbccc00 session 0x55aac3980d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2398c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 127623168 unmapped: 39313408 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 162 ms_handle_reset con 0x55aac2398c00 session 0x55aabfbae5a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac27b7800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:02.694873+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 127819776 unmapped: 39116800 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4702103 data_alloc: 285212672 data_used: 11714560
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 162 heartbeat osd_stat(store_statfs(0x19ae1d000/0x0/0x1bfc00000, data 0x1fe6b335/0x1ff60000, compress 0x0/0x0/0x0, omap 0x649, meta 0x496f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 162 handle_osd_map epochs [163,163], i have 162, src has [1,163]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:03.695018+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 127918080 unmapped: 39018496 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 163 handle_osd_map epochs [163,163], i have 163, src has [1,163]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 163 ms_handle_reset con 0x55aac27b7800 session 0x55aabfc23680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:04.695190+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 127991808 unmapped: 38944768 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:05.695331+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 136445952 unmapped: 30490624 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 163 handle_osd_map epochs [163,164], i have 163, src has [1,164]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 164 handle_osd_map epochs [164,164], i have 164, src has [1,164]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:06.695521+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 164 handle_osd_map epochs [164,165], i have 164, src has [1,165]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 128286720 unmapped: 38649856 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 165 ms_handle_reset con 0x55aac0590400 session 0x55aabfa6ed20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:07.695660+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 129409024 unmapped: 37527552 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5097934 data_alloc: 285212672 data_used: 11739136
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 165 heartbeat osd_stat(store_statfs(0x196204000/0x0/0x1bfc00000, data 0x23df3db9/0x23eea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:08.695813+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.300629616s of 10.060810089s, submitted: 289
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 129499136 unmapped: 37437440 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:09.695978+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138100736 unmapped: 28835840 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:10.696137+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 165 heartbeat osd_stat(store_statfs(0x193a04000/0x0/0x1bfc00000, data 0x265f3db9/0x266ea000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 129826816 unmapped: 37109760 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:11.696277+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 165 handle_osd_map epochs [166,166], i have 165, src has [1,166]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 129933312 unmapped: 37003264 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 handle_osd_map epochs [166,166], i have 166, src has [1,166]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:12.696445+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 129990656 unmapped: 36945920 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 5575432 data_alloc: 285212672 data_used: 11751424
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:13.696650+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 heartbeat osd_stat(store_statfs(0x1919ff000/0x0/0x1bfc00000, data 0x285f65f5/0x286ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131112960 unmapped: 35823616 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:14.696871+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131260416 unmapped: 35676160 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:15.697028+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 heartbeat osd_stat(store_statfs(0x18fa00000/0x0/0x1bfc00000, data 0x2a5f65f5/0x2a6ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 139788288 unmapped: 27148288 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec2000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 ms_handle_reset con 0x55aac2ec2000 session 0x55aabe5a74a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbccc00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 ms_handle_reset con 0x55aabfbccc00 session 0x55aabfabf0e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabd260c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 ms_handle_reset con 0x55aabd260c00 session 0x55aac106a000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:16.697193+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbccc00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 ms_handle_reset con 0x55aabfbccc00 session 0x55aabfa4f2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 136650752 unmapped: 30285824 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 heartbeat osd_stat(store_statfs(0x18ea00000/0x0/0x1bfc00000, data 0x2b5f65f5/0x2b6ee000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 ms_handle_reset con 0x55aac0590400 session 0x55aabfa4e1e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac27b7400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 ms_handle_reset con 0x55aac27b7400 session 0x55aabe65af00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0577800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 ms_handle_reset con 0x55aac0577800 session 0x55aabe5a7e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0591400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:17.697339+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 ms_handle_reset con 0x55aac0591400 session 0x55aabfb97e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 heartbeat osd_stat(store_statfs(0x18c905000/0x0/0x1bfc00000, data 0x2d6ef62e/0x2d7e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [0,0,1])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbccc00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 ms_handle_reset con 0x55aabfbccc00 session 0x55aac106a780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 132497408 unmapped: 34439168 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 heartbeat osd_stat(store_statfs(0x18c905000/0x0/0x1bfc00000, data 0x2d6ef62e/0x2d7e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 166 handle_osd_map epochs [166,167], i have 166, src has [1,167]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 6218327 data_alloc: 285212672 data_used: 11763712
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 167 handle_osd_map epochs [167,167], i have 167, src has [1,167]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 167 ms_handle_reset con 0x55aac1260800 session 0x55aabfc6cd20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:18.697505+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 132513792 unmapped: 34422784 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0577800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.640030861s of 10.683374405s, submitted: 125
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:19.697671+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 132538368 unmapped: 34398208 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 167 handle_osd_map epochs [167,168], i have 167, src has [1,168]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:20.697784+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 168 handle_osd_map epochs [168,168], i have 168, src has [1,168]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 168 ms_handle_reset con 0x55aac0577800 session 0x55aabdbb1e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131178496 unmapped: 35758080 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 168 handle_osd_map epochs [168,169], i have 168, src has [1,169]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:21.697894+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 169 ms_handle_reset con 0x55aac0590400 session 0x55aabfabe960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0591400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 169 handle_osd_map epochs [169,169], i have 169, src has [1,169]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131227648 unmapped: 35708928 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 169 ms_handle_reset con 0x55aac0591400 session 0x55aabfc85c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 169 heartbeat osd_stat(store_statfs(0x1b1ef5000/0x0/0x1bfc00000, data 0x50f72b4/0x51f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec2400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec2c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec3c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 169 ms_handle_reset con 0x55aac2ec3c00 session 0x55aac106bc20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:22.698032+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 169 heartbeat osd_stat(store_statfs(0x1b1ef5000/0x0/0x1bfc00000, data 0x50f72b4/0x51f6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131227648 unmapped: 35708928 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 169 heartbeat osd_stat(store_statfs(0x1b1ef5000/0x0/0x1bfc00000, data 0x50f7212/0x51f5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1835212 data_alloc: 285212672 data_used: 11763712
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:23.698182+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 169 handle_osd_map epochs [169,170], i have 169, src has [1,170]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 170 heartbeat osd_stat(store_statfs(0x1b1ef5000/0x0/0x1bfc00000, data 0x50f7212/0x51f5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131227648 unmapped: 35708928 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:24.698388+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131227648 unmapped: 35708928 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 170 heartbeat osd_stat(store_statfs(0x1b4ef4000/0x0/0x1bfc00000, data 0x50f9ba0/0x51f9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:25.698537+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec5000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 170 ms_handle_reset con 0x55aac2ec5000 session 0x55aabfab10e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131227648 unmapped: 35708928 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:26.698763+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 170 handle_osd_map epochs [170,171], i have 170, src has [1,171]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131235840 unmapped: 35700736 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:27.699008+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131235840 unmapped: 35700736 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1845345 data_alloc: 285212672 data_used: 12554240
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:28.699203+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131235840 unmapped: 35700736 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:29.699401+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131235840 unmapped: 35700736 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:30.699569+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 171 heartbeat osd_stat(store_statfs(0x1b4ef0000/0x0/0x1bfc00000, data 0x50fc3c0/0x51fd000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131235840 unmapped: 35700736 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:31.699719+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 171 handle_osd_map epochs [171,172], i have 171, src has [1,172]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.555515289s of 12.212332726s, submitted: 183
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2885000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 172 ms_handle_reset con 0x55aac2885000 session 0x55aabfaa50e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131276800 unmapped: 35659776 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:32.699887+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 172 heartbeat osd_stat(store_statfs(0x1b4eea000/0x0/0x1bfc00000, data 0x50fec36/0x5203000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec4800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131293184 unmapped: 35643392 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1852692 data_alloc: 285212672 data_used: 12558336
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:33.700117+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 172 handle_osd_map epochs [172,173], i have 172, src has [1,173]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 173 handle_osd_map epochs [173,173], i have 173, src has [1,173]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 173 ms_handle_reset con 0x55aac2ec4800 session 0x55aabeaf4780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131309568 unmapped: 35627008 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb49c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:34.700237+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 173 handle_osd_map epochs [174,174], i have 173, src has [1,174]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 173 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 174 handle_osd_map epochs [174,174], i have 174, src has [1,174]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 174 ms_handle_reset con 0x55aabfb49c00 session 0x55aabfab7e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 131309568 unmapped: 35627008 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 174 heartbeat osd_stat(store_statfs(0x1b4edf000/0x0/0x1bfc00000, data 0x5103f36/0x520d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5b0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb49c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:35.700392+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 59
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 174 handle_osd_map epochs [175,175], i have 174, src has [1,175]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 175 handle_osd_map epochs [175,175], i have 175, src has [1,175]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 175 ms_handle_reset con 0x55aabfb49c00 session 0x55aabfab0b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137347072 unmapped: 29589504 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:36.700543+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2885000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 175 ms_handle_reset con 0x55aac2885000 session 0x55aabfbf7860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 135938048 unmapped: 30998528 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec3c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:37.700695+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 175 handle_osd_map epochs [175,176], i have 175, src has [1,176]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 176 handle_osd_map epochs [176,176], i have 176, src has [1,176]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 176 ms_handle_reset con 0x55aac2ec3c00 session 0x55aac1de6780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 136839168 unmapped: 30097408 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1260c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 176 ms_handle_reset con 0x55aac1260c00 session 0x55aabfc23860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2fe6800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1948676 data_alloc: 285212672 data_used: 13643776
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:38.700844+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 176 handle_osd_map epochs [176,177], i have 176, src has [1,177]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 177 ms_handle_reset con 0x55aac2fe6800 session 0x55aabfa4f4a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 136863744 unmapped: 30072832 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2fe6800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 177 handle_osd_map epochs [177,177], i have 177, src has [1,177]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 177 ms_handle_reset con 0x55aac2fe6800 session 0x55aabf9e54a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb49c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:39.701045+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 177 handle_osd_map epochs [177,178], i have 177, src has [1,178]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 60
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 178 ms_handle_reset con 0x55aabfb49c00 session 0x55aabf6825a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137052160 unmapped: 29884416 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 178 handle_osd_map epochs [178,178], i have 178, src has [1,178]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1260c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 178 ms_handle_reset con 0x55aac1260c00 session 0x55aabfc221e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:40.701168+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2885000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 178 ms_handle_reset con 0x55aac2885000 session 0x55aabf6823c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137158656 unmapped: 29777920 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 178 heartbeat osd_stat(store_statfs(0x1b4176000/0x0/0x1bfc00000, data 0x5a693d9/0x5b75000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:41.701375+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137191424 unmapped: 29745152 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.839753151s of 10.950122833s, submitted: 295
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:42.701547+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 178 handle_osd_map epochs [178,179], i have 178, src has [1,179]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 179 handle_osd_map epochs [179,179], i have 179, src has [1,179]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137199616 unmapped: 29736960 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1960326 data_alloc: 285212672 data_used: 13651968
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 179 heartbeat osd_stat(store_statfs(0x1b4172000/0x0/0x1bfc00000, data 0x5a6bdea/0x5b7b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:43.701667+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137199616 unmapped: 29736960 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:44.701801+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 179 handle_osd_map epochs [179,180], i have 179, src has [1,180]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 180 handle_osd_map epochs [180,180], i have 180, src has [1,180]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137199616 unmapped: 29736960 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:45.701992+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4841800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 180 ms_handle_reset con 0x55aac4841800 session 0x55aabfa4e000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137199616 unmapped: 29736960 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 180 heartbeat osd_stat(store_statfs(0x1b416e000/0x0/0x1bfc00000, data 0x5a6e724/0x5b7f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:46.702199+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 180 handle_osd_map epochs [180,181], i have 180, src has [1,181]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 181 handle_osd_map epochs [181,181], i have 181, src has [1,181]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:09:00 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=false&sync=false HTTP/1.1" 200 155356 "" "Go-http-client/1.1"
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137199616 unmapped: 29736960 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:47.702384+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137199616 unmapped: 29736960 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 181 handle_osd_map epochs [181,182], i have 181, src has [1,182]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 182 handle_osd_map epochs [182,182], i have 182, src has [1,182]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 182 handle_osd_map epochs [181,182], i have 182, src has [1,182]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1971084 data_alloc: 285212672 data_used: 13676544
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:48.702540+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137199616 unmapped: 29736960 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3026800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 182 ms_handle_reset con 0x55aac3026800 session 0x55aabfc8c1e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:49.702669+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137207808 unmapped: 29728768 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 182 handle_osd_map epochs [183,183], i have 182, src has [1,183]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 183 heartbeat osd_stat(store_statfs(0x1b4165000/0x0/0x1bfc00000, data 0x5a73a08/0x5b87000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:50.702808+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 137248768 unmapped: 29687808 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:51.702941+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 183 handle_osd_map epochs [184,184], i have 183, src has [1,184]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138321920 unmapped: 28614656 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:52.703087+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 184 handle_osd_map epochs [184,185], i have 184, src has [1,185]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.779494286s of 10.100900650s, submitted: 110
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 185 handle_osd_map epochs [185,185], i have 185, src has [1,185]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2885c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 185 ms_handle_reset con 0x55aac2885c00 session 0x55aabf908f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 185 heartbeat osd_stat(store_statfs(0x1b415d000/0x0/0x1bfc00000, data 0x5a78baa/0x5b8f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138330112 unmapped: 28606464 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1981434 data_alloc: 285212672 data_used: 13676544
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:53.703387+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec2000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138330112 unmapped: 28606464 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 185 ms_handle_reset con 0x55aac2ec2000 session 0x55aabf9083c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3027400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 185 heartbeat osd_stat(store_statfs(0x1b4157000/0x0/0x1bfc00000, data 0x5a7b546/0x5b94000, compress 0x0/0x0/0x0, omap 0x649, meta 0x5f0f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:54.703509+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 185 handle_osd_map epochs [185,186], i have 185, src has [1,186]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 186 handle_osd_map epochs [186,186], i have 186, src has [1,186]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 186 ms_handle_reset con 0x55aac3027400 session 0x55aabf908b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138420224 unmapped: 28516352 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:55.704369+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 186 handle_osd_map epochs [187,187], i have 186, src has [1,187]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138420224 unmapped: 28516352 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3029400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 187 ms_handle_reset con 0x55aac3029400 session 0x55aac3980960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:56.704745+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 187 handle_osd_map epochs [188,188], i have 187, src has [1,188]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138477568 unmapped: 28459008 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:57.704901+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2885000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 188 ms_handle_reset con 0x55aac2885000 session 0x55aabfb9f0e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138485760 unmapped: 28450816 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 188 handle_osd_map epochs [189,189], i have 188, src has [1,189]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1991806 data_alloc: 285212672 data_used: 13701120
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:58.705330+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138518528 unmapped: 28418048 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 189 heartbeat osd_stat(store_statfs(0x1b3d2a000/0x0/0x1bfc00000, data 0x5a85a9a/0x5ba2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x632f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:57:59.705533+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 189 ms_handle_reset con 0x55aabe97f800 session 0x55aabfb9ef00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138543104 unmapped: 28393472 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:00.706027+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138543104 unmapped: 28393472 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 189 heartbeat osd_stat(store_statfs(0x1b512c000/0x0/0x1bfc00000, data 0x5a85b25/0x5ba2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:01.706181+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 189 handle_osd_map epochs [189,190], i have 189, src has [1,190]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138543104 unmapped: 28393472 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:02.706359+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138543104 unmapped: 28393472 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1993856 data_alloc: 285212672 data_used: 13713408
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:03.706540+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3026400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.435353279s of 11.038833618s, submitted: 150
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138543104 unmapped: 28393472 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 190 heartbeat osd_stat(store_statfs(0x1b5127000/0x0/0x1bfc00000, data 0x5a88400/0x5ba7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:04.706744+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2885c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138543104 unmapped: 28393472 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 190 ms_handle_reset con 0x55aac2885c00 session 0x55aabfabfc20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:05.706954+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 190 handle_osd_map epochs [191,191], i have 190, src has [1,191]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac14ce800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec4800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 191 ms_handle_reset con 0x55aac2ec4800 session 0x55aabfc250e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 191 heartbeat osd_stat(store_statfs(0x1b5120000/0x0/0x1bfc00000, data 0x5a8adad/0x5bad000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138567680 unmapped: 28368896 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:06.707204+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 191 handle_osd_map epochs [192,192], i have 191, src has [1,192]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 192 ms_handle_reset con 0x55aac14ce800 session 0x55aabfb972c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138575872 unmapped: 28360704 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:07.707404+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138575872 unmapped: 28360704 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1cd1000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2009292 data_alloc: 285212672 data_used: 13733888
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 192 ms_handle_reset con 0x55aac1cd1000 session 0x55aabfaa52c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:08.707548+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 192 handle_osd_map epochs [193,193], i have 192, src has [1,193]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3027400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcb000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 193 ms_handle_reset con 0x55aac3027400 session 0x55aabfc22960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 193 handle_osd_map epochs [193,193], i have 193, src has [1,193]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 193 ms_handle_reset con 0x55aabddd4400 session 0x55aabfc2bc20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138625024 unmapped: 28311552 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:09.707700+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 193 handle_osd_map epochs [194,194], i have 193, src has [1,194]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 194 ms_handle_reset con 0x55aabfbcb000 session 0x55aabeaf4b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3041000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 194 heartbeat osd_stat(store_statfs(0x1b510f000/0x0/0x1bfc00000, data 0x5a93a03/0x5bbc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,1])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 194 ms_handle_reset con 0x55aac3041000 session 0x55aabfc2af00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138665984 unmapped: 28270592 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:10.707853+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 194 ms_handle_reset con 0x55aabddd4400 session 0x55aabfc31a40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcb000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138715136 unmapped: 28221440 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:11.708014+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 194 handle_osd_map epochs [194,195], i have 194, src has [1,195]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 195 ms_handle_reset con 0x55aabfbcb000 session 0x55aabf9e4960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138731520 unmapped: 28205056 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:12.709601+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138731520 unmapped: 28205056 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2019875 data_alloc: 285212672 data_used: 13746176
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:13.709792+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 195 handle_osd_map epochs [195,195], i have 195, src has [1,195]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.624024391s of 10.003663063s, submitted: 112
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2885800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 195 ms_handle_reset con 0x55aac2885800 session 0x55aabfab7680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138731520 unmapped: 28205056 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:14.709943+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 195 handle_osd_map epochs [196,196], i have 195, src has [1,196]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 196 handle_osd_map epochs [196,196], i have 196, src has [1,196]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2884c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4840400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 196 ms_handle_reset con 0x55aac4840400 session 0x55aabfb961e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138764288 unmapped: 28172288 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 196 heartbeat osd_stat(store_statfs(0x1b510d000/0x0/0x1bfc00000, data 0x5a963ae/0x5bc0000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:15.710091+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 196 handle_osd_map epochs [196,197], i have 196, src has [1,197]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 197 handle_osd_map epochs [197,197], i have 197, src has [1,197]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 197 ms_handle_reset con 0x55aac2884c00 session 0x55aabfb981e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 197 ms_handle_reset con 0x55aabddd4400 session 0x55aabfb98b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138838016 unmapped: 28098560 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:16.710254+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 197 heartbeat osd_stat(store_statfs(0x1b510a000/0x0/0x1bfc00000, data 0x5a98cde/0x5bc3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 197 handle_osd_map epochs [198,198], i have 197, src has [1,198]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcb000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138846208 unmapped: 28090368 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 198 handle_osd_map epochs [198,198], i have 198, src has [1,198]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:17.710352+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 198 handle_osd_map epochs [198,199], i have 198, src has [1,199]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 199 ms_handle_reset con 0x55aabfbcb000 session 0x55aabfc6ba40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec3800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 199 ms_handle_reset con 0x55aac2ec3800 session 0x55aac39805a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 139919360 unmapped: 27017216 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 199 heartbeat osd_stat(store_statfs(0x1b5102000/0x0/0x1bfc00000, data 0x5a9e4f1/0x5bcc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2035420 data_alloc: 285212672 data_used: 13758464
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:18.710483+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 199 handle_osd_map epochs [199,200], i have 199, src has [1,200]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 140001280 unmapped: 26935296 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 200 ms_handle_reset con 0x55aabfbca400 session 0x55aabfb9e1e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:19.710577+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 140009472 unmapped: 26927104 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:20.710774+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 140009472 unmapped: 26927104 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:21.710932+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 200 handle_osd_map epochs [200,201], i have 200, src has [1,201]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 201 heartbeat osd_stat(store_statfs(0x1b50fb000/0x0/0x1bfc00000, data 0x5aa3376/0x5bd1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 201 ms_handle_reset con 0x55aac2ec2400 session 0x55aabd706f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 201 ms_handle_reset con 0x55aac2ec2c00 session 0x55aac106b860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabddd4400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 140017664 unmapped: 26918912 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:22.711254+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 201 ms_handle_reset con 0x55aabddd4400 session 0x55aabf9e4000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138854400 unmapped: 28082176 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1877406 data_alloc: 285212672 data_used: 11915264
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:23.711440+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 201 heartbeat osd_stat(store_statfs(0x1b654f000/0x0/0x1bfc00000, data 0x4650b40/0x477c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.862973213s of 10.566443443s, submitted: 196
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138854400 unmapped: 28082176 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:24.711630+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 201 heartbeat osd_stat(store_statfs(0x1b654f000/0x0/0x1bfc00000, data 0x4650b40/0x477c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138854400 unmapped: 28082176 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:25.711793+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138854400 unmapped: 28082176 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:26.711968+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 201 handle_osd_map epochs [201,202], i have 201, src has [1,202]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138715136 unmapped: 28221440 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:27.712125+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138715136 unmapped: 28221440 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1879688 data_alloc: 285212672 data_used: 11915264
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:28.712320+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138715136 unmapped: 28221440 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:29.712486+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 202 handle_osd_map epochs [202,203], i have 202, src has [1,203]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 203 handle_osd_map epochs [203,203], i have 203, src has [1,203]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138715136 unmapped: 28221440 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 203 heartbeat osd_stat(store_statfs(0x1b654a000/0x0/0x1bfc00000, data 0x4655c74/0x4783000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:30.712621+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 203 handle_osd_map epochs [204,204], i have 203, src has [1,204]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 203 handle_osd_map epochs [204,204], i have 204, src has [1,204]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138739712 unmapped: 28196864 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:31.712783+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 204 heartbeat osd_stat(store_statfs(0x1b6546000/0x0/0x1bfc00000, data 0x4658584/0x4786000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 204 handle_osd_map epochs [205,205], i have 204, src has [1,205]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3026800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 205 ms_handle_reset con 0x55aac3026800 session 0x55aabfc30780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138764288 unmapped: 28172288 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:32.712953+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138764288 unmapped: 28172288 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1888218 data_alloc: 285212672 data_used: 11927552
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:33.713139+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 205 heartbeat osd_stat(store_statfs(0x1b6542000/0x0/0x1bfc00000, data 0x465ae94/0x4789000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 205 ms_handle_reset con 0x55aac148b000 session 0x55aabfc2a1e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.712132454s of 10.001169205s, submitted: 86
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138788864 unmapped: 28147712 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:34.713266+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138788864 unmapped: 28147712 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:35.713450+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 205 ms_handle_reset con 0x55aabfbcd800 session 0x55aabe65c3c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138788864 unmapped: 28147712 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:36.713603+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 205 handle_osd_map epochs [206,206], i have 205, src has [1,206]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138788864 unmapped: 28147712 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b6542000/0x0/0x1bfc00000, data 0x465ae94/0x4789000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:37.713786+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138788864 unmapped: 28147712 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:38.713920+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1889876 data_alloc: 285212672 data_used: 11927552
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b6540000/0x0/0x1bfc00000, data 0x465d698/0x478d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138805248 unmapped: 28131328 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:39.714109+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138805248 unmapped: 28131328 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:40.714261+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138805248 unmapped: 28131328 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 206 heartbeat osd_stat(store_statfs(0x1b6540000/0x0/0x1bfc00000, data 0x465d698/0x478d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:41.714550+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138805248 unmapped: 28131328 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:42.714733+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138805248 unmapped: 28131328 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:43.714943+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1894020 data_alloc: 285212672 data_used: 11927552
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.951160431s of 10.006620407s, submitted: 22
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138805248 unmapped: 28131328 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:44.715098+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec2000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 206 ms_handle_reset con 0x55aac2ec2000 session 0x55aabfe852c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138821632 unmapped: 28114944 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:45.715347+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 206 handle_osd_map epochs [207,207], i have 206, src has [1,207]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 207 handle_osd_map epochs [207,207], i have 207, src has [1,207]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302ac00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 207 ms_handle_reset con 0x55aac302ac00 session 0x55aabdbcfc20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138821632 unmapped: 28114944 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0585400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:46.715482+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 207 ms_handle_reset con 0x55aac0585400 session 0x55aabfb99a40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 207 heartbeat osd_stat(store_statfs(0x1b653f000/0x0/0x1bfc00000, data 0x465d7dc/0x478f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcb000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 207 handle_osd_map epochs [208,208], i have 207, src has [1,208]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 208 ms_handle_reset con 0x55aabfbcb000 session 0x55aabf909e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2885c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 208 ms_handle_reset con 0x55aac2885c00 session 0x55aabfa4e780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 208 handle_osd_map epochs [208,208], i have 208, src has [1,208]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138862592 unmapped: 28073984 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:47.715644+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcc800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac27b7000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 208 ms_handle_reset con 0x55aac27b7000 session 0x55aabfab0d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138911744 unmapped: 28024832 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:48.715795+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1905662 data_alloc: 285212672 data_used: 11939840
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 208 handle_osd_map epochs [209,209], i have 208, src has [1,209]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 209 heartbeat osd_stat(store_statfs(0x1b6535000/0x0/0x1bfc00000, data 0x4663204/0x4799000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2fe6800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 209 ms_handle_reset con 0x55aac2fe6800 session 0x55aabfb99680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 209 ms_handle_reset con 0x55aabfbcc800 session 0x55aabfaa4b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0242400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138944512 unmapped: 27992064 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 209 ms_handle_reset con 0x55aac0242400 session 0x55aac106ad20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:49.715970+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 209 handle_osd_map epochs [210,210], i have 209, src has [1,210]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 210 handle_osd_map epochs [210,210], i have 210, src has [1,210]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3027c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138936320 unmapped: 28000256 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 210 ms_handle_reset con 0x55aac3027c00 session 0x55aabfc850e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcc800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:50.716112+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 210 ms_handle_reset con 0x55aabfbcc800 session 0x55aabf9083c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 210 handle_osd_map epochs [210,211], i have 210, src has [1,211]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 211 handle_osd_map epochs [211,211], i have 211, src has [1,211]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0242400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138952704 unmapped: 27983872 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 211 ms_handle_reset con 0x55aac0242400 session 0x55aabfbf70e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:51.716273+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 211 handle_osd_map epochs [212,212], i have 211, src has [1,212]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 211 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 212 handle_osd_map epochs [212,212], i have 212, src has [1,212]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138977280 unmapped: 27959296 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:52.716494+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3027000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 212 heartbeat osd_stat(store_statfs(0x1b6524000/0x0/0x1bfc00000, data 0x466d365/0x47a8000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 212 ms_handle_reset con 0x55aac3027000 session 0x55aabfb9e5a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138985472 unmapped: 27951104 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:53.716741+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1922975 data_alloc: 285212672 data_used: 11964416
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138985472 unmapped: 27951104 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:54.716897+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.224578857s of 11.037554741s, submitted: 242
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 139001856 unmapped: 27934720 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302b800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:55.717033+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 212 ms_handle_reset con 0x55aac302b800 session 0x55aac3981680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 138977280 unmapped: 27959296 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec3400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 212 ms_handle_reset con 0x55aac2ec3400 session 0x55aabfa4fa40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:56.717164+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcc800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 212 handle_osd_map epochs [213,213], i have 212, src has [1,213]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.5] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.e] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[5.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 213 ms_handle_reset con 0x55aabfbcc800 session 0x55aabfa4f0e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 139018240 unmapped: 27918336 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3027000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 213 handle_osd_map epochs [213,213], i have 213, src has [1,213]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:57.717342+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 213 handle_osd_map epochs [214,214], i have 213, src has [1,214]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 214 handle_osd_map epochs [214,214], i have 214, src has [1,214]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 214 ms_handle_reset con 0x55aac3027000 session 0x55aabf908000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302b800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 139034624 unmapped: 27901952 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 214 ms_handle_reset con 0x55aac302b800 session 0x55aac1de65a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec3800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:58.717498+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1943300 data_alloc: 285212672 data_used: 11976704
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 214 ms_handle_reset con 0x55aac2ec3800 session 0x55aabfc225a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 214 heartbeat osd_stat(store_statfs(0x1b6516000/0x0/0x1bfc00000, data 0x4672a76/0x47b5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 214 handle_osd_map epochs [215,215], i have 214, src has [1,215]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 215 handle_osd_map epochs [215,215], i have 215, src has [1,215]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2885400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 139108352 unmapped: 27828224 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:58:59.717679+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcd000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 215 ms_handle_reset con 0x55aabfbcd000 session 0x55aac106b4a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 215 handle_osd_map epochs [216,216], i have 215, src has [1,216]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 139157504 unmapped: 27779072 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 216 handle_osd_map epochs [216,216], i have 216, src has [1,216]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 216 ms_handle_reset con 0x55aac2885400 session 0x55aabfab1c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:00.717868+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcc800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 216 handle_osd_map epochs [216,217], i have 216, src has [1,217]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 217 handle_osd_map epochs [217,217], i have 217, src has [1,217]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 217 ms_handle_reset con 0x55aabfbcc800 session 0x55aabfc33c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec3800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 217 handle_osd_map epochs [216,217], i have 217, src has [1,217]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3027000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 217 ms_handle_reset con 0x55aac3027000 session 0x55aac3980f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 139255808 unmapped: 27680768 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302b800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 217 ms_handle_reset con 0x55aac2ec3800 session 0x55aabfc6c960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2884800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:01.718050+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 217 handle_osd_map epochs [218,218], i have 217, src has [1,218]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 218 handle_osd_map epochs [217,218], i have 218, src has [1,218]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 140435456 unmapped: 26501120 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 218 ms_handle_reset con 0x55aac302b800 session 0x55aabeaf4d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b6508000/0x0/0x1bfc00000, data 0x467a68e/0x47c2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 218 ms_handle_reset con 0x55aac2884800 session 0x55aabfbae1e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:02.718250+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 139395072 unmapped: 27541504 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:03.718527+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcc800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 218 heartbeat osd_stat(store_statfs(0x1b650c000/0x0/0x1bfc00000, data 0x467cb8c/0x47c2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1950647 data_alloc: 285212672 data_used: 11993088
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 218 ms_handle_reset con 0x55aabfbcc800 session 0x55aabfa6e000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2885400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec3800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3027000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 218 ms_handle_reset con 0x55aac2885400 session 0x55aabfbf7c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 218 ms_handle_reset con 0x55aac3027000 session 0x55aabf9e4f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 218 ms_handle_reset con 0x55aac2ec3800 session 0x55aabfc6a780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 218 handle_osd_map epochs [219,219], i have 218, src has [1,219]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 140484608 unmapped: 26451968 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:04.718683+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbcc800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.070189476s of 10.047892570s, submitted: 287
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 140484608 unmapped: 26451968 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 219 ms_handle_reset con 0x55aabfbcc800 session 0x55aabfab6960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:05.718822+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2884800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 219 handle_osd_map epochs [219,219], i have 219, src has [1,219]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 219 ms_handle_reset con 0x55aac2884800 session 0x55aac1de7c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2885400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 219 heartbeat osd_stat(store_statfs(0x1b650a000/0x0/0x1bfc00000, data 0x467f466/0x47c4000, compress 0x0/0x0/0x0, omap 0x649, meta 0x4f2f9b7), peers [0,1,3,4,5] op hist [0,0,0,2])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 219 ms_handle_reset con 0x55aac2885400 session 0x55aabfc8c960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 140509184 unmapped: 26427392 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:06.718965+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 219 handle_osd_map epochs [220,220], i have 219, src has [1,220]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #44. Immutable memtables: 1.
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 141590528 unmapped: 25346048 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:07.719113+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 142639104 unmapped: 24297472 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:08.719346+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1965503 data_alloc: 285212672 data_used: 12013568
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 142639104 unmapped: 24297472 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:09.719524+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 220 handle_osd_map epochs [220,220], i have 220, src has [1,220]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 142639104 unmapped: 24297472 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:10.719696+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 220 heartbeat osd_stat(store_statfs(0x1b5364000/0x0/0x1bfc00000, data 0x4681e47/0x47ca000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 142639104 unmapped: 24297472 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:11.719870+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3041000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 220 ms_handle_reset con 0x55aac3041000 session 0x55aabfc6d2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302ac00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 220 handle_osd_map epochs [221,221], i have 220, src has [1,221]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 221 handle_osd_map epochs [221,221], i have 221, src has [1,221]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac148b000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 221 ms_handle_reset con 0x55aac148b000 session 0x55aabfc6ad20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 221 ms_handle_reset con 0x55aac302ac00 session 0x55aabf908f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 142696448 unmapped: 24240128 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:12.720033+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 221 heartbeat osd_stat(store_statfs(0x1b535e000/0x0/0x1bfc00000, data 0x468465b/0x47cf000, compress 0x0/0x0/0x0, omap 0x649, meta 0x60cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4806c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 221 ms_handle_reset con 0x55aac4806c00 session 0x55aabfc303c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 142696448 unmapped: 24240128 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0585000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:13.720194+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 221 ms_handle_reset con 0x55aac0585000 session 0x55aabfab70e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 1975581 data_alloc: 285212672 data_used: 12025856
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 142671872 unmapped: 24264704 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:14.720326+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.607603073s of 10.002111435s, submitted: 105
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97d800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 142680064 unmapped: 24256512 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 221 ms_handle_reset con 0x55aabe97d800 session 0x55aabfb990e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:15.720443+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125b400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 221 ms_handle_reset con 0x55aac125b400 session 0x55aabfc310e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 221 handle_osd_map epochs [221,222], i have 221, src has [1,222]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 222 handle_osd_map epochs [222,222], i have 222, src has [1,222]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144080896 unmapped: 22855680 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:16.720566+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 222 handle_osd_map epochs [223,223], i have 222, src has [1,223]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97d800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144097280 unmapped: 22839296 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:17.720722+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 223 ms_handle_reset con 0x55aabe97d800 session 0x55aabf909860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0243000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 223 heartbeat osd_stat(store_statfs(0x1b48f5000/0x0/0x1bfc00000, data 0x4ce9a51/0x4e39000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143990784 unmapped: 22945792 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 223 ms_handle_reset con 0x55aac0243000 session 0x55aabf908f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:18.720885+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2047124 data_alloc: 285212672 data_used: 12038144
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0f74800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 223 ms_handle_reset con 0x55aac0f74800 session 0x55aabfc6d2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1cd0400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac27b7800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 223 ms_handle_reset con 0x55aac1cd0400 session 0x55aac1de7c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 223 ms_handle_reset con 0x55aac27b7800 session 0x55aabfab6960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 223 handle_osd_map epochs [224,224], i have 223, src has [1,224]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97d800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0243000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 224 ms_handle_reset con 0x55aac0243000 session 0x55aabfbf7c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144064512 unmapped: 22872064 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:19.721041+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 224 ms_handle_reset con 0x55aabe97d800 session 0x55aabfc6a780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 224 heartbeat osd_stat(store_statfs(0x1b48ee000/0x0/0x1bfc00000, data 0x4cec31f/0x4e3d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0f74800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 224 ms_handle_reset con 0x55aac0f74800 session 0x55aabeaf4d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 224 handle_osd_map epochs [225,225], i have 224, src has [1,225]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144220160 unmapped: 22716416 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:20.721208+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144228352 unmapped: 22708224 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:21.721356+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1cd0400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabffd0c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 225 ms_handle_reset con 0x55aabffd0c00 session 0x55aabfab1c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 225 ms_handle_reset con 0x55aac1cd0400 session 0x55aac3980f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfce3c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 225 ms_handle_reset con 0x55aabfce3c00 session 0x55aac106b4a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144302080 unmapped: 22634496 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:22.721513+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfce2800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144302080 unmapped: 22634496 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:23.721710+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2051802 data_alloc: 285212672 data_used: 12058624
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 225 handle_osd_map epochs [226,226], i have 225, src has [1,226]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 handle_osd_map epochs [226,226], i have 226, src has [1,226]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 heartbeat osd_stat(store_statfs(0x1b48f0000/0x0/0x1bfc00000, data 0x4ceec0f/0x4e3e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,1])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeafec00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 ms_handle_reset con 0x55aabeafec00 session 0x55aabfa4e780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 ms_handle_reset con 0x55aabfce2800 session 0x55aac1de65a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac14f6c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144400384 unmapped: 22536192 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:24.721857+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 ms_handle_reset con 0x55aac14f6c00 session 0x55aabf9e52c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4840c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 ms_handle_reset con 0x55aac4840c00 session 0x55aabfb9d2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeafec00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfce2800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfce3c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac14f6c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144449536 unmapped: 22487040 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:25.935966+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.563915253s of 10.459951401s, submitted: 209
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 ms_handle_reset con 0x55aabeafec00 session 0x55aabfe85c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 ms_handle_reset con 0x55aabfce3c00 session 0x55aabfab0d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 ms_handle_reset con 0x55aac14f6c00 session 0x55aabfe84b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 ms_handle_reset con 0x55aabfce2800 session 0x55aabfb965a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb48c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 ms_handle_reset con 0x55aabfb48c00 session 0x55aabfc24f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144465920 unmapped: 22470656 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 226 handle_osd_map epochs [226,227], i have 226, src has [1,227]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:26.936140+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 227 handle_osd_map epochs [227,227], i have 227, src has [1,227]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144490496 unmapped: 22446080 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 227 heartbeat osd_stat(store_statfs(0x1b48ec000/0x0/0x1bfc00000, data 0x4cf14c7/0x4e3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 227 ms_handle_reset con 0x55aabe97f800 session 0x55aabfb97860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 227 handle_osd_map epochs [228,228], i have 227, src has [1,228]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 227 handle_osd_map epochs [228,228], i have 228, src has [1,228]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:27.936317+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 228 ms_handle_reset con 0x55aabfbca400 session 0x55aac2ae9680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144539648 unmapped: 22396928 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2063303 data_alloc: 285212672 data_used: 12070912
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:28.936494+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4840400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 228 ms_handle_reset con 0x55aac4840400 session 0x55aabd707e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144539648 unmapped: 22396928 heap: 166936576 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec5400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 228 ms_handle_reset con 0x55aac2ec5400 session 0x55aabfc8d860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:29.936621+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 228 handle_osd_map epochs [229,229], i have 228, src has [1,229]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 229 handle_osd_map epochs [229,229], i have 229, src has [1,229]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302a000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 229 ms_handle_reset con 0x55aac302a000 session 0x55aabf909e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143360000 unmapped: 27779072 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 229 handle_osd_map epochs [229,230], i have 229, src has [1,230]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:30.936802+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 230 heartbeat osd_stat(store_statfs(0x1b38da000/0x0/0x1bfc00000, data 0x5cfe1bd/0x5e53000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 230 ms_handle_reset con 0x55aabe97f800 session 0x55aabf683e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143409152 unmapped: 27729920 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 230 ms_handle_reset con 0x55aabfbca400 session 0x55aabdb67e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:31.936955+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143278080 unmapped: 27860992 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:32.937167+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143278080 unmapped: 27860992 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2074610 data_alloc: 285212672 data_used: 12083200
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:33.937356+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 230 ms_handle_reset con 0x55aabfbca800 session 0x55aabfc8c960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143286272 unmapped: 27852800 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 230 ms_handle_reset con 0x55aabf993400 session 0x55aabfc8c1e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:34.937628+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143286272 unmapped: 27852800 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:35.937788+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143286272 unmapped: 27852800 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:36.937925+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 230 heartbeat osd_stat(store_statfs(0x1b48db000/0x0/0x1bfc00000, data 0x4cfbbb1/0x4e53000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 230 handle_osd_map epochs [231,231], i have 230, src has [1,231]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 230 handle_osd_map epochs [231,231], i have 231, src has [1,231]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.213954926s of 11.136080742s, submitted: 212
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125b400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aac125b400 session 0x55aabfa6f860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143294464 unmapped: 27844608 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:37.938087+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aabe97f800 session 0x55aabfa6e000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 148611072 unmapped: 22528000 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aabfbca400 session 0x55aabfa6ed20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aabf993400 session 0x55aabfc6de00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2155778 data_alloc: 285212672 data_used: 12095488
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:38.938229+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aabfbca800 session 0x55aabfc23680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac27b6800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aac27b6800 session 0x55aabfbae5a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143433728 unmapped: 27705344 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aabe97f800 session 0x55aabfa6e780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:39.938381+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aabf993400 session 0x55aabfc301e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143474688 unmapped: 27664384 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:40.938538+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aabfbca400 session 0x55aabdbb10e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143532032 unmapped: 27607040 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aabfbca800 session 0x55aabfb9cd20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:41.938686+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec2800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3028c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aac2ec2800 session 0x55aabfab72c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aac3028c00 session 0x55aabfb992c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143540224 unmapped: 27598848 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:42.938836+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b48d7000/0x0/0x1bfc00000, data 0x4cfe4a6/0x4e56000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143540224 unmapped: 27598848 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2093885 data_alloc: 285212672 data_used: 12095488
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:43.939047+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143540224 unmapped: 27598848 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:44.939196+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 ms_handle_reset con 0x55aabf993400 session 0x55aabfb9e000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 143564800 unmapped: 27574272 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 heartbeat osd_stat(store_statfs(0x1b48d5000/0x0/0x1bfc00000, data 0x4cfe644/0x4e59000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:45.939345+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 231 handle_osd_map epochs [232,232], i have 231, src has [1,232]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 232 handle_osd_map epochs [232,232], i have 232, src has [1,232]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 232 ms_handle_reset con 0x55aabe97f800 session 0x55aabfab1e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 232 ms_handle_reset con 0x55aabfbca400 session 0x55aabfc30000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144654336 unmapped: 26484736 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 232 ms_handle_reset con 0x55aabfbca800 session 0x55aabfc223c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 232 ms_handle_reset con 0x55aabe97f800 session 0x55aac1de63c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 232 ms_handle_reset con 0x55aabf993400 session 0x55aabfc6b2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:46.939499+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.003278732s of 10.002965927s, submitted: 227
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 232 ms_handle_reset con 0x55aabfbca400 session 0x55aabfc8c960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144711680 unmapped: 26427392 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:47.939644+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb48400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144744448 unmapped: 26394624 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2111349 data_alloc: 285212672 data_used: 12107776
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:48.939816+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 232 handle_osd_map epochs [233,233], i have 232, src has [1,233]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 233 handle_osd_map epochs [233,233], i have 233, src has [1,233]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabd260800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 233 ms_handle_reset con 0x55aabd260800 session 0x55aabfe84b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 233 heartbeat osd_stat(store_statfs(0x1b48cd000/0x0/0x1bfc00000, data 0x4d0112a/0x4e61000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97d800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 233 ms_handle_reset con 0x55aabfb48400 session 0x55aabf683e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144760832 unmapped: 26378240 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabd260800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 233 ms_handle_reset con 0x55aabd260800 session 0x55aac106b4a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:49.939966+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 233 handle_osd_map epochs [233,234], i have 233, src has [1,234]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 ms_handle_reset con 0x55aabe97d800 session 0x55aabfb9d2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 handle_osd_map epochs [234,234], i have 234, src has [1,234]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144785408 unmapped: 26353664 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 ms_handle_reset con 0x55aabe97f800 session 0x55aabfab1c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:50.940225+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 ms_handle_reset con 0x55aabf993400 session 0x55aabfc6a780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 heartbeat osd_stat(store_statfs(0x1b48c5000/0x0/0x1bfc00000, data 0x4d06451/0x4e67000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,1])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144809984 unmapped: 26329088 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:51.940431+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 ms_handle_reset con 0x55aabfbca400 session 0x55aac1de7c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabd260800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 heartbeat osd_stat(store_statfs(0x1b48c9000/0x0/0x1bfc00000, data 0x4d063de/0x4e65000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [1])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 ms_handle_reset con 0x55aabd260800 session 0x55aabfc310e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4840800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 ms_handle_reset con 0x55aac4840800 session 0x55aabfc303c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf992c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 heartbeat osd_stat(store_statfs(0x1b48c9000/0x0/0x1bfc00000, data 0x4d063de/0x4e65000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [0,0,1])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144883712 unmapped: 26255360 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 ms_handle_reset con 0x55aabf992c00 session 0x55aac27865a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:52.940622+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabd261800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 ms_handle_reset con 0x55aabd261800 session 0x55aac2786960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 ms_handle_reset con 0x55aabf993000 session 0x55aac2786f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144982016 unmapped: 26157056 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:53.940820+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2111668 data_alloc: 285212672 data_used: 12120064
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabd260800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 144998400 unmapped: 26140672 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 ms_handle_reset con 0x55aabd260800 session 0x55aabfc8cf00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:54.940963+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145014784 unmapped: 26124288 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:55.941091+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 heartbeat osd_stat(store_statfs(0x1b48ce000/0x0/0x1bfc00000, data 0x4d062d0/0x4e60000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145014784 unmapped: 26124288 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 234 handle_osd_map epochs [234,235], i have 234, src has [1,235]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:56.941219+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.305349350s of 10.147751808s, submitted: 191
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 235 handle_osd_map epochs [235,235], i have 235, src has [1,235]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145022976 unmapped: 26116096 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfce2800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 235 ms_handle_reset con 0x55aabfce2800 session 0x55aabfaa4f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:57.941345+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 235 handle_osd_map epochs [236,236], i have 235, src has [1,236]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145047552 unmapped: 26091520 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:58.941506+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2123694 data_alloc: 285212672 data_used: 12132352
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1260c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 236 ms_handle_reset con 0x55aac1260c00 session 0x55aabfb97860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 236 heartbeat osd_stat(store_statfs(0x1b48c2000/0x0/0x1bfc00000, data 0x4d0b6ec/0x4e6a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145088512 unmapped: 26050560 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T09:59:59.941689+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec5400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 236 ms_handle_reset con 0x55aac2ec5400 session 0x55aabfab6b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145162240 unmapped: 25976832 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:00.941856+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 236 ms_handle_reset con 0x55aabe97f800 session 0x55aabfb99680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145162240 unmapped: 25976832 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:01.941994+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145162240 unmapped: 25976832 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4840400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:02.942181+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 236 heartbeat osd_stat(store_statfs(0x1b48c3000/0x0/0x1bfc00000, data 0x4d0b6fc/0x4e6b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 236 handle_osd_map epochs [237,237], i have 236, src has [1,237]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 handle_osd_map epochs [237,237], i have 237, src has [1,237]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 ms_handle_reset con 0x55aac4840400 session 0x55aabfb99860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145170432 unmapped: 25968640 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:03.942381+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2130379 data_alloc: 285212672 data_used: 12148736
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145170432 unmapped: 25968640 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:04.942493+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302a000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 ms_handle_reset con 0x55aac302a000 session 0x55aac106af00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec3000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 ms_handle_reset con 0x55aac2ec3000 session 0x55aac106ad20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145186816 unmapped: 25952256 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:05.942659+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 heartbeat osd_stat(store_statfs(0x1b48b7000/0x0/0x1bfc00000, data 0x4d0e31b/0x4e76000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2537400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 ms_handle_reset con 0x55aac2537400 session 0x55aabfc225a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3028c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 ms_handle_reset con 0x55aac3028c00 session 0x55aabfab1680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2537400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145252352 unmapped: 25886720 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:06.942803+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 ms_handle_reset con 0x55aac2537400 session 0x55aabfa4e000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 heartbeat osd_stat(store_statfs(0x1b48bc000/0x0/0x1bfc00000, data 0x4d0e23d/0x4e72000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 handle_osd_map epochs [238,238], i have 237, src has [1,238]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 237 handle_osd_map epochs [238,238], i have 238, src has [1,238]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec3000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 238 ms_handle_reset con 0x55aac2ec3000 session 0x55aabfb98d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.754137039s of 10.272047997s, submitted: 135
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 238 ms_handle_reset con 0x55aabf993400 session 0x55aac1de7e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145358848 unmapped: 25780224 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:07.942928+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 238 handle_osd_map epochs [239,239], i have 238, src has [1,239]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac14cec00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 239 ms_handle_reset con 0x55aac14cec00 session 0x55aabfaa41e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabd261800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0577000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 239 handle_osd_map epochs [239,239], i have 239, src has [1,239]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145367040 unmapped: 25772032 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 239 ms_handle_reset con 0x55aac0577000 session 0x55aabfc30b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 239 heartbeat osd_stat(store_statfs(0x1b48b0000/0x0/0x1bfc00000, data 0x4d135c5/0x4e7c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:08.943078+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2146405 data_alloc: 285212672 data_used: 12165120
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 239 handle_osd_map epochs [239,240], i have 239, src has [1,240]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 ms_handle_reset con 0x55aabd261800 session 0x55aabfabf680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145383424 unmapped: 25755648 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:09.943243+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabf993400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 handle_osd_map epochs [240,240], i have 240, src has [1,240]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 ms_handle_reset con 0x55aabf993400 session 0x55aabe65d4a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac14cec00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2537400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 ms_handle_reset con 0x55aac2537400 session 0x55aabfc2a1e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145383424 unmapped: 25755648 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 ms_handle_reset con 0x55aac14cec00 session 0x55aabfc30780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:10.943359+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec3000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 ms_handle_reset con 0x55aac2ec3000 session 0x55aabfc6ad20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145408000 unmapped: 25731072 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:11.943513+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabd261800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 ms_handle_reset con 0x55aabd261800 session 0x55aabfc25e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145424384 unmapped: 25714688 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:12.943980+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145424384 unmapped: 25714688 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:13.944172+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2152595 data_alloc: 285212672 data_used: 12161024
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 heartbeat osd_stat(store_statfs(0x1b48ae000/0x0/0x1bfc00000, data 0x4d1623b/0x4e80000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0f74800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 ms_handle_reset con 0x55aac0f74800 session 0x55aabeaf52c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1cd1000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 ms_handle_reset con 0x55aac1cd1000 session 0x55aac2786960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2fe6800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145539072 unmapped: 25600000 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:14.944362+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 ms_handle_reset con 0x55aac2fe6800 session 0x55aabfc6b2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4841800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 ms_handle_reset con 0x55aac4841800 session 0x55aabfb9e000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1109] ------- DUMPING STATS -------
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl.cc:1111] 
                                                          ** DB Stats **
                                                          Uptime(secs): 8400.1 total, 600.0 interval
                                                          Cumulative writes: 16K writes, 64K keys, 16K commit groups, 1.0 writes per commit group, ingest: 0.05 GB, 0.01 MB/s
                                                          Cumulative WAL: 16K writes, 5709 syncs, 2.98 writes per sync, written: 0.05 GB, 0.01 MB/s
                                                          Cumulative stall: 00:00:0.000 H:M:S, 0.0 percent
                                                          Interval writes: 11K writes, 39K keys, 11K commit groups, 1.0 writes per commit group, ingest: 26.10 MB, 0.04 MB/s
                                                          Interval WAL: 11K writes, 4797 syncs, 2.34 writes per sync, written: 0.03 GB, 0.04 MB/s
                                                          Interval stall: 00:00:0.000 H:M:S, 0.0 percent
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145547264 unmapped: 25591808 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:15.944599+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145580032 unmapped: 25559040 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:16.944762+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 240 handle_osd_map epochs [241,241], i have 240, src has [1,241]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabd261800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 241 ms_handle_reset con 0x55aabd261800 session 0x55aabdbb10e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.654898643s of 10.197992325s, submitted: 155
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 145563648 unmapped: 25575424 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 241 handle_osd_map epochs [241,242], i have 241, src has [1,242]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:17.944931+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4841400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 146636800 unmapped: 24502272 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 242 ms_handle_reset con 0x55aac4841400 session 0x55aabfc301e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:18.945089+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2234734 data_alloc: 285212672 data_used: 12173312
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2fe6800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 242 ms_handle_reset con 0x55aac2fe6800 session 0x55aabfc6de00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4841800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 146677760 unmapped: 24461312 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:19.945250+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 242 heartbeat osd_stat(store_statfs(0x1b3ffe000/0x0/0x1bfc00000, data 0x55c0592/0x572e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x64cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 242 ms_handle_reset con 0x55aac4841800 session 0x55aabfa6e000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 146808832 unmapped: 24330240 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:20.945440+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 242 handle_osd_map epochs [242,243], i have 242, src has [1,243]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 243 handle_osd_map epochs [243,243], i have 243, src has [1,243]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 146817024 unmapped: 24322048 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:21.945626+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 146817024 unmapped: 24322048 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:22.945776+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 243 handle_osd_map epochs [243,244], i have 243, src has [1,244]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 244 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 244 handle_osd_map epochs [244,244], i have 244, src has [1,244]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 146825216 unmapped: 24313856 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:23.946015+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2181614 data_alloc: 285212672 data_used: 12185600
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 244 handle_osd_map epochs [243,244], i have 244, src has [1,244]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302bc00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 244 ms_handle_reset con 0x55aac302bc00 session 0x55aabfc8c3c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 146841600 unmapped: 24297472 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:24.946184+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 244 heartbeat osd_stat(store_statfs(0x1b449e000/0x0/0x1bfc00000, data 0x4d20a53/0x4e90000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 244 handle_osd_map epochs [245,245], i have 244, src has [1,245]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 146890752 unmapped: 24248320 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeafec00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 245 ms_handle_reset con 0x55aabeafec00 session 0x55aabfbf6960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:25.946432+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 245 handle_osd_map epochs [245,246], i have 245, src has [1,246]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 246 handle_osd_map epochs [246,246], i have 246, src has [1,246]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 146915328 unmapped: 24223744 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 246 handle_osd_map epochs [247,247], i have 246, src has [1,247]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain podman[236852]: @ - - [01/Feb/2026:10:09:00 +0000] "GET /v4.9.3/libpod/containers/stats?all=false&interval=1&stream=false HTTP/1.1" 200 18341 "" "Go-http-client/1.1"
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:26.946576+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec2c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 247 ms_handle_reset con 0x55aac2ec2c00 session 0x55aabfb990e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 247 handle_osd_map epochs [247,247], i have 247, src has [1,247]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.024949074s of 10.048226357s, submitted: 219
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 147030016 unmapped: 24109056 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:27.946752+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec3c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 247 ms_handle_reset con 0x55aac2ec3c00 session 0x55aac27870e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 147259392 unmapped: 23879680 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:28.946920+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2199206 data_alloc: 285212672 data_used: 12197888
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2399400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 247 handle_osd_map epochs [248,248], i have 247, src has [1,248]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 248 ms_handle_reset con 0x55aac2399400 session 0x55aabf9e4d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 148389888 unmapped: 22749184 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:29.947072+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 248 handle_osd_map epochs [248,248], i have 248, src has [1,248]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2fe6800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 248 heartbeat osd_stat(store_statfs(0x1b4450000/0x0/0x1bfc00000, data 0x4d6812c/0x4edc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4841c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 148488192 unmapped: 22650880 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:30.947244+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 248 ms_handle_reset con 0x55aac4841c00 session 0x55aabe5a6d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 248 ms_handle_reset con 0x55aac2fe6800 session 0x55aac1de65a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 248 handle_osd_map epochs [249,249], i have 248, src has [1,249]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeafec00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 249 ms_handle_reset con 0x55aabeafec00 session 0x55aabfb990e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 249 handle_osd_map epochs [249,250], i have 249, src has [1,250]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 148586496 unmapped: 22552576 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2399400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:31.947387+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 250 ms_handle_reset con 0x55aac2399400 session 0x55aabfc6de00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec2c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 250 ms_handle_reset con 0x55aac2ec2c00 session 0x55aabfc6b2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 148766720 unmapped: 22372352 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:32.947569+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 148766720 unmapped: 22372352 heap: 171139072 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:33.947765+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302ac00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0585000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2217798 data_alloc: 285212672 data_used: 12214272
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 250 heartbeat osd_stat(store_statfs(0x1b441c000/0x0/0x1bfc00000, data 0x4d975f1/0x4f12000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 174145536 unmapped: 13787136 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:34.947905+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166150144 unmapped: 21782528 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:35.948031+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166281216 unmapped: 21651456 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:36.948171+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 250 heartbeat osd_stat(store_statfs(0x1ad7e3000/0x0/0x1bfc00000, data 0xb9cf34e/0xbb4b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302b000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 250 ms_handle_reset con 0x55aac302b000 session 0x55aabfb9c3c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.144762993s of 10.002573967s, submitted: 228
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeafec00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:37.948352+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 154845184 unmapped: 33087488 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:38.948528+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 154861568 unmapped: 33071104 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 250 handle_osd_map epochs [250,251], i have 250, src has [1,251]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 3489626 data_alloc: 285212672 data_used: 12226560
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 251 ms_handle_reset con 0x55aabeafec00 session 0x55aac06d3a40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:39.948688+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 150790144 unmapped: 37142528 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2399400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 251 ms_handle_reset con 0x55aac2399400 session 0x55aac27863c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec2c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:40.948839+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 156270592 unmapped: 31662080 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 251 heartbeat osd_stat(store_statfs(0x1a3fb1000/0x0/0x1bfc00000, data 0x151ff662/0x1537d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [0,0,0,0,0,0,0,0,1])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 251 ms_handle_reset con 0x55aac2ec2c00 session 0x55aabe65cf00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:41.949020+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 152166400 unmapped: 35766272 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 251 ms_handle_reset con 0x55aac0242000 session 0x55aabfc33e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:42.949223+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 156499968 unmapped: 31432704 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfbca800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 251 ms_handle_reset con 0x55aabfbca800 session 0x55aac106af00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:43.949413+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 156524544 unmapped: 31408128 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 4806051 data_alloc: 285212672 data_used: 12222464
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 251 ms_handle_reset con 0x55aac1261c00 session 0x55aabfc23a40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 251 handle_osd_map epochs [251,252], i have 251, src has [1,252]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 252 handle_osd_map epochs [252,252], i have 252, src has [1,252]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:44.949557+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 157638656 unmapped: 30294016 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 252 heartbeat osd_stat(store_statfs(0x19bf8a000/0x0/0x1bfc00000, data 0x1d22715c/0x1d3a3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:45.949721+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166477824 unmapped: 21454848 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 252 handle_osd_map epochs [252,253], i have 252, src has [1,253]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 253 ms_handle_reset con 0x55aac302ac00 session 0x55aabfc25e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:46.949880+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 154042368 unmapped: 33890304 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 253 handle_osd_map epochs [253,253], i have 253, src has [1,253]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 253 ms_handle_reset con 0x55aac0585000 session 0x55aabfc6ad20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac148ac00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 253 ms_handle_reset con 0x55aac148ac00 session 0x55aac3f8c5a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac148a800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0590800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 6.449806213s of 10.003729820s, submitted: 427
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 253 ms_handle_reset con 0x55aac0590800 session 0x55aac3f8cb40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:47.950005+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 153264128 unmapped: 34668544 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 253 ms_handle_reset con 0x55aac148a800 session 0x55aac3f8c780
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0585000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 253 ms_handle_reset con 0x55aac0585000 session 0x55aabfabf680
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:48.950159+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 152911872 unmapped: 35020800 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2367024 data_alloc: 285212672 data_used: 12247040
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 253 handle_osd_map epochs [253,254], i have 253, src has [1,254]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:49.950344+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 152936448 unmapped: 34996224 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 254 handle_osd_map epochs [254,254], i have 254, src has [1,254]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 254 heartbeat osd_stat(store_statfs(0x1b1b3f000/0x0/0x1bfc00000, data 0x4e6dca4/0x4fec000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:50.950504+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 152936448 unmapped: 34996224 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 254 heartbeat osd_stat(store_statfs(0x1b1b3c000/0x0/0x1bfc00000, data 0x4e709c0/0x4fef000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 254 handle_osd_map epochs [255,255], i have 254, src has [1,255]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 255 handle_osd_map epochs [255,255], i have 255, src has [1,255]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 255 handle_osd_map epochs [255,256], i have 255, src has [1,256]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:51.950659+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 152936448 unmapped: 34996224 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1261c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 256 ms_handle_reset con 0x55aac1261c00 session 0x55aabfabf4a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 256 handle_osd_map epochs [256,256], i have 256, src has [1,256]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 256 handle_osd_map epochs [255,256], i have 256, src has [1,256]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302a400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 256 ms_handle_reset con 0x55aac302a400 session 0x55aabf9e4b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:52.950796+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 256 handle_osd_map epochs [257,257], i have 256, src has [1,257]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 152788992 unmapped: 35143680 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:53.951005+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 152846336 unmapped: 35086336 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2383518 data_alloc: 285212672 data_used: 12259328
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 257 handle_osd_map epochs [257,257], i have 257, src has [1,257]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2fe6800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 257 ms_handle_reset con 0x55aac2fe6800 session 0x55aabfc6d2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:54.951212+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 152854528 unmapped: 35078144 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 257 heartbeat osd_stat(store_statfs(0x1b42f3000/0x0/0x1bfc00000, data 0x4eb7ee2/0x503b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0585000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 257 ms_handle_reset con 0x55aac0585000 session 0x55aabfc6d4a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1261c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 257 handle_osd_map epochs [257,258], i have 257, src has [1,258]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 258 handle_osd_map epochs [258,258], i have 258, src has [1,258]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 258 ms_handle_reset con 0x55aac1261c00 session 0x55aac3980000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:55.951358+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 154034176 unmapped: 33898496 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac148a800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 258 heartbeat osd_stat(store_statfs(0x1b42c6000/0x0/0x1bfc00000, data 0x4ee24cf/0x5067000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:56.951528+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 258 handle_osd_map epochs [259,259], i have 258, src has [1,259]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 154025984 unmapped: 33906688 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 259 ms_handle_reset con 0x55aac148a800 session 0x55aabfc2be00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302a400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 259 handle_osd_map epochs [259,259], i have 259, src has [1,259]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 259 ms_handle_reset con 0x55aac302a400 session 0x55aabfc6ab40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.797972679s of 10.007996559s, submitted: 315
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:57.951682+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 153575424 unmapped: 34357248 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb49c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 259 ms_handle_reset con 0x55aabfb49c00 session 0x55aabdbcf860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb49c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 259 ms_handle_reset con 0x55aabfb49c00 session 0x55aabfe84960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:58.951824+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 153714688 unmapped: 34217984 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac0585000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 259 ms_handle_reset con 0x55aac0585000 session 0x55aabfab1e00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2400677 data_alloc: 285212672 data_used: 12275712
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1261c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:00:59.952014+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 259 handle_osd_map epochs [260,260], i have 259, src has [1,260]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 153853952 unmapped: 34078720 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 260 ms_handle_reset con 0x55aac1261c00 session 0x55aac1de70e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac148a800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 260 ms_handle_reset con 0x55aac148a800 session 0x55aabfc234a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac302a400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 260 handle_osd_map epochs [260,260], i have 260, src has [1,260]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 260 ms_handle_reset con 0x55aac302a400 session 0x55aabfc850e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:00.952165+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 260 heartbeat osd_stat(store_statfs(0x1b4259000/0x0/0x1bfc00000, data 0x4f4d64c/0x50d5000, compress 0x0/0x0/0x0, omap 0x649, meta 0x68cf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 152911872 unmapped: 35020800 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 260 handle_osd_map epochs [260,261], i have 260, src has [1,261]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:01.952349+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 152952832 unmapped: 34979840 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfb49c00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 261 handle_osd_map epochs [261,261], i have 261, src has [1,261]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 261 ms_handle_reset con 0x55aabfb49c00 session 0x55aabfb9d0e0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:02.952498+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 153116672 unmapped: 34816000 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 261 handle_osd_map epochs [262,262], i have 261, src has [1,262]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 262 handle_osd_map epochs [262,263], i have 262, src has [1,263]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:03.952657+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3026000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 263 handle_osd_map epochs [262,263], i have 263, src has [1,263]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 263 ms_handle_reset con 0x55aac3026000 session 0x55aabfc24d20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 154279936 unmapped: 33652736 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 263 handle_osd_map epochs [263,263], i have 263, src has [1,263]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2425523 data_alloc: 285212672 data_used: 12300288
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:04.952799+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 155492352 unmapped: 32440320 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabe97f800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 263 ms_handle_reset con 0x55aabe97f800 session 0x55aac06d2f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:05.952990+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 155492352 unmapped: 32440320 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabeafe800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 263 ms_handle_reset con 0x55aabeafe800 session 0x55aac27874a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabd260800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 263 heartbeat osd_stat(store_statfs(0x1b3de5000/0x0/0x1bfc00000, data 0x4fb8e0b/0x5146000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 263 ms_handle_reset con 0x55aabd260800 session 0x55aabfb9d2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:06.953216+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 263 handle_osd_map epochs [264,264], i have 263, src has [1,264]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 264 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 155516928 unmapped: 32415744 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 264 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 264 handle_osd_map epochs [264,264], i have 264, src has [1,264]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2536800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 264 ms_handle_reset con 0x55aac2536800 session 0x55aac3f8da40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac4841800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.862019539s of 10.000491142s, submitted: 302
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 264 handle_osd_map epochs [264,265], i have 264, src has [1,265]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 265 handle_osd_map epochs [265,265], i have 265, src has [1,265]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:07.953353+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 265 handle_osd_map epochs [265,265], i have 265, src has [1,265]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 265 ms_handle_reset con 0x55aac4841800 session 0x55aabfc6cf00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 155574272 unmapped: 32358400 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 265 handle_osd_map epochs [265,265], i have 265, src has [1,265]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:08.953496+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 265 handle_osd_map epochs [266,266], i have 265, src has [1,266]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 266 handle_osd_map epochs [266,266], i have 266, src has [1,266]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3027000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 155598848 unmapped: 32333824 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 266 ms_handle_reset con 0x55aac3027000 session 0x55aabdbb0f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2437281 data_alloc: 285212672 data_used: 12312576
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:09.953650+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 155664384 unmapped: 32268288 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 266 handle_osd_map epochs [267,267], i have 266, src has [1,267]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aabfce2800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 267 handle_osd_map epochs [267,267], i have 267, src has [1,267]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 267 ms_handle_reset con 0x55aabfce2800 session 0x55aabfc32b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:10.953810+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 267 heartbeat osd_stat(store_statfs(0x1b3d4a000/0x0/0x1bfc00000, data 0x5053420/0x51e3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 157147136 unmapped: 30785536 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:11.953958+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 157163520 unmapped: 30769152 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 267 handle_osd_map epochs [268,268], i have 267, src has [1,268]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 268 handle_osd_map epochs [268,268], i have 268, src has [1,268]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:12.954110+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 157597696 unmapped: 30334976 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 268 handle_osd_map epochs [269,269], i have 268, src has [1,269]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:13.954339+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 158662656 unmapped: 29270016 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2453965 data_alloc: 285212672 data_used: 12324864
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 269 handle_osd_map epochs [269,269], i have 269, src has [1,269]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:14.954485+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 158670848 unmapped: 29261824 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 269 handle_osd_map epochs [269,269], i have 269, src has [1,269]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:15.954645+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 158703616 unmapped: 29229056 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 269 heartbeat osd_stat(store_statfs(0x1b3cef000/0x0/0x1bfc00000, data 0x50af2bb/0x523f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:16.954828+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 269 handle_osd_map epochs [270,270], i have 269, src has [1,270]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 269 handle_osd_map epochs [270,270], i have 270, src has [1,270]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1a] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 158859264 unmapped: 29073408 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.0] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[3.1d] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 270 handle_osd_map epochs [270,270], i have 270, src has [1,270]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 8.950061798s of 10.006869316s, submitted: 292
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:17.955019+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 158859264 unmapped: 29073408 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 270 handle_osd_map epochs [271,271], i have 270, src has [1,271]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 271 handle_osd_map epochs [271,271], i have 271, src has [1,271]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:18.955179+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 158220288 unmapped: 29712384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2468645 data_alloc: 285212672 data_used: 12337152
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:19.955372+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 158294016 unmapped: 29638656 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 271 handle_osd_map epochs [272,272], i have 271, src has [1,272]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 272 heartbeat osd_stat(store_statfs(0x1b3c82000/0x0/0x1bfc00000, data 0x511743c/0x52ab000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:20.955536+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 159342592 unmapped: 28590080 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:21.955679+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 159752192 unmapped: 28180480 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 272 handle_osd_map epochs [273,273], i have 272, src has [1,273]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 273 handle_osd_map epochs [273,273], i have 273, src has [1,273]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:22.955839+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 159760384 unmapped: 28172288 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b3c52000/0x0/0x1bfc00000, data 0x514656c/0x52dc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:23.956035+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 159760384 unmapped: 28172288 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2476735 data_alloc: 285212672 data_used: 12349440
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:24.956195+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b3c47000/0x0/0x1bfc00000, data 0x5150d47/0x52e7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 159850496 unmapped: 28082176 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 61
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:25.956382+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160063488 unmapped: 27869184 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 273 heartbeat osd_stat(store_statfs(0x1b3c13000/0x0/0x1bfc00000, data 0x5182a77/0x531b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 273 handle_osd_map epochs [273,274], i have 273, src has [1,274]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:26.956552+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160071680 unmapped: 27860992 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.602342606s of 10.004695892s, submitted: 138
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:27.956699+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160153600 unmapped: 27779072 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:28.956850+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160153600 unmapped: 27779072 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2487679 data_alloc: 285212672 data_used: 12365824
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 274 handle_osd_map epochs [274,274], i have 274, src has [1,274]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac1cd1000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:29.956985+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160251904 unmapped: 27680768 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:30.957153+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160350208 unmapped: 27582464 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 274 heartbeat osd_stat(store_statfs(0x1b3be7000/0x0/0x1bfc00000, data 0x51ac511/0x5347000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:31.957338+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160350208 unmapped: 27582464 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:32.957497+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 274 heartbeat osd_stat(store_statfs(0x1b3be7000/0x0/0x1bfc00000, data 0x51ac576/0x5347000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160350208 unmapped: 27582464 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:33.957703+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160350208 unmapped: 27582464 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2491631 data_alloc: 285212672 data_used: 12365824
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:34.957862+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160350208 unmapped: 27582464 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:35.957988+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160350208 unmapped: 27582464 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 274 heartbeat osd_stat(store_statfs(0x1b3bc6000/0x0/0x1bfc00000, data 0x51cdc20/0x5368000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:36.958160+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160464896 unmapped: 27467776 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:37.958366+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160473088 unmapped: 27459584 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:38.958523+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 160473088 unmapped: 27459584 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.528889656s of 11.617852211s, submitted: 22
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2494003 data_alloc: 285212672 data_used: 12365824
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 274 heartbeat osd_stat(store_statfs(0x1b3bc6000/0x0/0x1bfc00000, data 0x51cdc20/0x5368000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac3026000
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:39.958653+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 161521664 unmapped: 26411008 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:40.958862+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 161521664 unmapped: 26411008 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:41.959054+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 161521664 unmapped: 26411008 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:42.959242+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 161546240 unmapped: 26386432 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:43.959429+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 161546240 unmapped: 26386432 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2502137 data_alloc: 285212672 data_used: 12369920
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:44.959629+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 161734656 unmapped: 26198016 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 274 heartbeat osd_stat(store_statfs(0x1b3b63000/0x0/0x1bfc00000, data 0x522ce0d/0x53cb000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:45.959800+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162144256 unmapped: 25788416 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:46.959954+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162357248 unmapped: 25575424 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:47.960091+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162480128 unmapped: 25452544 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:48.960230+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162701312 unmapped: 25231360 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.649950027s of 10.004927635s, submitted: 74
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2509265 data_alloc: 285212672 data_used: 12365824
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:49.960426+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 163766272 unmapped: 24166400 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:50.960602+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 163766272 unmapped: 24166400 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 274 heartbeat osd_stat(store_statfs(0x1b3ac3000/0x0/0x1bfc00000, data 0x52d0fbf/0x546b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:51.960722+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162758656 unmapped: 25174016 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:52.960846+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162758656 unmapped: 25174016 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:53.961095+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162766848 unmapped: 25165824 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2516313 data_alloc: 285212672 data_used: 12365824
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:54.961272+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 274 handle_osd_map epochs [274,275], i have 274, src has [1,275]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162471936 unmapped: 25460736 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:55.961515+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162480128 unmapped: 25452544 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:56.961671+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 275 heartbeat osd_stat(store_statfs(0x1b3a61000/0x0/0x1bfc00000, data 0x5331d38/0x54cc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162480128 unmapped: 25452544 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 275 heartbeat osd_stat(store_statfs(0x1b3a61000/0x0/0x1bfc00000, data 0x5331d38/0x54cc000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:57.961951+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162578432 unmapped: 25354240 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:58.962134+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162717696 unmapped: 25214976 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.618753433s of 10.001223564s, submitted: 75
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2520567 data_alloc: 285212672 data_used: 12378112
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:01:59.962419+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162717696 unmapped: 25214976 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:00.962626+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 275 heartbeat osd_stat(store_statfs(0x1b3a45000/0x0/0x1bfc00000, data 0x534e777/0x54e9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162775040 unmapped: 25157632 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:01.962746+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 275 handle_osd_map epochs [275,276], i have 275, src has [1,276]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 276 handle_osd_map epochs [276,276], i have 276, src has [1,276]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162783232 unmapped: 25149440 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:02.962974+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162783232 unmapped: 25149440 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:03.963212+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 162783232 unmapped: 25149440 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 276 handle_osd_map epochs [276,276], i have 276, src has [1,276]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2530157 data_alloc: 285212672 data_used: 12390400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:04.963380+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 163946496 unmapped: 23986176 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 276 heartbeat osd_stat(store_statfs(0x1b39f0000/0x0/0x1bfc00000, data 0x53a0c2e/0x553e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:05.963566+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 163946496 unmapped: 23986176 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:06.963759+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 164126720 unmapped: 23805952 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:07.963986+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 164126720 unmapped: 23805952 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:08.964196+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 164126720 unmapped: 23805952 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.873770714s of 10.005140305s, submitted: 37
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2529613 data_alloc: 285212672 data_used: 12390400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:09.964394+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 276 heartbeat osd_stat(store_statfs(0x1b39f0000/0x0/0x1bfc00000, data 0x53a0c2e/0x553e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x6ccf9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 164126720 unmapped: 23805952 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:10.964624+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 164143104 unmapped: 23789568 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:11.964790+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 164487168 unmapped: 23445504 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:12.964992+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 164610048 unmapped: 23322624 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:13.965233+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 164610048 unmapped: 23322624 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 276 handle_osd_map epochs [276,277], i have 276, src has [1,277]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2553383 data_alloc: 285212672 data_used: 12406784
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:14.965406+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 277 heartbeat osd_stat(store_statfs(0x1b273c000/0x0/0x1bfc00000, data 0x54b1527/0x5651000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 165675008 unmapped: 22257664 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 277 handle_osd_map epochs [278,278], i have 277, src has [1,278]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 278 handle_osd_map epochs [278,278], i have 278, src has [1,278]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:15.965575+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 165797888 unmapped: 22134784 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:16.965790+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 165806080 unmapped: 22126592 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 278 heartbeat osd_stat(store_statfs(0x1b270d000/0x0/0x1bfc00000, data 0x54e0f3c/0x567f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:17.966042+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 165961728 unmapped: 21970944 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:18.966204+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166076416 unmapped: 21856256 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.315005302s of 10.029669762s, submitted: 173
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2563805 data_alloc: 285212672 data_used: 12414976
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:19.966393+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166223872 unmapped: 21708800 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:20.966561+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166223872 unmapped: 21708800 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:21.966725+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 278 handle_osd_map epochs [278,279], i have 278, src has [1,279]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 279 handle_osd_map epochs [279,279], i have 279, src has [1,279]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b268b000/0x0/0x1bfc00000, data 0x5563222/0x5702000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:22.966938+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:23.967177+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2562813 data_alloc: 285212672 data_used: 12427264
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:24.973664+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b2671000/0x0/0x1bfc00000, data 0x557ca58/0x571b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:25.973875+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:26.974060+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:27.974338+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:28.974483+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166420480 unmapped: 21512192 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2560093 data_alloc: 285212672 data_used: 12427264
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:29.974713+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166420480 unmapped: 21512192 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:30.974915+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b2672000/0x0/0x1bfc00000, data 0x557cabd/0x571b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166420480 unmapped: 21512192 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 12.211768150s of 12.372990608s, submitted: 42
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:31.975046+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166420480 unmapped: 21512192 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b2673000/0x0/0x1bfc00000, data 0x557cb22/0x571b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:32.975256+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b2672000/0x0/0x1bfc00000, data 0x557cb55/0x571b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:33.986040+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2560445 data_alloc: 285212672 data_used: 12427264
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:34.986365+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:35.986528+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:36.986718+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166412288 unmapped: 21520384 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:37.986879+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 279 heartbeat osd_stat(store_statfs(0x1b266f000/0x0/0x1bfc00000, data 0x557cd4e/0x571e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166420480 unmapped: 21512192 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:38.987108+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166420480 unmapped: 21512192 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2564243 data_alloc: 285212672 data_used: 12427264
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:39.987276+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166002688 unmapped: 21929984 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:40.987479+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 279 handle_osd_map epochs [280,280], i have 279, src has [1,280]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166002688 unmapped: 21929984 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 280 heartbeat osd_stat(store_statfs(0x1b2671000/0x0/0x1bfc00000, data 0x557cde2/0x571d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.854620934s of 10.008685112s, submitted: 46
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:41.987657+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166010880 unmapped: 21921792 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 280 heartbeat osd_stat(store_statfs(0x1b266c000/0x0/0x1bfc00000, data 0x557f86b/0x5721000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:42.987807+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166010880 unmapped: 21921792 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:43.987990+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166010880 unmapped: 21921792 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2568445 data_alloc: 285212672 data_used: 12439552
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:44.988156+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166010880 unmapped: 21921792 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:45.988355+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 280 heartbeat osd_stat(store_statfs(0x1b266c000/0x0/0x1bfc00000, data 0x557f89d/0x5721000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166010880 unmapped: 21921792 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:46.988526+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 280 handle_osd_map epochs [280,281], i have 280, src has [1,281]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166027264 unmapped: 21905408 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:47.988730+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166027264 unmapped: 21905408 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 281 handle_osd_map epochs [281,281], i have 281, src has [1,281]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:48.988911+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 281 heartbeat osd_stat(store_statfs(0x1b2669000/0x0/0x1bfc00000, data 0x558215c/0x5723000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166027264 unmapped: 21905408 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2569795 data_alloc: 285212672 data_used: 12451840
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:49.989103+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166027264 unmapped: 21905408 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 281 heartbeat osd_stat(store_statfs(0x1b266a000/0x0/0x1bfc00000, data 0x55821f0/0x5723000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:50.989282+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166027264 unmapped: 21905408 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:51.989517+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166027264 unmapped: 21905408 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:52.989657+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.118924141s of 11.311182022s, submitted: 40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166027264 unmapped: 21905408 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:53.989866+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166027264 unmapped: 21905408 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2571211 data_alloc: 285212672 data_used: 12451840
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:54.990041+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166027264 unmapped: 21905408 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:55.990198+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 281 heartbeat osd_stat(store_statfs(0x1b2669000/0x0/0x1bfc00000, data 0x558225c/0x5724000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166043648 unmapped: 21889024 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:56.990354+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166043648 unmapped: 21889024 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 281 heartbeat osd_stat(store_statfs(0x1b2669000/0x0/0x1bfc00000, data 0x5582326/0x5724000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:57.990539+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166043648 unmapped: 21889024 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:58.990692+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166043648 unmapped: 21889024 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2570313 data_alloc: 285212672 data_used: 12451840
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:02:59.990918+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 281 handle_osd_map epochs [281,282], i have 281, src has [1,282]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166051840 unmapped: 21880832 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 282 heartbeat osd_stat(store_statfs(0x1b2669000/0x0/0x1bfc00000, data 0x55822f8/0x5724000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:00.991118+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166051840 unmapped: 21880832 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 282 heartbeat osd_stat(store_statfs(0x1b2666000/0x0/0x1bfc00000, data 0x5584c7f/0x5728000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:01.991335+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166051840 unmapped: 21880832 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:02.991484+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166051840 unmapped: 21880832 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.428084373s of 10.706546783s, submitted: 53
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:03.991727+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166051840 unmapped: 21880832 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2573827 data_alloc: 285212672 data_used: 12464128
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:04.991944+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 282 handle_osd_map epochs [282,283], i have 282, src has [1,283]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166060032 unmapped: 21872640 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:05.992105+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 283 handle_osd_map epochs [283,283], i have 283, src has [1,283]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166060032 unmapped: 21872640 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:06.992246+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166060032 unmapped: 21872640 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 283 heartbeat osd_stat(store_statfs(0x1b2660000/0x0/0x1bfc00000, data 0x5587735/0x572c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 283 handle_osd_map epochs [284,284], i have 283, src has [1,284]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 283 handle_osd_map epochs [284,284], i have 284, src has [1,284]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 283 handle_osd_map epochs [284,284], i have 284, src has [1,284]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 283 handle_osd_map epochs [284,284], i have 284, src has [1,284]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 283 handle_osd_map epochs [284,284], i have 284, src has [1,284]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:07.992481+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166076416 unmapped: 21856256 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:08.992655+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166076416 unmapped: 21856256 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2586623 data_alloc: 285212672 data_used: 12476416
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:09.992838+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166076416 unmapped: 21856256 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:10.993045+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 284 heartbeat osd_stat(store_statfs(0x1b265b000/0x0/0x1bfc00000, data 0x558a287/0x5733000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166084608 unmapped: 21848064 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:11.993207+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 284 handle_osd_map epochs [285,285], i have 284, src has [1,285]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 166100992 unmapped: 21831680 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 285 handle_osd_map epochs [285,285], i have 285, src has [1,285]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:12.993359+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 285 handle_osd_map epochs [285,286], i have 285, src has [1,286]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167157760 unmapped: 20774912 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:13.993616+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 286 heartbeat osd_stat(store_statfs(0x1b2651000/0x0/0x1bfc00000, data 0x558f3e0/0x573b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.165566444s of 10.561065674s, submitted: 118
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167174144 unmapped: 20758528 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2595057 data_alloc: 285212672 data_used: 12500992
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:14.993883+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167182336 unmapped: 20750336 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:15.994067+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167190528 unmapped: 20742144 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:16.994266+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 286 heartbeat osd_stat(store_statfs(0x1b264f000/0x0/0x1bfc00000, data 0x558f71b/0x573e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167206912 unmapped: 20725760 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:17.994439+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167215104 unmapped: 20717568 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:18.994631+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167215104 unmapped: 20717568 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2596495 data_alloc: 285212672 data_used: 12500992
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:19.994816+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 286 handle_osd_map epochs [287,287], i have 286, src has [1,287]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167247872 unmapped: 20684800 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 287 handle_osd_map epochs [287,287], i have 287, src has [1,287]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:20.995011+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167256064 unmapped: 20676608 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:21.995200+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 287 handle_osd_map epochs [287,288], i have 287, src has [1,288]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 288 heartbeat osd_stat(store_statfs(0x1b264d000/0x0/0x1bfc00000, data 0x559208f/0x573f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167272448 unmapped: 20660224 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:22.995384+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167272448 unmapped: 20660224 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:23.995658+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.810704231s of 10.122360229s, submitted: 87
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167272448 unmapped: 20660224 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2605661 data_alloc: 285212672 data_used: 12525568
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:24.995873+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167272448 unmapped: 20660224 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:25.996186+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167288832 unmapped: 20643840 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 288 heartbeat osd_stat(store_statfs(0x1b2649000/0x0/0x1bfc00000, data 0x55949af/0x5744000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:26.996377+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 288 handle_osd_map epochs [289,289], i have 288, src has [1,289]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167321600 unmapped: 20611072 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 289 handle_osd_map epochs [289,289], i have 289, src has [1,289]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:27.996518+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 289 handle_osd_map epochs [289,290], i have 289, src has [1,290]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 290 heartbeat osd_stat(store_statfs(0x1b2646000/0x0/0x1bfc00000, data 0x5597147/0x5746000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167346176 unmapped: 20586496 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:28.996699+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167354368 unmapped: 20578304 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2614429 data_alloc: 285212672 data_used: 12550144
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:29.996922+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 290 handle_osd_map epochs [290,291], i have 290, src has [1,291]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167395328 unmapped: 20537344 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:30.997122+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167395328 unmapped: 20537344 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:31.997317+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 291 heartbeat osd_stat(store_statfs(0x1b263f000/0x0/0x1bfc00000, data 0x559c8a5/0x574f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x7e6f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167395328 unmapped: 20537344 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:32.997504+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167403520 unmapped: 20529152 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:33.997674+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167403520 unmapped: 20529152 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2615061 data_alloc: 285212672 data_used: 12550144
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.939945221s of 10.341937065s, submitted: 106
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:34.997863+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 291 heartbeat osd_stat(store_statfs(0x1b223f000/0x0/0x1bfc00000, data 0x559c8d9/0x574f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x826f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167403520 unmapped: 20529152 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:35.998009+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167428096 unmapped: 20504576 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:36.998164+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 291 heartbeat osd_stat(store_statfs(0x1b223c000/0x0/0x1bfc00000, data 0x559ca63/0x574f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x826f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 291 handle_osd_map epochs [292,292], i have 291, src has [1,292]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 291 handle_osd_map epochs [292,292], i have 292, src has [1,292]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167444480 unmapped: 20488192 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:37.998337+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167444480 unmapped: 20488192 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:38.998490+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 167510016 unmapped: 20422656 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2629797 data_alloc: 285212672 data_used: 12562432
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:39.998630+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 292 handle_osd_map epochs [292,293], i have 292, src has [1,293]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 293 heartbeat osd_stat(store_statfs(0x1b2214000/0x0/0x1bfc00000, data 0x55c1caa/0x5777000, compress 0x0/0x0/0x0, omap 0x649, meta 0x826f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 293 heartbeat osd_stat(store_statfs(0x1b2214000/0x0/0x1bfc00000, data 0x55c1caa/0x5777000, compress 0x0/0x0/0x0, omap 0x649, meta 0x826f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 168566784 unmapped: 19365888 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:40.998754+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 168869888 unmapped: 19062784 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:41.998899+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 169533440 unmapped: 18399232 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:42.999069+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 170065920 unmapped: 17866752 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:43.999242+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [L] New memtable created with log file: #45. Immutable memtables: 2.
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 172048384 unmapped: 15884288 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2647853 data_alloc: 285212672 data_used: 12574720
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.453391075s of 10.010723114s, submitted: 126
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:44.999391+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 172056576 unmapped: 15876096 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 293 heartbeat osd_stat(store_statfs(0x1b0f15000/0x0/0x1bfc00000, data 0x572327a/0x58d9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:45.999553+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 171458560 unmapped: 16474112 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:46.999730+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 293 handle_osd_map epochs [293,294], i have 293, src has [1,294]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 171466752 unmapped: 16465920 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:47.999878+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 172515328 unmapped: 15417344 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:49.000041+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 handle_osd_map epochs [294,294], i have 294, src has [1,294]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 173989888 unmapped: 13942784 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2665349 data_alloc: 285212672 data_used: 12587008
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:50.000196+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b0e3b000/0x0/0x1bfc00000, data 0x57fb269/0x59b2000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 173989888 unmapped: 13942784 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:51.000334+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b0e3c000/0x0/0x1bfc00000, data 0x57fb207/0x59b1000, compress 0x0/0x0/0x0, omap 0x649, meta 0x940f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 ms_handle_reset con 0x55aac1cd1000 session 0x55aabfc33860
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 ms_handle_reset con 0x55aac3026000 session 0x55aabfc334a0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 175128576 unmapped: 12804096 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:52.000528+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 62
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 177135616 unmapped: 10797056 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:53.000727+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 177135616 unmapped: 10797056 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:54.000947+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 63
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 176947200 unmapped: 10985472 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.384486198s of 10.005270958s, submitted: 540
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2683011 data_alloc: 285212672 data_used: 12587008
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:55.001128+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 178036736 unmapped: 9895936 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:56.001272+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 178044928 unmapped: 9887744 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:57.001457+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1ccf000/0x0/0x1bfc00000, data 0x5966907/0x5b1d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 178044928 unmapped: 9887744 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:58.001592+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 176824320 unmapped: 11108352 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:03:59.001773+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 176939008 unmapped: 10993664 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2702719 data_alloc: 285212672 data_used: 12587008
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:00.001957+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1c16000/0x0/0x1bfc00000, data 0x5a20683/0x5bd7000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 176939008 unmapped: 10993664 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:01.002144+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 178651136 unmapped: 9281536 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:02.002340+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 178733056 unmapped: 9199616 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:03.002525+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1b46000/0x0/0x1bfc00000, data 0x5aee778/0x5ca6000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 178896896 unmapped: 9035776 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:04.002828+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 179159040 unmapped: 8773632 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2704373 data_alloc: 285212672 data_used: 12587008
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:05.003016+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.661733627s of 10.156461716s, submitted: 104
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 180273152 unmapped: 7659520 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:06.003393+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 180469760 unmapped: 7462912 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:07.003587+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 180953088 unmapped: 6979584 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:08.003770+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 180502528 unmapped: 7430144 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:09.003903+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1a75000/0x0/0x1bfc00000, data 0x5bc22f2/0x5d79000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [0,0,0,0,2])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 181690368 unmapped: 6242304 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2727449 data_alloc: 285212672 data_used: 12587008
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:10.004089+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 180822016 unmapped: 7110656 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:11.004257+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b19a9000/0x0/0x1bfc00000, data 0x5c8eb47/0x5e45000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 180838400 unmapped: 7094272 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:12.004408+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 180854784 unmapped: 7077888 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:13.004581+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 181248000 unmapped: 6684672 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b193b000/0x0/0x1bfc00000, data 0x5cfccc1/0x5eb3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:14.004737+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 181248000 unmapped: 6684672 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2729959 data_alloc: 285212672 data_used: 12587008
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:15.004899+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.871169090s of 10.339767456s, submitted: 104
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 181403648 unmapped: 6529024 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:16.005040+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 181624832 unmapped: 6307840 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b18b2000/0x0/0x1bfc00000, data 0x5d86754/0x5f3c000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:17.005221+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 181641216 unmapped: 6291456 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:18.005389+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b18af000/0x0/0x1bfc00000, data 0x5d88da6/0x5f3f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182689792 unmapped: 5242880 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:19.005594+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182779904 unmapped: 5152768 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2734511 data_alloc: 285212672 data_used: 12587008
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:20.005767+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182779904 unmapped: 5152768 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:21.005941+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182779904 unmapped: 5152768 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:22.006084+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1894000/0x0/0x1bfc00000, data 0x5da4d4c/0x5f5a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182779904 unmapped: 5152768 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:23.006261+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182779904 unmapped: 5152768 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:24.006493+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182788096 unmapped: 5144576 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2734623 data_alloc: 285212672 data_used: 12587008
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:25.006645+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182804480 unmapped: 5128192 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.267455101s of 10.529454231s, submitted: 48
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:26.006804+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1893000/0x0/0x1bfc00000, data 0x5da4db1/0x5f5a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182812672 unmapped: 5120000 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:27.006953+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182812672 unmapped: 5120000 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:28.007140+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182820864 unmapped: 5111808 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:29.007377+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182820864 unmapped: 5111808 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2735525 data_alloc: 285212672 data_used: 12587008
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:30.007541+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1893000/0x0/0x1bfc00000, data 0x5da4f12/0x5f5a000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182820864 unmapped: 5111808 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:31.007677+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182820864 unmapped: 5111808 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:32.007823+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182853632 unmapped: 5079040 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:33.008005+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182853632 unmapped: 5079040 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:34.008208+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182853632 unmapped: 5079040 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:35.008379+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2737715 data_alloc: 285212672 data_used: 12587008
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182861824 unmapped: 5070848 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.873971939s of 10.034460068s, submitted: 31
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:36.008531+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 heartbeat osd_stat(store_statfs(0x1b1891000/0x0/0x1bfc00000, data 0x5da50a6/0x5f5b000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182861824 unmapped: 5070848 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:37.008781+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182861824 unmapped: 5070848 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:38.009095+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182870016 unmapped: 5062656 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:39.009265+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 294 handle_osd_map epochs [295,295], i have 294, src has [1,295]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182870016 unmapped: 5062656 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 295 handle_osd_map epochs [295,295], i have 295, src has [1,295]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 295 handle_osd_map epochs [295,295], i have 295, src has [1,295]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:40.009440+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2741939 data_alloc: 285212672 data_used: 12599296
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182902784 unmapped: 5029888 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:41.009705+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182902784 unmapped: 5029888 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:42.009859+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 295 heartbeat osd_stat(store_statfs(0x1b188f000/0x0/0x1bfc00000, data 0x5da7bf1/0x5f5f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 295 heartbeat osd_stat(store_statfs(0x1b186e000/0x0/0x1bfc00000, data 0x5dc77dd/0x5f7f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182902784 unmapped: 5029888 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:43.009993+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 295 heartbeat osd_stat(store_statfs(0x1b186e000/0x0/0x1bfc00000, data 0x5dc77dd/0x5f7f000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 182902784 unmapped: 5029888 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:44.010165+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 183017472 unmapped: 4915200 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:45.010340+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2749749 data_alloc: 285212672 data_used: 12599296
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 183214080 unmapped: 4718592 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:46.010509+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 183361536 unmapped: 4571136 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:47.010666+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 295 handle_osd_map epochs [296,296], i have 295, src has [1,296]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 11.336016655s of 11.649600983s, submitted: 82
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.13] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.11] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.18] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.1b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[6.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 296 handle_osd_map epochs [296,296], i have 296, src has [1,296]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 183369728 unmapped: 4562944 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:48.010868+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 184655872 unmapped: 3276800 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 296 heartbeat osd_stat(store_statfs(0x1b1798000/0x0/0x1bfc00000, data 0x5e99857/0x6055000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:49.011022+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 184786944 unmapped: 3145728 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:50.011265+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2768563 data_alloc: 285212672 data_used: 12611584
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 184786944 unmapped: 3145728 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:51.011452+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 296 heartbeat osd_stat(store_statfs(0x1b173b000/0x0/0x1bfc00000, data 0x5ef8290/0x60b3000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 184786944 unmapped: 3145728 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:52.011622+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 185081856 unmapped: 2850816 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:53.011778+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:54.011998+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 185221120 unmapped: 2711552 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:55.012187+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 185327616 unmapped: 2605056 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2775363 data_alloc: 285212672 data_used: 12611584
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:56.012373+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 185327616 unmapped: 2605056 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:57.012549+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 185499648 unmapped: 2433024 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 296 heartbeat osd_stat(store_statfs(0x1b16ca000/0x0/0x1bfc00000, data 0x5f67795/0x6124000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:58.012726+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 185499648 unmapped: 2433024 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.990136147s of 11.342629433s, submitted: 76
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:04:59.012805+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 185679872 unmapped: 2252800 heap: 187932672 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:00.012944+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 186834944 unmapped: 2146304 heap: 188981248 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2791517 data_alloc: 285212672 data_used: 12611584
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:01.013078+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 186834944 unmapped: 2146304 heap: 188981248 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 296 heartbeat osd_stat(store_statfs(0x1b15bd000/0x0/0x1bfc00000, data 0x6074d12/0x6230000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:02.013258+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187179008 unmapped: 1802240 heap: 188981248 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:03.013419+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187179008 unmapped: 1802240 heap: 188981248 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:04.013636+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187179008 unmapped: 1802240 heap: 188981248 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 296 heartbeat osd_stat(store_statfs(0x1b1599000/0x0/0x1bfc00000, data 0x609aa55/0x6255000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:05.013771+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187187200 unmapped: 1794048 heap: 188981248 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2789035 data_alloc: 285212672 data_used: 12611584
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 296 heartbeat osd_stat(store_statfs(0x1b1545000/0x0/0x1bfc00000, data 0x60eeebd/0x62a9000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:06.013930+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187187200 unmapped: 1794048 heap: 188981248 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:07.014085+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187432960 unmapped: 1548288 heap: 188981248 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac148a800
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:08.014249+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 194682880 unmapped: 11083776 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.573113441s of 10.008808136s, submitted: 85
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:09.014360+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 186318848 unmapped: 19447808 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _renew_subs
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 296 handle_osd_map epochs [297,297], i have 296, src has [1,297]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 297 ms_handle_reset con 0x55aac148a800 session 0x55aabeaf5c20
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2537400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 297 handle_osd_map epochs [297,297], i have 297, src has [1,297]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:10.014661+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187506688 unmapped: 18259968 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2857875 data_alloc: 285212672 data_used: 12623872
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 297 handle_osd_map epochs [297,298], i have 297, src has [1,298]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 298 ms_handle_reset con 0x55aac2537400 session 0x55aabfb98b40
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:11.014925+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187703296 unmapped: 18063360 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 298 heartbeat osd_stat(store_statfs(0x1b14a7000/0x0/0x1bfc00000, data 0x618a62c/0x6345000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:12.015140+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187744256 unmapped: 18022400 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 298 heartbeat osd_stat(store_statfs(0x1b14a7000/0x0/0x1bfc00000, data 0x618a62c/0x6345000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:13.015304+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187760640 unmapped: 18006016 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:14.015534+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187777024 unmapped: 17989632 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:15.015749+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187777024 unmapped: 17989632 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2802193 data_alloc: 285212672 data_used: 12623872
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:16.015903+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187777024 unmapped: 17989632 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:17.016067+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 187777024 unmapped: 17989632 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 298 handle_osd_map epochs [298,299], i have 298, src has [1,299]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.19] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1c] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.8] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.9] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.17] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.10] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.12] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.16] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.b] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.7] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.4] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.15] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.1] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 scrub-queue::remove_from_osd_queue removing pg[4.6] failed. State was: not registered w/ OSD
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:18.016219+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188833792 unmapped: 16932864 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1480000/0x0/0x1bfc00000, data 0x61b22e2/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:19.016383+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188833792 unmapped: 16932864 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.865082741s of 10.326470375s, submitted: 118
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:20.016547+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188841984 unmapped: 16924672 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807387 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:21.016754+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188841984 unmapped: 16924672 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:22.016970+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188841984 unmapped: 16924672 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1480000/0x0/0x1bfc00000, data 0x61b237d/0x636e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:23.017190+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188841984 unmapped: 16924672 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1480000/0x0/0x1bfc00000, data 0x61b237d/0x636e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:24.017415+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188850176 unmapped: 16916480 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:25.017644+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188858368 unmapped: 16908288 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2806889 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:26.017849+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188858368 unmapped: 16908288 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:27.018023+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188858368 unmapped: 16908288 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:28.018165+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188866560 unmapped: 16900096 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:29.018333+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188866560 unmapped: 16900096 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b2476/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 10.081426620s of 10.135598183s, submitted: 10
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:30.018528+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188866560 unmapped: 16900096 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2806889 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b2476/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:31.018683+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188866560 unmapped: 16900096 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:32.018855+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188866560 unmapped: 16900096 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:33.019035+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188866560 unmapped: 16900096 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b2476/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:34.019234+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188866560 unmapped: 16900096 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:35.019355+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188866560 unmapped: 16900096 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2806521 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:36.019563+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188874752 unmapped: 16891904 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:37.019745+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188874752 unmapped: 16891904 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:38.019881+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188874752 unmapped: 16891904 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:39.020136+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188874752 unmapped: 16891904 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b24db/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.976624489s of 10.000726700s, submitted: 4
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b24db/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:40.020360+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188874752 unmapped: 16891904 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2806873 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:41.020566+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188874752 unmapped: 16891904 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:42.020734+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188882944 unmapped: 16883712 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b24db/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:43.020886+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188882944 unmapped: 16883712 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b2540/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:44.021115+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188891136 unmapped: 16875520 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:45.021320+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188891136 unmapped: 16875520 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2806873 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:46.021452+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188891136 unmapped: 16875520 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:47.021587+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188891136 unmapped: 16875520 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b2540/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:48.021723+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188899328 unmapped: 16867328 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:49.021875+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188899328 unmapped: 16867328 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 9.960057259s of 10.001147270s, submitted: 9
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:50.022048+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188899328 unmapped: 16867328 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2808657 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:51.022239+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188899328 unmapped: 16867328 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:52.022415+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188907520 unmapped: 16859136 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1480000/0x0/0x1bfc00000, data 0x61b26a5/0x636e000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:53.022568+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188907520 unmapped: 16859136 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:54.022791+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188907520 unmapped: 16859136 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:55.022978+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188907520 unmapped: 16859136 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:56.023171+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188907520 unmapped: 16859136 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:57.023356+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188907520 unmapped: 16859136 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:58.023484+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188907520 unmapped: 16859136 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:05:59.023687+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188907520 unmapped: 16859136 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:00.023828+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:01.023982+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:02.024152+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:03.024350+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:04.024561+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:05.024712+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:06.024901+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:07.025059+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:08.025256+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:09.025371+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:10.025526+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:11.025704+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:12.025888+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:13.026067+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:14.026279+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:15.026484+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188915712 unmapped: 16850944 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:16.026667+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188923904 unmapped: 16842752 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:17.026810+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188923904 unmapped: 16842752 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:18.027044+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188923904 unmapped: 16842752 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:19.027346+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188923904 unmapped: 16842752 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:20.027495+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188923904 unmapped: 16842752 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 ms_handle_reset con 0x55aabfce2400 session 0x55aabfabf2c0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac125ac00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 ms_handle_reset con 0x55aac148a000 session 0x55aabfab0f00
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: handle_auth_request added challenge on 0x55aac2ec4400
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:21.027718+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188923904 unmapped: 16842752 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:22.027906+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188923904 unmapped: 16842752 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:23.028079+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188923904 unmapped: 16842752 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:24.028257+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188932096 unmapped: 16834560 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:25.028464+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188932096 unmapped: 16834560 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:26.028643+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188932096 unmapped: 16834560 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:27.028831+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188932096 unmapped: 16834560 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:28.029171+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188932096 unmapped: 16834560 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:29.029372+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188932096 unmapped: 16834560 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:30.029532+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188932096 unmapped: 16834560 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:31.029738+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188932096 unmapped: 16834560 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:32.029975+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188940288 unmapped: 16826368 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:33.030135+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188940288 unmapped: 16826368 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:34.030335+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188940288 unmapped: 16826368 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:35.030664+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188940288 unmapped: 16826368 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:36.031013+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188940288 unmapped: 16826368 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:37.031355+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188940288 unmapped: 16826368 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:38.031622+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188940288 unmapped: 16826368 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:39.031867+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188940288 unmapped: 16826368 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:40.032068+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188948480 unmapped: 16818176 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:41.032331+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188948480 unmapped: 16818176 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:42.032553+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188948480 unmapped: 16818176 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:43.032765+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188948480 unmapped: 16818176 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:44.032930+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188948480 unmapped: 16818176 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:45.033073+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188948480 unmapped: 16818176 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:46.033217+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188948480 unmapped: 16818176 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:47.033351+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188948480 unmapped: 16818176 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:48.033483+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:49.033659+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:50.033944+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:51.034076+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:52.034227+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:53.034411+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:54.034590+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:55.034792+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:56.034996+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:57.035205+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:58.035480+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:06:59.035657+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:00.036032+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:01.036224+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:02.036447+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:03.036649+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188956672 unmapped: 16809984 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:04.036934+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188964864 unmapped: 16801792 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:05.037111+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188964864 unmapped: 16801792 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:06.037323+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _send_mon_message to mon.np0005604212 at v2:172.18.0.103:3300/0
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188964864 unmapped: 16801792 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:07.037496+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188964864 unmapped: 16801792 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:08.037648+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188964864 unmapped: 16801792 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:09.037832+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188964864 unmapped: 16801792 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:10.038005+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188964864 unmapped: 16801792 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:11.038169+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188964864 unmapped: 16801792 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:12.038344+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188973056 unmapped: 16793600 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:13.038581+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188973056 unmapped: 16793600 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:14.038754+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188973056 unmapped: 16793600 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:15.038892+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188973056 unmapped: 16793600 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:16.039065+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188973056 unmapped: 16793600 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:17.039210+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188973056 unmapped: 16793600 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:18.039376+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188973056 unmapped: 16793600 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:19.039556+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188973056 unmapped: 16793600 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:20.039707+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188981248 unmapped: 16785408 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807791 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:21.039861+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore(/var/lib/ceph/osd/ceph-2) _kv_sync_thread utilization: idle 92.036148071s of 92.071762085s, submitted: 7
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 ms_handle_reset con 0x55aac3026400 session 0x55aabeaf4960
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189218816 unmapped: 16547840 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:22.040017+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Got map version 64
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: mgrc handle_mgr_map Active mgr is now [v2:172.18.0.108:6810/3516973848,v1:172.18.0.108:6811/3516973848]
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189210624 unmapped: 16556032 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:23.040134+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189210624 unmapped: 16556032 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:24.040347+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189210624 unmapped: 16556032 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:25.040538+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189210624 unmapped: 16556032 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:26.040735+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189210624 unmapped: 16556032 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:27.040886+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189210624 unmapped: 16556032 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:28.041027+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189210624 unmapped: 16556032 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:29.041236+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189218816 unmapped: 16547840 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:30.041404+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189218816 unmapped: 16547840 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:31.041619+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189218816 unmapped: 16547840 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:32.041859+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:33.046329+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:34.048134+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:35.051134+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:36.053639+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:37.056245+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:38.058479+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:39.060038+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:40.060226+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:41.060514+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:42.060946+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:43.061311+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189227008 unmapped: 16539648 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:44.061611+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189235200 unmapped: 16531456 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:45.062026+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189235200 unmapped: 16531456 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:46.062172+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189235200 unmapped: 16531456 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:47.062431+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189235200 unmapped: 16531456 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:48.062571+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189235200 unmapped: 16531456 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:49.062830+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189235200 unmapped: 16531456 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:50.063069+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189235200 unmapped: 16531456 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:51.063253+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189235200 unmapped: 16531456 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:52.063433+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189243392 unmapped: 16523264 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:53.063614+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189251584 unmapped: 16515072 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:54.063805+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189251584 unmapped: 16515072 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:55.064045+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189251584 unmapped: 16515072 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:56.064223+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189251584 unmapped: 16515072 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:57.064464+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189251584 unmapped: 16515072 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:58.064660+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189251584 unmapped: 16515072 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:07:59.064901+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189251584 unmapped: 16515072 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:00.065062+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189259776 unmapped: 16506880 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:01.065214+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189259776 unmapped: 16506880 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:02.065407+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189259776 unmapped: 16506880 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:03.065583+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189259776 unmapped: 16506880 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:04.065788+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189259776 unmapped: 16506880 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:05.065950+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189259776 unmapped: 16506880 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:06.066162+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189259776 unmapped: 16506880 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:07.066328+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189259776 unmapped: 16506880 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:08.066517+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:09.066705+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:10.066880+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:11.067076+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:12.067270+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:13.067511+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:14.067698+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:15.067837+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:16.068030+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189276160 unmapped: 16490496 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:17.068190+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:18.068339+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:19.068524+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:20.068636+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:21.068757+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:22.068873+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:23.069042+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189267968 unmapped: 16498688 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:24.069188+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189276160 unmapped: 16490496 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:25.069312+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.235294
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: rocksdb: commit_cache_size High Pri Pool Ratio set to 0.0384615
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: bluestore.MempoolThread(0x55aabc27bb60) _resize_shards cache_size: 4047413338 kv_alloc: 1744830464 kv_used: 2144 kv_onode_alloc: 285212672 kv_onode_used: 464 meta_alloc: 1677721600 meta_used: 2807423 data_alloc: 285212672 data_used: 12636160
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189276160 unmapped: 16490496 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: osd.2 299 heartbeat osd_stat(store_statfs(0x1b1481000/0x0/0x1bfc00000, data 0x61b26d4/0x636d000, compress 0x0/0x0/0x0, omap 0x649, meta 0x840f9b7), peers [0,1,3,4,5] op hist [])
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:26.069453+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 189292544 unmapped: 16474112 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:27.069599+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: do_command 'config diff' '{prefix=config diff}'
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: do_command 'config diff' '{prefix=config diff}' result is 0 bytes
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: do_command 'config show' '{prefix=config show}'
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: do_command 'config show' '{prefix=config show}' result is 0 bytes
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: do_command 'counter dump' '{prefix=counter dump}'
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: do_command 'counter dump' '{prefix=counter dump}' result is 0 bytes
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: do_command 'counter schema' '{prefix=counter schema}'
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: do_command 'counter schema' '{prefix=counter schema}' result is 0 bytes
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188981248 unmapped: 16785408 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:28.069781+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: prioritycache tune_memory target: 5709082009 mapped: 188801024 unmapped: 16965632 heap: 205766656 old mem: 4047413338 new mem: 4047413338
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: tick
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_tickets
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: monclient: _check_auth_rotating have uptodate secrets (they expire after 2026-02-01T10:08:29.069911+0000)
Feb 01 10:09:00 np0005604215.localdomain ceph-osd[31357]: do_command 'log dump' '{prefix=log dump}'
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr services", "format": "json-pretty"} v 0)
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1174075010' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:09:00 np0005604215.localdomain rsyslogd[760]: imjournal from <localhost:ceph-osd>: begin to drop messages due to rate-limiting
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3561195748' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v801: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd crush tree", "show_shadow": true} v 0)
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/700763619' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1335435229' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2491969472' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3887349745' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2878408046' entity='client.admin' cmd={"prefix": "mgr metadata", "format": "json-pretty"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1559629843' entity='client.admin' cmd={"prefix": "osd crush rule ls"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1174075010' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1731493815' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2696677792' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1583657233' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/377540482' entity='client.admin' cmd={"prefix": "osd crush show-tunables"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/350204156' entity='client.admin' cmd={"prefix": "mgr module ls", "format": "json-pretty"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3561195748' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/465828579' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3924112788' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/700763619' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Feb 01 10:09:00 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:00.774 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_write.cc:2098] [default] New memtable created with log file: #55. Immutable memtables: 0.
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.793398) [db/db_impl/db_impl_compaction_flush.cc:2832] Calling FlushMemTableToOutputFile with column family [default], flush slots available 1, compaction slots available 1, flush slots scheduled 1, compaction slots scheduled 0
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:856] [default] [JOB 31] Flushing memtable with next log file: 55
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940540793454, "job": 31, "event": "flush_started", "num_memtables": 1, "num_entries": 2570, "num_deletes": 251, "total_data_size": 3892603, "memory_usage": 4135424, "flush_reason": "Manual Compaction"}
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:885] [default] [JOB 31] Level-0 flush table #56: started
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940540805184, "cf_name": "default", "job": 31, "event": "table_file_creation", "file_number": 56, "file_size": 2492283, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 34446, "largest_seqno": 37011, "table_properties": {"data_size": 2482918, "index_size": 5678, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 2629, "raw_key_size": 22529, "raw_average_key_size": 21, "raw_value_size": 2463030, "raw_average_value_size": 2356, "num_data_blocks": 248, "num_entries": 1045, "num_filter_entries": 1045, "num_deletions": 251, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769940353, "oldest_key_time": 1769940353, "file_creation_time": 1769940540, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 56, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/flush_job.cc:1019] [default] [JOB 31] Flush lasted 11820 microseconds, and 3271 cpu microseconds.
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.805225) [db/flush_job.cc:967] [default] [JOB 31] Level-0 flush table #56: 2492283 bytes OK
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.805240) [db/memtable_list.cc:519] [default] Level-0 commit table #56 started
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.807510) [db/memtable_list.cc:722] [default] Level-0 commit table #56: memtable #1 done
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.807523) EVENT_LOG_v1 {"time_micros": 1769940540807520, "job": 31, "event": "flush_finished", "output_compression": "NoCompression", "lsm_state": [1, 0, 0, 0, 0, 0, 1], "immutable_memtables": 0}
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.807538) [db/db_impl/db_impl_compaction_flush.cc:299] [default] Level summary: base level 6 level multiplier 10.00 max bytes base 268435456 files[1 0 0 0 0 0 1] max score 0.25
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/db_impl/db_impl_files.cc:463] [JOB 31] Try to delete WAL files size 3880830, prev total WAL file size 3880830, number of live WAL files 2.
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000052.log immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.808146) [db/db_impl/db_impl_compaction_flush.cc:3165] [default] Manual compaction from level-0 to level-6 from '7061786F73003133303532' seq:72057594037927935, type:22 .. '7061786F73003133333034' seq:0, type:0; will stop at (end)
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1995] [default] [JOB 32] Compacting 1@0 + 1@6 files to L6, score -1.00
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:2001] [default]: Compaction start summary: Base version 31 Base level 0, inputs: [56(2433KB)], [54(21MB)]
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940540808197, "job": 32, "event": "compaction_started", "compaction_reason": "ManualCompaction", "files_L0": [56], "files_L6": [54], "score": -1, "input_data_size": 24928745, "oldest_snapshot_seqno": -1}
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/compaction/compaction_job.cc:1588] [default] [JOB 32] Generated table #57: 15110 keys, 23525072 bytes, temperature: kUnknown
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940540915983, "cf_name": "default", "job": 32, "event": "table_file_creation", "file_number": 57, "file_size": 23525072, "file_checksum": "", "file_checksum_func_name": "Unknown", "smallest_seqno": 0, "largest_seqno": 0, "table_properties": {"data_size": 23436708, "index_size": 49479, "index_partitions": 0, "top_level_index_size": 0, "index_key_is_user_key": 1, "index_value_is_delta_encoded": 1, "filter_size": 37829, "raw_key_size": 404842, "raw_average_key_size": 26, "raw_value_size": 23178342, "raw_average_value_size": 1533, "num_data_blocks": 1845, "num_entries": 15110, "num_filter_entries": 15110, "num_deletions": 0, "num_merge_operands": 0, "num_range_deletions": 0, "format_version": 0, "fixed_key_len": 0, "filter_policy": "bloomfilter", "column_family_name": "default", "column_family_id": 0, "comparator": "leveldb.BytewiseComparator", "merge_operator": "", "prefix_extractor_name": "nullptr", "property_collectors": "[]", "compression": "NoCompression", "compression_options": "window_bits=-14; level=32767; strategy=0; max_dict_bytes=0; zstd_max_train_bytes=0; enabled=0; max_dict_buffer_bytes=0; use_zstd_dict_trainer=1; ", "creation_time": 1769939270, "oldest_key_time": 0, "file_creation_time": 1769940540, "slow_compression_estimated_data_size": 0, "fast_compression_estimated_data_size": 0, "db_id": "c098c70d-588d-409e-9f3c-16c3b4da1135", "db_session_id": "HRI08R8OB38WGRLS0V9F", "orig_file_number": 57, "seqno_to_time_mapping": "N/A"}}
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [db/version_set.cc:4390] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.916181) [db/compaction/compaction_job.cc:1663] [default] [JOB 32] Compacted 1@0 + 1@6 files to L6 => 23525072 bytes
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.922078) [db/compaction/compaction_job.cc:865] [default] compacted to: base level 6 level multiplier 10.00 max bytes base 268435456 files[0 0 0 0 0 0 1] max score 0.00, MB/sec: 231.2 rd, 218.1 wr, level 6, files in(1, 1) out(1 +0 blob) MB in(2.4, 21.4 +0.0 blob) out(22.4 +0.0 blob), read-write-amplify(19.4) write-amplify(9.4) OK, records in: 15632, records dropped: 522 output_compression: NoCompression
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.922093) EVENT_LOG_v1 {"time_micros": 1769940540922086, "job": 32, "event": "compaction_finished", "compaction_time_micros": 107842, "compaction_time_cpu_micros": 28687, "output_level": 6, "num_output_files": 1, "total_output_size": 23525072, "num_input_records": 15632, "num_output_records": 15110, "num_subcompactions": 1, "output_compression": "NoCompression", "num_single_delete_mismatches": 0, "num_single_delete_fallthrough": 0, "lsm_state": [0, 0, 0, 0, 0, 0, 1]}
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000056.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940540922387, "job": 32, "event": "table_file_deletion", "file_number": 56}
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: [file/delete_scheduler.cc:74] Deleted file /var/lib/ceph/mon/ceph-np0005604215/store.db/000054.sst immediately, rate_bytes_per_sec 0, total_trash_size 0 max_trash_db_ratio 0.250000
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: EVENT_LOG_v1 {"time_micros": 1769940540923910, "job": 32, "event": "table_file_deletion", "file_number": 54}
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.808066) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.924134) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.924143) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.924157) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.924168) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: rocksdb: (Original Log Time 2026/02/01-10:09:00.924172) [db/db_impl/db_impl_compaction_flush.cc:1903] [default] Manual compaction starting
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr versions", "format": "json-pretty"} v 0)
Feb 01 10:09:00 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2298434486' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd erasure-code-profile ls"} v 0)
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2383924303' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:01.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:09:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:01.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:09:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:01.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Feb 01 10:09:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:01.101 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "mgr stat", "format": "json-pretty"} v 0)
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.106:0/3273266590' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:01.127 274321 DEBUG nova.compute.manager [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Feb 01 10:09:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69926 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd metadata"} v 0)
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1735651924' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:09:01 appctl.go:174: call(dpif-netdev/pmd-rxq-show): please specify an existing datapath
Feb 01 10:09:01 np0005604215.localdomain openstack_network_exporter[239388]: ERROR   10:09:01 appctl.go:174: call(dpif-netdev/pmd-perf-show): please specify an existing datapath
Feb 01 10:09:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69941 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: pgmap v801: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2560591169' entity='client.admin' cmd={"prefix": "osd crush tree", "show_shadow": true} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2569369202' entity='client.admin' cmd={"prefix": "mgr services", "format": "json-pretty"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2298434486' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2383924303' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2443551822' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3273266590' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1962519534' entity='client.admin' cmd={"prefix": "osd erasure-code-profile ls"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3812679772' entity='client.admin' cmd={"prefix": "mgr stat", "format": "json-pretty"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/4058022875' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1735651924' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2801920324' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/1894660587' entity='client.admin' cmd={"prefix": "osd metadata"} : dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49587 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd utilization"} v 0)
Feb 01 10:09:01 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1972905958' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69956 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60262 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49599 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69965 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60268 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain systemd[1]: Starting Hostname Service...
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69971 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain systemd[1]: Started Hostname Service.
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49605 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60280 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v802: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49611 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69983 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60289 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.69995 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mon[298604]: from='client.69926 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mon[298604]: from='client.69941 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2734535729' entity='client.admin' cmd={"prefix": "mgr versions", "format": "json-pretty"} : dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2623266101' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mon[298604]: from='client.49587 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1972905958' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mon[298604]: from='client.69956 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2422416916' entity='client.admin' cmd={"prefix": "osd utilization"} : dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mon[298604]: from='client.60262 -' entity='client.admin' cmd=[{"prefix": "orch host ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mon[298604]: from='client.49599 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:02 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60298 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49617 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49626 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:03 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:03.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:09:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.70010 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.411 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.412 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.413 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.414 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.414 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.414 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceilometer_agent_compute[232200]: 2026-02-01 10:09:03.414 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Feb 01 10:09:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60316 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49641 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.70025 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:03 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "quorum_status"} v 0)
Feb 01 10:09:03 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1156660098' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 01 10:09:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60337 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:03 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49656 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "versions"} v 0)
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1461980216' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60352 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49662 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.69965 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.60268 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.69971 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.49605 -' entity='client.admin' cmd=[{"prefix": "telemetry channel ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.60280 -' entity='client.admin' cmd=[{"prefix": "orch device ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: pgmap v802: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.49611 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.69983 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.60289 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.69995 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.60298 -' entity='client.admin' cmd=[{"prefix": "orch ls", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.49617 -' entity='client.admin' cmd=[{"prefix": "telemetry collection ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.49626 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.70010 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/651103449' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1156660098' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3421620959' entity='client.admin' cmd={"prefix": "quorum_status"} : dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1603977727' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1461980216' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v803: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "health", "detail": "detail", "format": "json-pretty"} v 0)
Feb 01 10:09:04 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4127433708' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 01 10:09:04 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60376 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "osd tree", "format": "json-pretty"} v 0)
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1937133833' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.012 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.099 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.100 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.126 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.127 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.127 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.127 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Auditing locally available compute resources for np0005604215.localdomain (node: np0005604215.localdomain) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.128 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon).osd e299 _set_new_cache_sizes cache_size:1020054731 inc_alloc: 343932928 full_alloc: 348127232 kv_alloc: 318767104
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.60316 -' entity='client.admin' cmd=[{"prefix": "orch ls", "export": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.49641 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.70025 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.60337 -' entity='client.admin' cmd=[{"prefix": "orch ps", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.49656 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.60352 -' entity='client.admin' cmd=[{"prefix": "orch status", "detail": true, "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.49662 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2287445398' entity='client.admin' cmd={"prefix": "versions"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/3101612127' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/325505573' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/4127433708' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3538103465' entity='client.admin' cmd={"prefix": "health", "detail": "detail", "format": "json-pretty"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1937133833' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1986013335' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/3241099924' entity='client.admin' cmd={"prefix": "osd tree", "format": "json-pretty"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1825957733' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2237729374' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.520 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.393s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.671 274321 WARNING nova.virt.libvirt.driver [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.672 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Hypervisor/Node resource view: name=np0005604215.localdomain free_ram=11354MB free_disk=41.836978912353516GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.672 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.672 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.751 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.751 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Final resource view: name=np0005604215.localdomain phys_ram=15738MB used_ram=512MB phys_disk=41GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.776 274321 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 24 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "config dump"} v 0)
Feb 01 10:09:05 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3707425710' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 01 10:09:05 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:05.971 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running cmd (subprocess): ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Feb 01 10:09:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.60463 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.70148 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: pgmap v803: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='client.60376 -' entity='client.admin' cmd=[{"prefix": "orch upgrade status", "target": ["mon-mgr", ""], "format": "json-pretty"}]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2237729374' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd='mon_status' args=[]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='admin socket' entity='admin socket' cmd=mon_status args=[]: finished
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/893445289' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/3707425710' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/746426064' entity='client.admin' cmd={"prefix": "config dump"} : dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='client.60463 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: from='client.70148 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "format": "json"} v 0)
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/2702330094' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:09:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:06.417 274321 DEBUG oslo_concurrency.processutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] CMD "ceph df --format=json --id openstack --conf /etc/ceph/ceph.conf" returned: 0 in 0.446s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Feb 01 10:09:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:06.426 274321 DEBUG nova.compute.provider_tree [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed in ProviderTree for provider: d5eeed9a-e4d0-4244-8d4e-39e5c8263590 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Feb 01 10:09:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:06.456 274321 DEBUG nova.scheduler.client.report [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Inventory has not changed for provider d5eeed9a-e4d0-4244-8d4e-39e5c8263590 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 16.0}, 'MEMORY_MB': {'total': 15738, 'reserved': 512, 'min_unit': 1, 'max_unit': 15738, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 41, 'reserved': 1, 'min_unit': 1, 'max_unit': 41, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Feb 01 10:09:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:06.458 274321 DEBUG nova.compute.resource_tracker [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Compute_service record updated for np0005604215.localdomain:np0005604215.localdomain _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Feb 01 10:09:06 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:06.458 274321 DEBUG oslo_concurrency.lockutils [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.786s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Feb 01 10:09:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(audit) log [DBG] : from='client.49740 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:06 np0005604215.localdomain ceph-mgr[278126]: log_channel(cluster) log [DBG] : pgmap v804: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df", "detail": "detail"} v 0)
Feb 01 10:09:06 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/1729257029' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "df"} v 0)
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/4059810226' entity='client.admin' cmd={"prefix": "df"} : dispatch
Feb 01 10:09:07 np0005604215.localdomain kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Feb 01 10:09:07 np0005604215.localdomain kernel: cfg80211: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Feb 01 10:09:07 np0005604215.localdomain kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Feb 01 10:09:07 np0005604215.localdomain kernel: cfg80211: failed to load regulatory.db
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/2702330094' entity='client.openstack' cmd={"prefix": "df", "format": "json"} : dispatch
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: from='client.49740 -' entity='client.admin' cmd=[{"prefix": "device ls", "target": ["mon-mgr", ""]}]: dispatch
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/1420909750' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: pgmap v804: 177 pgs: 177 active+clean; 234 MiB data, 1.4 GiB used, 41 GiB / 42 GiB avail
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/1729257029' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.106:0/2244901449' entity='client.admin' cmd={"prefix": "df", "detail": "detail"} : dispatch
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.107:0/2681306489' entity='client.admin' cmd={"prefix": "df"} : dispatch
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: from='client.? 172.18.0.108:0/4059810226' entity='client.admin' cmd={"prefix": "df"} : dispatch
Feb 01 10:09:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:07.459 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:09:07 np0005604215.localdomain nova_compute[274317]: 2026-02-01 10:09:07.460 274321 DEBUG oslo_service.periodic_task [None req-e04f1a51-10c3-4ec1-bb94-d41625939b91 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: mon.np0005604215@2(peon) e15 handle_command mon_command({"prefix": "fs dump"} v 0)
Feb 01 10:09:07 np0005604215.localdomain ceph-mon[298604]: log_channel(audit) log [DBG] : from='client.? 172.18.0.108:0/3222399205' entity='client.admin' cmd={"prefix": "fs dump"} : dispatch
